- Problematic use does not equal addiction. The term has been used to describe people’s relationships with many technologies, including TVs and smartphones. We’ve built tools and controls to help people manage when and how they use our services, and we have a dedicated team working across our platforms to better understand these issues and ensure people are using our apps in ways that are meaningful to them.
- We have been studying well-being for more than a decade, and that work continues today. The suggestion that it stopped in 2019 couldn’t be further from the truth. This is evidenced by the research we have published externally since then, our increased engagement and collaboration with experts such as the Aspen Institute and the Humanity Center, and our role as a founding sponsor of the Digital Wellness Lab run jointly by Harvard University and Boston Children’s Hospital.
- We ship features to help people manage their experiences on our apps and services. Since 2018 alone, we have introduced nearly 10 products to better support people’s well-being, including by addressing problematic use of our apps, and that work continues today.
- This is an industry-wide challenge, and we have an industry-leading, centrally coordinated well-being research effort that works with product and engineering leaders across our services to understand and address issues affecting well-being, including problematic use.
The Wall Street Journal has once again selectively chosen from internal company documents to present a narrative that is simply wrong about how we use research to address an important issue, in this case problematic use.
We have been deeply engaged in a multiyear effort to better understand problematic use and to empower people who use our services to manage it. That work has taken place over multiple years and continues today.
We conduct this research openly, not in secret. In fact, the primary piece of internal research the Journal cites was published in May 2019 at a premier academic conference for communications technology and is available here. We’ve also been exploring these questions and using well-being principles to inform our work for more than 10 years.
We want people to have a positive experience on our services, so even though there isn’t an industry-established definition of problematic use, it’s something we’re invested in understanding. Our own research, as well as external research, has found significant variation in the number of people who self-report problematic use, depending on how it’s measured. After the May 2019 study, we ran another study that also asked people whether they felt guilty about their social media use, intentionally expanding on previous definitions to capture a broader set of experiences. Unsurprisingly, the prevalence of problematic use in this study was higher because we measured more aspects of the issue.

Why would we do research that could show higher levels of problematic use? Because this was early-stage research intended to help us understand the various facets of problematic use and develop better messaging and tools to support people who use our products. While a causal link between social media and addiction has not been established, and research suggests that, on average, social media does not have a major detrimental impact on well-being, we still want to provide people with tools to help them manage their use however they see fit.
What did this research lead Facebook to do? Roll out nearly 10 tools since 2018, including:
- Your Time on Facebook, which we launched in August 2018 to centralize tools and options for people to manage their time. In April 2020 we added Quiet Mode, which mutes most push notifications. If you try to open Facebook while in Quiet Mode, you’ll be reminded that you set this time aside to limit your time using the app.
- Control Your Notifications, which provides shortcuts to help you manage your notifications, including an option to mute all push notifications and to manage the “red dots” in the shortcuts menu. Red dots can be removed from Marketplace, Groups, News, and the “hamburger” menu.
- See Your Time, which shows your usage time per day, broken down by daytime and nighttime, along with your app visits. You can also get weekly usage updates and easy access to your activity log.
We have also launched a series of tools and features on Instagram to help people control the time they spend on the app. This includes things like the ability to ‘mute’ accounts to control what posts you see, a feature called ‘You’re All Caught Up’ that lets you know when you’ve seen all the recent content in your Feed, and time management tools where you can see your total time on the app each day and set a daily reminder that alerts you when you’ve reached a set amount of time on Instagram. We’ve also shared two new features we’re building to help people control their time on Instagram.
This is just a small sample of the products and controls we have launched publicly or are continuing to explore based on this research. We plan to address these issues in even greater depth and will keep investing in this work.
The Journal also cites one internal study to speculate about how many people on Facebook experience problematic use. That’s irresponsible because, as the study itself notes, the research was designed to be as expansive as possible to help us better understand the challenge. For decades, there have been concerns about overuse of new technologies as they become available. How many people today actually think they should be spending more time on their smartphones or binge-watching their favorite TV shows? For example, a 2018 report on the Morning Consult/Hollywood Reporter poll said, “eighty-six percent said they’ve stayed up past their normal bedtime to watch a show, and 52 percent said they’ve stayed up all night. And while 40 percent of all TV watchers have made less healthy food or exercise choices because of a show, 57 percent of young adults have done so.”
Platforms like ours still have a role to play in addressing this problem. At Meta, we’ve been doing exactly that for many years – and that work is continuing to move full steam ahead.