Facebook: Supporting Open Conversations About Mental Health

For World Mental Health Day on October 10, we’re launching new mental health resources, tools and programming across our apps. We’re working with mental health experts and organizations around the world to connect people with the support they need, and we’re launching new content and tools to encourage people to start conversations about mental health. 

Access New Resources 

We’ve partnered with mental health experts to develop new guides and tools in the Emotional Health resource center on Facebook, including: 

  • Resource Cards developed with UNICEF and other global health organizations, focused on tips for creating positive mental health, managing personal crises and rising above challenging moments
  • Resource guides developed for BIPOC Mental Health Month to provide equitable access to mental health support

Product mock of Emotional Health resource center on Facebook

Get Tips and Support

With support from WhatsApp, UNICEF is launching a new Global Mental Health chatbot to offer tips for communicating what’s on your mind, breaking down stigmas and starting a conversation with someone you’re concerned about. Mental health and wellbeing resources, like exercises to help reduce stress, can also be found through the WHO’s Health Alert chatbot on WhatsApp. Regional helplines, including a Loneliness Advice chatbot developed by the Connection Coalition in the UK, are also available. On Messenger, we launched the “I Care For You” sticker pack to help kick-start conversations when words are hard to find.

We also recently introduced new suicide prevention resources, developed in partnership with Samaritans UK and 10 of our global suicide crisis response partners. The Responding to Suicide Challenges toolkit has resources for parents, educators, youth and media on how to safely discuss suicide challenges. You can also find resources for yourself or a friend on our suicide prevention hub.

Find Comfort From an Online Community

Hearing others’ experiences can also provide solace during challenging times. Around the world, 15 million people are members of Facebook groups dedicated to supporting mental health. Black Girl’s Healing House provides wellness and mental health resources for Black women, from Black women. With more than 59,000 members, Subtle Asian Mental Health offers a safe space for the Asian community to share their thoughts and feelings and remind each other that no one is alone in their struggles or experiences. With more than 361,000 members, Egyptian group Msh L’Wahdak (which translates to “you are not alone” in English) encourages members to share their struggles with a community that will listen and offer support. In the UK, Parenting Mental Health provides a community of support and advice for adults caring for children with depression, anxiety or other mental health challenges.

Experts and writers featured on Bulletin, our recently launched platform for independent writers, are also addressing mental health and wellness, including: 

  • Amanda Stern’s How to Live: A newsletter about psychology and mental health by a critically acclaimed writer with a lifelong panic disorder whose insights on suffering encourage us to face our fears.
  • Nedra Tawwab’s Nedra Nuggets: Licensed Therapist, NYT Bestselling Author, and Content Creator Nedra Glover Tawwab writes weekly about mental health and maintaining healthy relationships with self and others.
  • Dr. Laurie Santos’ The Science of Wellbeing: Yale psychology professor Dr. Laurie Santos explores the latest scientific research on happiness and offers practical tips so you can lead a happier life.

Watch Candid Conversations

On October 11, a new season of Peace of Mind with Taraji premieres on Facebook Watch. Hosted by Golden Globe-winning actress Taraji P. Henson, each episode features interviews with celebrities, experts and everyday people about mental health topics, with a focus on the Black community. And in case you missed it, the recent finale of Simone vs Herself on Facebook Watch features Olympic gymnast Simone Biles opening up about the mental health issues she faced in Tokyo this summer.

See the Weight of Mental Illness

Goliath: Playing with Reality is a free virtual reality experience on Oculus Quest that provides an up-close look at the weight of schizophrenia. Produced by Anagram in conjunction with Floréal Films and the Oculus VR for Good program, it explores the true story of Jon, a man diagnosed with paranoid schizophrenia, through immersive VR and artful narration by Academy Award-winning actress Tilda Swinton. The experience is designed to promote empathy and encourage conversations about mental health.

Facebook: Community Standards Enforcement Report Assessment Results

In August of 2020, we committed to undertaking and releasing an independent, third-party assessment of our Community Standards Enforcement Report. Today, we’re delivering on that commitment and publishing EY’s independent findings.

Graphic that reads: EY's assessment concluded with the opinion that the calculation of the metrics reported within Facebook and Instagram Community Standards Enforcement Report for the period October 1, 2021 through December 31, 2021 have been prepared based on the specific criteria and are fairly stated, in all material respects.

We selected EY based on their expertise, experience and ability to work with large, novel data sets. To ensure the metrics were measured and reported correctly, we underwent in-depth preparation to give EY an understanding of our processes, systems and controls. We also provided them with data and evidence requested to conduct their assessment. 

The globally adopted Committee of Sponsoring Organizations of the Treadway Commission (COSO) framework was selected as the criteria against which the assessment would evaluate Meta’s internal controls. This rigorous and widely used internal control framework was used to verify the accuracy of the metrics and validate the design and operating effectiveness of the controls. It’s the same framework we use to assess our internal controls over financial reporting and the financial statements that are included in our Annual Report filed with the Securities and Exchange Commission (SEC).

As part of the assessment, Meta provided EY with full access to the necessary data, documentation and evidence requests. We also gave access to dozens of employees across our data science, data engineering, software engineering, product and program management, and Internal Audit teams working on the Community Standards Enforcement Report. The examination was conducted in accordance with the attestation standards established by the American Institute of Certified Public Accountants and consisted primarily of 1) applying inspection, recalculation and analytical procedures, 2) making inquiries of persons responsible for the subject matter and 3) obtaining an understanding of the measurement systems and processes used to generate, process, aggregate and report the subject matter.

EY evaluated the metrics and reporting methods for the Community Standards Enforcement Report covering the fourth quarter of 2021. As part of their assessment, EY focused on the following areas: Governance, Data Collection, Data Processing, Data Aggregation, Data Disclosures and Reporting, and Information Technology General Controls. Learn more about the scope of the assessment.

Infographic showing the scope of EY's assessment

This assessment builds on our previous work in 2018 with international experts in measurement, statistics, law, economics and governance, who provided an independent, public assessment of the metrics we share in the enforcement report. Since then, we have continued to seek out feedback from external stakeholders and experts, including from the Oversight Board, and are working to expand our transparency reports. As we shared today, we are also committing to measuring and reporting metrics around accuracy of our enforcement decisions.

Timeline of EY assessment

While this assessment looked at the accuracy of the metrics we report, we believe independent, third-party assessments of our integrity systems and processes are also an important part of delivering accountability. We look forward to continuing to build on this commitment and expanding our transparency and accountability efforts, working with industry partners, issue experts and policymakers across the world.

As we said at the beginning of this process, no company should grade its own homework, and the credibility of our systems should be earned, not assumed. Accurate and meaningful transparency is critical to holding platforms accountable. This assessment is a step in that direction.

Facebook: Widely Viewed Content Report, First Quarter 2022

Today, we’re publishing the Widely Viewed Content Report (WVCR) for the first quarter of 2022. This report highlights the most-viewed organic content in Feed in the US, including domains, links, Pages and posts. It includes content recommended by Facebook and excludes advertising content. See the full report and Companion Guide for more information. 

Updates and Enforcements

Based on feedback from several academic and civil society organizations, we are improving our link and domain data methodologies. Previously, we counted a link view as any time a post or video containing a link was viewed, even if the link was not front and center. However, the feedback from these organizations was that our data would be more meaningful if we only counted link or domain views that rendered a preview. Moving forward, links will need to render a preview in order to be counted as a view, as that more accurately represents what people are seeing. As part of the transition, the Q1 2022 report includes top viewed links using both our old and new methodologies. Starting next quarter, the WVCR will use only the new methodology.
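
To make the methodology change concrete, here is a minimal illustrative sketch in Python. The data structure and field names (PostView, contains_link, link_preview_rendered) are hypothetical and not Meta’s actual pipeline; the sketch only contrasts the two counting rules described above.

```python
from dataclasses import dataclass

@dataclass
class PostView:
    contains_link: bool          # the viewed post or video contained a link
    link_preview_rendered: bool  # a link preview was actually rendered on screen

def count_link_views_old(views: list[PostView]) -> int:
    # Old methodology: any view of a post containing a link counts,
    # even if the link was not front and center.
    return sum(1 for v in views if v.contains_link)

def count_link_views_new(views: list[PostView]) -> int:
    # New methodology: only views where the link rendered a preview count.
    return sum(1 for v in views if v.contains_link and v.link_preview_rendered)

views = [
    PostView(contains_link=True, link_preview_rendered=True),
    PostView(contains_link=True, link_preview_rendered=False),  # counted only under the old rule
    PostView(contains_link=False, link_preview_rendered=False),
]
print(count_link_views_old(views), count_link_views_new(views))  # prints: 2 1
```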

In this report, there were pieces of content that have since been removed from Facebook for violating our Inauthentic Behavior policies. The removed links were all from the same domain, and links to that domain are no longer allowed on Facebook. During the last reporting cycle, we took seriously the feedback criticizing our approach to disclosing additional details about removed content that appears in this report, and we have updated our removal disclosure framework in the report and Companion Guide. Moving forward, we will aim to disclose as much information as possible about removed content, including Inauthentic Behavior, that appears in the report. However, in instances where disclosing specific details about removed content would cause harm to our community, we will err on the side of keeping the community safe.

Some lower-quality posts ended up among our most viewed last quarter, although it is important to note that the top 20 links in this report represent only 0.03% of all Feed content views in the US during the quarter. The fourth URL in the report linked to a YouTube video of a panel discussion held by a US Senator that was rated False by one of our fact-checking partners. When that happened, we took a number of steps to limit the reach of this link, including adding a warning screen with more information about the claim, showing a notification warning to people when they try to share the link and reducing the distribution of the link in Feed. Our strategy mirrors the recommendations of experts and academics in this field: deeper investments in outreach by trusted organizations online, as well as fact-checking as a primary approach to misinformation, since removing certain false claims about COVID-19 can exacerbate feelings of distrust of authorities and further marginalize populations. Without these measures, this link would likely have reached more people, and those who viewed it would not have seen the additional information and context provided by the fact-check.

Insights from the WVCR help inform how we update our existing policies and products, and develop new ones, to address harmful or otherwise objectionable content. For example, we’ve been testing new ways to reduce clickbait, engagement bait and spam. While we’re seeing improvements from these tests, we will need to continually evaluate and refine our approach before seeing consistent results. We’ll continue to test alternative solutions to reduce engagement bait, misinformation and content from Pages that repeatedly violate our Community Standards.

Facebook: Transparency Report, Second Half 2021

Today, we are releasing our latest Transparency Report for the second half of 2021. 

As always, we strive to be open about the ways we protect users’ privacy, security and access to information online. That’s why we publish biannual transparency reports to provide detail on the numbers and maintain accountability in our work. Over the years, we’ve expanded our report to include the volume of content restrictions based on local law, the number of global internet disruptions that limit access to our products and, most recently, our proactive efforts to protect intellectual property. Additionally, our Transparency Report includes the Community Standards Enforcement Report for Q1 of 2022, which provides data on how we take action against violating content across our platforms. 

Government Requests for User Data

During the last six months of 2021, global government requests for user data increased 2% from 211,055 to 214,777. Of the total volume, the US continues to submit the largest number of requests, followed by India, Germany, France, Brazil and the UK.
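
For readers who want to verify the figure, the 2% increase follows directly from the totals reported above; the short sketch below simply recomputes it and involves nothing beyond the quoted numbers.

```python
# Recompute the percent change in global government requests for user data
# from the totals quoted above (H1 2021: 211,055; H2 2021: 214,777).
h1_2021 = 211_055
h2_2021 = 214_777
pct_change = (h2_2021 - h1_2021) / h1_2021 * 100
print(f"{pct_change:.1f}%")  # 1.8%, which the report rounds to 2%
```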

In the US, we received 59,996 requests, which was 6% less than the total we received in the first half of 2021. Non-disclosure orders prohibiting Meta from notifying the user remained consistent at 70% in the first and second halves of 2021. In addition, as a result of transparency updates introduced in the 2015 USA Freedom Act, the US government lifted the non-disclosure orders on 12 National Security Letters we received between 2017 and 2021. These requests, along with the US government’s authorization letters, are available below.

As we have said in prior reports, we always scrutinize every government request we receive to make sure it is legally valid, no matter which government makes the request. We comply with government requests for user information only where we have a good-faith belief that the law requires us to do so. In addition, we assess whether a request is consistent with internationally recognized standards on human rights, including due process, privacy, free expression and the rule of law. When we do comply, we only produce information that is narrowly tailored to that request. If we determine that a request appears to be deficient or overly broad, we push back and will fight in court, if necessary. We do not provide governments with “back doors” to people’s information. For more information about how we review and respond to government requests for user data and the safeguards we apply, please refer to our FAQs.

Content Restrictions

When content is reported as violating local law, but doesn’t go against our Community Standards, we may limit access to that content in the country where the local violation is alleged. During this reporting period, the volume of content restrictions based on local law increased globally 8% from 47,365 in H1 2021 to 50,959 in H2 2021.

Internet Disruptions

We oppose shutdowns, throttling and other disruptions of internet connectivity and we remain concerned by the trend towards this approach in some countries. Even temporary disruptions of internet services can undermine human rights and economic activity. That’s why we report the number of deliberate internet disruptions caused by governments around the world that impact the availability of our products. In the second half of 2021, we identified 38 disruptions of Facebook services in 12 countries, compared to 62 disruptions in 17 countries in the first half of 2021.

Intellectual Property

Finally, we report on the volume and nature of copyright, trademark and counterfeit reports we receive each half, as well as our proactive actions against potential piracy and counterfeits. In connection with our previously reported data for H1 2021, we discovered we were not accounting for some proactive copyright removals due to an error in the way our technology counted these violations, specific to Rights Manager. This resulted in an undercounting of the number of proactive copyright removals for that reporting period. We have reviewed and resolved this issue and have adjusted the numbers to reflect those removals. Specifically, we proactively removed 16.7 million pieces of content for copyright reasons in H1 2021 (previously reported as 9 million).

During this reporting period for H2 2021, we took down 4,384,719 pieces of content based on 1,217,892 copyright reports; 709,642 pieces of content based on 332,340 trademark reports; and 2,121,209 pieces of content based on 97,569 counterfeit reports. We also proactively removed 30,245,249 pieces of content for copyright reasons and 223,770,855 pieces of content for counterfeit reasons.

Publishing this report furthers our deep commitment to transparency. You can see the full report for more information.
