
Facebook: Building the Metaverse Responsibly

  • The metaverse won’t be built overnight by a single company. We’ll collaborate with policymakers, experts and industry partners to bring this to life.
  • We’re announcing a $50 million investment in global research and program partners to ensure these products are developed responsibly.

We develop technology rooted in human connection that brings people together. As we focus on helping to build the next computing platform, our work across augmented and virtual reality and consumer hardware will deepen that human connection regardless of physical distance and without being tied to devices. 

What Is the Metaverse?

The “metaverse” is a set of virtual spaces where you can create and explore with other people who aren’t in the same physical space as you. You’ll be able to hang out with friends, work, play, learn, shop, create and more. It’s not necessarily about spending more time online — it’s about making the time you do spend online more meaningful. 

The metaverse isn’t a single product one company can build alone. Just like the internet, the metaverse exists whether Facebook is there or not. And it won’t be built overnight. Many of these products will only be fully realized in the next 10-15 years. While that’s frustrating for those of us eager to dive right in, it gives us time to ask the difficult questions about how they should be built. 

How We’re Building Responsibly

We’ll work with experts in government, industry and academia to think through issues and opportunities in the metaverse. For instance, its success depends on building robust interoperability across services, so different companies’ experiences can work together. We also need to involve the human rights and civil rights communities from the start to ensure these technologies are built in a way that’s inclusive and empowering.

Here are a few key areas where we’ll work with others to anticipate the risks and get it right:

  • Economic opportunity: how we can give people more choice, encourage competition and maintain a thriving digital economy
  • Privacy: how we can minimize the amount of data that’s used, build technology to enable privacy-protective data uses and give people transparency and control over their data
  • Safety and integrity: how we can keep people safe online and give them tools to take action or get help if they see something they’re not comfortable with
  • Equity and inclusion: how we can make sure these technologies are designed inclusively and in a way that’s accessible 

Introducing the XR Programs and Research Fund

There’s a long road ahead. But as a starting point, we’re announcing the XR Programs and Research Fund, a two-year $50 million investment in programs and external research to help us in this effort. Through this fund, we’ll collaborate with industry partners, civil rights groups, governments, nonprofits and academic institutions to determine how to build these technologies responsibly. 

Here are a few of our initial partners: 

  • We’re working with the Organization of American States on job training and skills development for students, creators and small business owners.
  • Across Africa, we’re working with Africa No Filter, Electric South and Imisi3D on “Amplifying African Voices,” a program supporting creators who have been pushing the boundaries of digital storytelling through immersive technology.
  • With Women In Immersive Tech, we are supporting women and underrepresented groups driving Europe’s virtual, augmented and mixed reality sectors.

As part of this effort, we are also facilitating independent external research with institutions across the globe.

We will be sharing more partners and updates on our progress as the work continues.


Facebook: Community Standards Enforcement Report Assessment Results


In August of 2020, we committed to undertaking and releasing an independent, third-party assessment of our Community Standards Enforcement Report. Today, we’re delivering on that commitment and publishing EY’s independent findings.

[Graphic: EY’s assessment concluded with the opinion that the metrics reported within the Facebook and Instagram Community Standards Enforcement Report for the period October 1, 2021 through December 31, 2021 have been prepared based on the specific criteria and are fairly stated, in all material respects.]

We selected EY based on their expertise, experience and ability to work with large, novel data sets. To ensure the metrics were measured and reported correctly, we underwent in-depth preparation to give EY an understanding of our processes, systems and controls. We also provided them with the data and evidence they requested to conduct their assessment.

The globally adopted Committee of Sponsoring Organizations of the Treadway Commission (COSO) framework was selected as the criteria against which the assessment would evaluate Meta’s internal controls. This rigorous and widely used internal control framework was used to verify the accuracy of the metrics and validate the design and operating effectiveness of the controls. It’s the same framework we use to assess our internal controls over financial reporting and the financial statements that are included in our Annual Report filed with the Securities and Exchange Commission (SEC).

As part of the assessment, Meta provided EY with full access to the necessary data, documentation and evidence requests. We also gave access to dozens of employees across data science, data engineering, software engineering, product and program management, and Internal Audit teams working on the Community Standards Enforcement Report. The examination was conducted in accordance with the attestation standards established by the American Institute of Certified Public Accountants and consisted primarily of 1) applying inspection, recalculation and analytical procedures, 2) making inquiries of persons responsible for the subject matter, and 3) obtaining an understanding of the measurement systems and processes used to generate, process, aggregate and report the subject matter.

EY evaluated the metrics and reporting methods for the Community Standards Enforcement Report covering the fourth quarter of 2021. As part of their assessment, EY focused on the following areas: Governance, Data Collection, Data Processing, Data Aggregation, Data Disclosures and Reporting, and Information Technology General Controls. Learn more about the scope of the assessment.

[Infographic: the scope of EY’s assessment]

This assessment builds on our previous work in 2018 with international experts in measurement, statistics, law, economics and governance, who provided an independent, public assessment of the metrics we share in the enforcement report. Since then, we have continued to seek out feedback from external stakeholders and experts, including from the Oversight Board, and are working to expand our transparency reports. As we shared today, we are also committing to measuring and reporting metrics around accuracy of our enforcement decisions.

[Graphic: timeline of the EY assessment]

While this assessment looked at the accuracy of the metrics we report, we believe independent, third-party assessments of our integrity systems and processes are also an important part of delivering accountability. We look forward to continuing to build on this commitment and expanding our transparency and accountability efforts, working with industry partners, issue experts and policymakers across the world.

As we said at the beginning of this process, no company should grade its own homework, and the credibility of our systems should be earned, not assumed. Accurate and meaningful transparency is critical to holding platforms accountable. This assessment is a step in that direction.


Facebook: Widely Viewed Content Report, First Quarter 2022


Today, we’re publishing the Widely Viewed Content Report (WVCR) for the first quarter of 2022. This report highlights the most-viewed organic content in Feed in the US, including domains, links, Pages and posts. It includes content recommended by Facebook and excludes advertising content. See the full report and Companion Guide for more information. 

Updates and Enforcements

Based on feedback from several academic and civil society organizations, we are improving our link and domain data methodologies. Previously, we counted a link view as any time a post or video containing a link was viewed, even if the link was not front and center. However, the feedback from these organizations was that our data would be more meaningful if we only counted link or domain views that rendered a preview. Moving forward, links will need to render a preview in order to be counted as a view, as that more accurately represents what people are seeing. As part of the transition, the Q1 2022 report includes top viewed links using both our old and new methodologies. Starting next quarter, the WVCR will use only the new methodology.
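
To make the change concrete, here is a minimal sketch of the two counting rules applied to a simplified view log; the data model and field names are hypothetical illustrations, not Meta’s actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class PostView:
    contains_link: bool      # the viewed post or video included a link
    preview_rendered: bool   # a link preview was actually shown to the viewer

def count_link_views_old(views):
    # Old rule: any view of a post containing a link counted,
    # even if the link was not front and center.
    return sum(1 for v in views if v.contains_link)

def count_link_views_new(views):
    # New rule: a view counts only when the link rendered a preview.
    return sum(1 for v in views if v.contains_link and v.preview_rendered)

views = [
    PostView(contains_link=True, preview_rendered=True),
    PostView(contains_link=True, preview_rendered=False),   # counted only under the old rule
    PostView(contains_link=False, preview_rendered=False),
]
print(count_link_views_old(views), count_link_views_new(views))  # 2 1
```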

In this report, there were pieces of content that have since been removed from Facebook for violating our policies on Inauthentic Behavior. The removed links were all from the same domain, and links to that domain are no longer allowed on Facebook. During the last reporting cycle, we took seriously the feedback criticizing our approach to disclosing additional details about content that was removed from Facebook but appears in this report. We have updated our removal disclosure framework in the report and Companion Guide. Moving forward, we will aim to disclose as much information as possible about removed content, including Inauthentic Behavior, that appears in the report. However, in instances where disclosing specific details about removed content would cause harm to our community, we will err on the side of keeping the community safe.

Some lower-quality posts ended up among our most viewed last quarter, although it is important to note that the top 20 links in this report represent only 0.03% of all Feed content views in the US during the quarter. The fourth URL in the report linked to a YouTube video of a panel discussion held by a U.S. Senator that was rated False by one of our fact-checking partners. When that happened, we took a number of steps to limit the reach of this link, including adding a warning screen with more information about the claim, showing a notification warning to people who tried to share the link and reducing the distribution of the link in Feed. Our strategy mirrors the recommendations of experts and academics in this field: deeper investments in outreach by trusted organizations online, as well as fact-checking as a primary approach to misinformation, since removing certain false claims about COVID-19 can exacerbate feelings of distrust of authorities and further marginalize populations. Without these measures, this link would likely have reached more people, and those who viewed it would not have seen the additional information and context from the fact check.

Insights from the WVCR help inform how we update our existing policies and products, and develop new ones, to address harmful or otherwise objectionable content. For example, we’ve been testing new ways to reduce clickbait, engagement bait and spam. While we’re seeing improvements from these tests, we will need to continually evaluate and refine our approach before seeing consistent results. We’ll continue to test alternative solutions to reduce engagement bait, misinformation and content from Pages that repeatedly violate our Community Standards.


Facebook: Transparency Report, Second Half 2021


Today, we are releasing our latest Transparency Report for the second half of 2021. 

As always, we strive to be open about the ways we protect users’ privacy, security and access to information online. That’s why we publish biannual transparency reports to provide detail on the numbers and maintain accountability in our work. Over the years, we’ve expanded our report to include the volume of content restrictions based on local law, the number of global internet disruptions that limit access to our products and, most recently, our proactive efforts to protect intellectual property. Additionally, our Transparency Report includes the Community Standards Enforcement Report for Q1 of 2022, which provides data on how we take action against violating content across our platforms. 

Government Requests for User Data

During the last six months of 2021, global government requests for user data increased 2% from 211,055 to 214,777. Of the total volume, the US continues to submit the largest number of requests, followed by India, Germany, France, Brazil and the UK.

In the US, we received 59,996 requests, 6% fewer than in the first half of 2021. Non-disclosure orders prohibiting Meta from notifying the user remained consistent at 70% across both halves of 2021. In addition, as a result of transparency provisions introduced in the 2015 USA FREEDOM Act, the US government lifted the non-disclosure orders on 12 National Security Letters we received between 2017 and 2021. These requests, along with the US government’s authorization letters, are available below.

As we have said in prior reports, we always scrutinize every government request we receive to make sure it is legally valid, no matter which government makes the request.  We comply with government requests for user information only where we have a good-faith belief that the law requires us to do so. In addition, we assess whether a request is consistent with internationally recognized standards on human rights, including due process, privacy, free expression and the rule of law. When we do comply, we only produce information that is narrowly tailored to that request. If we determine that a request appears to be deficient or overly broad, we push back and will fight in court, if necessary. We do not provide governments with “back doors” to people’s information. For more information about how we review and respond to government requests for user data and the safeguards we apply, please refer to our FAQs.

Content Restrictions

When content is reported as violating local law, but doesn’t go against our Community Standards, we may limit access to that content in the country where the local violation is alleged. During this reporting period, the volume of content restrictions based on local law increased globally 8% from 47,365 in H1 2021 to 50,959 in H2 2021.
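
As a quick check, the rounded percentages quoted here and in the Government Requests section above can be reproduced directly from the reported totals:

```python
def pct_change(old, new):
    """Whole-percent change from old to new, as quoted in the report."""
    return round((new - old) / old * 100)

# Global government requests for user data, H1 2021 -> H2 2021
print(pct_change(211_055, 214_777))  # 2  (reported as a 2% increase)

# Content restrictions based on local law, H1 2021 -> H2 2021
print(pct_change(47_365, 50_959))    # 8  (reported as an 8% increase)
```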

Internet Disruptions

We oppose shutdowns, throttling and other disruptions of internet connectivity and we remain concerned by the trend towards this approach in some countries. Even temporary disruptions of internet services can undermine human rights and economic activity. That’s why we report the number of deliberate internet disruptions caused by governments around the world that impact the availability of our products. In the second half of 2021, we identified 38 disruptions of Facebook services in 12 countries, compared to 62 disruptions in 17 countries in the first half of 2021.

Intellectual Property

Finally, we report on the volume and nature of copyright, trademark and counterfeit reports we receive each half, as well as our proactive actions against potential piracy and counterfeits. In connection with our previously reported data for H1 2021, we discovered we were not accounting for some proactive copyright removals due to an error in the way our technology counted these violations, specific to Rights Manager. This resulted in an undercounting of proactive copyright removals for that reporting period. We have reviewed and resolved the issue and have adjusted the numbers to reflect those removals. Specifically, we proactively removed 16.7 million pieces of content for copyright reasons in H1 2021 (previously reported as 9 million).

During this reporting period for H2 2021, we took down 4,384,719 pieces of content based on 1,217,892 copyright reports; 709,642 pieces of content based on 332,340 trademark reports; and 2,121,209 pieces of content based on 97,569 counterfeit reports. We also proactively removed 30,245,249 pieces of content for copyright reasons and 223,770,855 pieces of content for counterfeit reasons.
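
To make these figures easier to scan, here is a small script that tabulates them; the per-category totals are derived here from the reported numbers rather than quoted from the report:

```python
# (reports received, pieces removed in response to reports, proactive removals), H2 2021
ip_actions = {
    "copyright":   (1_217_892, 4_384_719, 30_245_249),
    "trademark":   (332_340,     709_642, 0),            # no proactive figure reported
    "counterfeit": (97_569,    2_121_209, 223_770_855),
}

for category, (reports, on_report, proactive) in ip_actions.items():
    total = on_report + proactive
    print(f"{category}: {reports:,} reports, {total:,} pieces of content removed in total")
```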

Publishing this report furthers our deep commitment to transparency. You can see the full report for more information.
