On Tuesday, Facebook revealed for the first time the prevalence of bullying and harassment on its platform, saying such content was seen 14 to 15 times per 10,000 views of content on the site in the third quarter. In its quarterly content moderation report, the company, which recently changed its name to Meta, also said that bullying and harassing content was seen between 5 and 6 times per 10,000 views of content on Instagram.
Former employee and whistleblower Frances Haugen brought the social media giant back into the spotlight by disclosing internal documents, including research and internal discussions about Instagram’s effects on teen mental health and whether Facebook’s platforms fuel division.
According to Haugen, the documents show that the company prioritised profit over user safety. Facebook disputed that characterisation, saying the documents were being used to paint a “false picture.” The documents, first reported by the Wall Street Journal, have prompted calls for greater transparency from Facebook, as well as questions about whether metrics like prevalence give a complete picture of how the company handles abuse.
Facebook said its bullying and harassment figures count only cases in which the company did not need additional information, such as a user report, to determine that the content violated its policies. It said 59.4 percent of the 9.2 million pieces of content removed from Facebook for violating its bullying and harassment policies were found proactively.
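The proactive-detection figure is a straightforward percentage of the removal total. A quick check of the arithmetic implied by those two numbers:

```python
removed_total = 9_200_000   # pieces of content removed for bullying/harassment
proactive_rate = 0.594      # share Facebook says it found before any user report

# Roughly 5.46 million pieces were detected proactively,
# leaving about 3.73 million surfaced by user reports or other signals.
proactive = round(removed_total * proactive_rate)
reported = removed_total - proactive
print(proactive, reported)
```

Note the distinction between the two metrics: prevalence measures views of violating content, while the proactive rate measures how removed content was first detected.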
In a blog post, the company’s global head of safety, Antigone Davis, and product management director Amit Bhattacharyya wrote, “Bullying and harassment is a unique difficulty and one of the most complex issues to handle because context is crucial.”