A Facebook Glitch Is Spreading Harmful Content

Be careful what you read on Facebook: a glitch was found to be pushing harmful content out platform-wide.

By Kristi Eckert | Published



Facebook’s name and reputation have been dragged through the mud in recent years. Whistleblower revelations warranted much of the scrutiny placed on how the company operates. More recently, Facebook admitted to inadvertently handing over swathes of private user data to a group of hackers. Now another cause for concern has come to light. The Verge reported that a group of Facebook engineers identified what they called a “massive ranking failure” in the platform’s content-downranking algorithm. This wide-scale failure meant that copious amounts of potentially inappropriate content had been pushed to users’ news feeds over the preceding six months.

The revelation of the long-running mishap comes from an internal company report. Facebook engineers noticed the issue late last year and have since been compiling data on it. As a result of the ranking failure, masses of misinformation ended up in news feeds across the platform. In some cases, this widespread distribution caused readership of content presenting utterly inaccurate information to soar by as much as 30%.

Facebook’s failure to disrupt enormous amounts of misinformation is problematic on numerous levels. First, from an integrity standpoint: an immense number of people rely on social media outlets like Facebook to keep up with important news and current events. Forbes reported that as much as 55% of the American public relies on social media as its primary news source. The fact that Facebook knowingly allowed this misinformation to spread for an extended period raises questions about the platform’s integrity and whether it can be considered a legitimate news outlet.

Second, the vast distribution of misinformation perpetuates division and hinders unity. Consider how a great number of people believing a single false report could make reaching a shared understanding harder. People simply cannot form accurate opinions on any matter if those opinions are informed by false information.


Furthermore, it was not only misinformation that circulated widely on Facebook news feeds for months on end. The glitch reportedly also failed to adequately filter nude and excessively violent content. Russian propaganda was likewise detected among the inappropriate material pushed out to the public.

What’s more, the internal document detailing the engineers’ findings revealed that the company knew about the issue as far back as 2019. However, The Verge pointed out that nothing was done to address the matter until October 2021 because it was not believed to be a prevalent enough issue at the time. The issue was reportedly resolved on March 11, 2022.

Despite the release of this concerning information, Facebook has maintained that the overall damage was minimal. Joe Osborne, a spokesperson for Meta, Facebook’s parent company, said that the glitch in question “has not had any meaningful, long-term impact on our metrics and didn’t apply to content that met its system’s threshold for deletion.”

Facebook has consistently defended its downranking algorithms, the algorithms meant to filter out content such as misinformation or violent depictions. In fact, Facebook insists that, even though there was a glitch in this case, it is continuously working to improve its filtering technology. Sahar Massachi, who previously worked on Facebook’s integrity team, noted that “In a large complex system like this, bugs are inevitable and understandable.”
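To make the idea of downranking concrete, here is a minimal, purely illustrative Python sketch. Facebook’s real ranking system is proprietary and vastly more complex; the post structure, scores, and demotion factor below are all hypothetical. The key point it shows is that downranking does not delete a flagged post, it only reduces the post’s ranking score, so a bug that skips the demotion step lets flagged content rank at full strength.

```python
# Illustrative sketch only -- not Facebook's actual system.
# Downranking lowers a flagged post's score rather than removing the post.

DEMOTION_FACTOR = 0.1  # hypothetical: flagged posts keep 10% of their score


def rank_feed(posts, demote=True):
    """Order posts by score, demoting flagged ones when demote=True.

    Each post is a dict with 'id', 'score', and 'flagged' (e.g. suspected
    misinformation). Passing demote=False mimics the kind of bug the
    article describes: the demotion step is skipped, so flagged posts
    compete at their full score.
    """
    def effective_score(post):
        if demote and post["flagged"]:
            return post["score"] * DEMOTION_FACTOR
        return post["score"]

    return sorted(posts, key=effective_score, reverse=True)


posts = [
    {"id": "a", "score": 90, "flagged": True},   # suspected misinformation
    {"id": "b", "score": 50, "flagged": False},
    {"id": "c", "score": 40, "flagged": False},
]

print([p["id"] for p in rank_feed(posts)])                # -> ['b', 'c', 'a']
print([p["id"] for p in rank_feed(posts, demote=False)])  # -> ['a', 'b', 'c']
```

With demotion working, the flagged post drops to the bottom of the feed; with the demotion step skipped, it rises to the top, which is how a ranking failure can boost readership of inaccurate content without any post ever crossing the separate threshold for deletion.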