A report published on March 31 by tech site The Verge highlighted that content identified as misleading, harmful or problematic had mistakenly been prioritised in users’ Facebook feeds.
The surge of misinformation was apparently caused by a software bug that reportedly took six months to fix – between October and March. Was the error critical?
Apparently, yes: The Verge reported that the bug was serious enough for a group of Facebook employees to draft an internal report referring to a “massive ranking failure” of content.
As per reports, the employees noticed in October that some content that had been marked as questionable by external media was being widely distributed in users’ News Feeds.
The popular tech site reported, “Unable to find the root cause, the engineers watched the surge subside a few weeks later and then flare up repeatedly until the ranking issue was fixed on March 11.”
However, Facebook disputed the report published by the outlet, labelling it “vastly overstated”, news agency AFP reported.
According to Joe Osborne, a spokesman for parent company Meta, the bug “had no meaningful, long-term impact on problematic content”. Osborne said that the bug affected “only a very small number of views” of content.
Explaining the process, Osborne said that “the overwhelming majority of posts in Feed are not eligible to be down-ranked in the first place”.
He added that other mechanisms designed to limit views of “harmful” content remained in place, “including other demotions, fact-checking labels and violating content removals.”
(With inputs from agencies)