Internal bug promoted problematic content on Facebook


Content identified as misleading or problematic was mistakenly prioritized in users’ Facebook feeds recently, thanks to a software bug that took six months to fix, according to tech site The Verge.

Facebook disputed the report, which was released on Thursday, saying it “greatly exaggerated what this bug was because, ultimately, it had no significant long-term impact on problematic content,” according to Joe Osborne, spokesperson for parent company Meta.

But the bug was serious enough that a group of Facebook employees wrote an internal report referring to a “massive ranking failure” of content, The Verge reported.

In October, employees noticed that some content that had been flagged as questionable by outside media – members of Facebook’s third-party fact-checking program – was nonetheless being favored by the algorithm for wide distribution in users’ news feeds.

“Unable to find the root cause, engineers watched the surge subside a few weeks later, then flare up repeatedly until the ranking issue was resolved on March 11,” The Verge reported.

But according to Osborne, the bug only affected “a very small number of views” of the content.

That’s because “the overwhelming majority of posts in Feed aren’t eligible to be downgraded in the first place,” Osborne explained, adding that other mechanisms designed to limit views of “harmful” content remained in place, “including other downgrades, fact-checking labels and violating content removals.”

AFP currently works with Facebook’s fact-checking program in more than 80 countries and 24 languages. Under the program, which began in December 2016, Facebook pays to use fact checks from around 80 organizations, including media outlets and specialist fact checkers, on its platform, WhatsApp and on Instagram.

Content categorized as “fake” is downgraded in news feeds so fewer people see it. Anyone who tries to share such a post is presented with an article explaining why it is misleading.

Those who choose to share the post anyway receive a notification with a link to the article. No posts are deleted. Fact-checkers are free to choose how and what they wish to investigate.

On Wednesday, The Washington Post reported that Meta, Facebook’s parent company, is using similar tactics to go after rival TikTok. Meta, the Post reported, hired a Republican consulting firm called Targeted Victory to “orchestrate a national campaign” against TikTok…

Marilyn J. Hernandez