Facebook disputes that and says it maintained necessary safeguards, adding in a statement that it has “expressly disclosed to investors” that the risk of misinformation and extremism on the platform remains.
In 2019, a year after Facebook changed its algorithm to encourage engagement, its own researchers identified a problem, according to internal company documents obtained by the source.
The company set up a fake Facebook account, under the name “Carol,” as a test and followed then-President Trump, first lady Melania Trump and Fox News. Within one day, the algorithm recommended polarizing content. The next day, it recommended conspiracy theory content, and in less than a week, the account received a QAnon suggestion, the internal documents said.
By the second week, the fake account’s News Feed was “comprised by and large” of misleading or false content. By the third week, “the account’s News Feed is an intensifying mix of misinformation, misleading and recycled content, polarizing memes, and conspiracy content, interspersed with occasional engagement bait,” the internal documents said.