Facebook researchers saw how the company's algorithms led users to misinformation

Facebook researchers in 2019 created three dummy accounts to study the platform's technology for recommending content in the News Feed. The first was for a user in India, the company's biggest market. The researchers then created two more test accounts to represent a conservative American user and a liberal one.

All three accounts engaged exclusively with content recommended by Facebook's algorithms. Within days, the liberal account, dubbed "Karen Jones," started seeing "Moscow Mitch" memes, a nickname critics gave Republican Senator Mitch McConnell after he blocked bills to protect American elections from foreign interference.

The conservative account, "Carol Smith," was guided toward QAnon conspiracy theories. Meanwhile, the test user's News Feed in India was filled with inflammatory material containing violent and graphic images related to India's border skirmishes with Pakistan.

The Facebook researcher running the Indian test user's account wrote in a report that year: "I've seen more images of dead people in the past 3 weeks than I've seen in my entire life total," adding that, "the graphic content was recommended by [Facebook] via recommended groups, pages, videos, and posts."

The internal Facebook memos analyzing the progression of these test accounts were part of thousands of pages of leaked documents provided to Congress by lawyers for Facebook whistleblower Frances Haugen. A consortium of 17 U.S. news organizations, including CBS News, has reviewed the redacted version of the documents received by Congress.

The three projects illustrate how Facebook's algorithms for the News Feed can steer users to content that sows division. And they reveal that the company was aware its algorithms, which predict which posts users want to see and how likely they are to engage with them, can lead users "down the path to conspiracy theories."

In a statement to CBS News, a Facebook spokesperson said the project involving the conservative test user is "a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform."

In 2018, Facebook altered the algorithms that populate users' news feeds to focus on what it calls "Meaningful Social Interactions" in an attempt to increase engagement.

But internal research found that engagement with posts "doesn't necessarily mean that a user actually wants to see more of something."

"A state[d] goal of the move toward meaningful social interactions was to increase well-being by connecting people. However, we know that many things that generate engagement on our platform leave users divided and depressed," a Facebook researcher wrote in a December 2019 report.

The document, titled "We are Responsible for Viral Content," noted that users had indicated the kind of content they wanted to see more of, but the company ignored those requests for "business reasons."

According to the report, internal Facebook data showed that users are twice as likely to see content that is reshared by others as opposed to content from pages they choose to like and follow. Users who comment on posts to express their dissatisfaction are unaware that the algorithm interprets that as a meaningful engagement and serves them similar content in the future, the report said.

The News Feed algorithm weighs several metrics, according to Facebook's internal documents. Each carries a different weight, and whether content goes viral depends on how users interact with a post.

When Facebook first moved toward meaningful social interactions in 2018, using the "Like" button awarded the post one point, according to one document. Signaling engagement with one of the reaction buttons, the emoticons that stand for "Love," "Care," "Haha," "Wow," "Sad," and "Angry," was worth five points. A reshare was also worth five points.

Comments on posts, messages in Groups, and RSVPs to public events awarded the content 15 points. Comments, messages, and reshares that included photos, videos, and links were awarded 30 points.
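To make the weighting concrete, here is a minimal Python sketch of how such a point system could score a single post. The point values are the figures reported in the documents; the function, the interaction names, and the example counts are hypothetical illustrations, not Facebook's actual code.

```python
# Illustrative sketch of an MSI-style weighted scoring scheme.
# Weights reflect the 2018 point values described in the documents;
# the names and structure are hypothetical.
MSI_WEIGHTS = {
    "like": 1,
    "reaction": 5,             # Love, Care, Haha, Wow, Sad, Angry
    "reshare": 5,
    "comment": 15,
    "group_message": 15,
    "event_rsvp": 15,
    "comment_with_media": 30,  # comments, messages, or reshares with photos, videos, or links
    "message_with_media": 30,
    "reshare_with_media": 30,
}

def msi_score(interactions: dict) -> float:
    """Sum weighted interaction counts into a single engagement score."""
    return sum(MSI_WEIGHTS.get(kind, 0) * count for kind, count in interactions.items())

# Example: 100 likes, 40 reactions, 20 reshares, 10 comments
post = {"like": 100, "reaction": 40, "reshare": 20, "comment": 10}
print(msi_score(post))  # 100*1 + 40*5 + 20*5 + 10*15 = 550
```

Under a scheme like this, a single comment counts as much as 15 Likes, which helps explain why posts that provoke comments and reactions travel much further than posts that are merely liked.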

Facebook researchers quickly uncovered that bad actors were gaming the system. Users were "posting ever more outrageous things to get comments and reactions that our algorithms interpret as signs we should let things go viral," according to a December 2019 memo by a Facebook researcher.

In one internal memo from November 2019, a Facebook researcher noted that "Angry," "Haha," and "Wow" reactions are heavily tied to toxic and divisive content.

"We consistently find that shares, angrys, and hahas are much more frequent on civic low-quality news, civic misinfo, civic toxicity, health misinfo, and health antivax content," the Facebook researcher wrote.

In April 2019, political parties in Europe complained to Facebook that the News Feed change was forcing them to post provocative content and take up extreme policy positions.

One political party in Poland told Facebook that the platform's algorithm changes had forced its social media team to shift its mix of posts from half positive and half negative to 80% negative and 20% positive.

In a memo titled "Political Party Response to '18 Algorithm Change," a Facebook staffer wrote that "many parties, including those that have shifted strongly to the negative, worry about the long-term effects on democracy."

In a statement to CBS News, a Facebook spokesperson said, "the goal of Meaningful Social Interactions ranking change is in the name: improve people's experience by prioritizing posts that inspire interactions, particularly conversations between family and friends."

The Facebook spokesperson also argued that the ranking change isn't "the source of the world's divisions," adding that "research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook ever existed."

Facebook said its researchers constantly run experiments to study and improve the algorithm's rankings, adding that thousands of metrics are considered before content is shown to users. Anna Stepanov, Facebook's head of app integrity, told CBS News the rankings powering the News Feed evolve based on new data from direct user surveys.

The documents indicate that Facebook did change some of the rankings behind the News Feed algorithm after feedback from researchers. One internal memo from January 2020 shows Facebook lowered the weight of the "Angry" reaction from five points to 1.5. That weight was then lowered to zero in September 2020.
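Picking up the hypothetical sketch from earlier, that adjustment amounts to tuning a single weight over time; the keys and the example post below are illustrative only, with the point values taken from the documents.

```python
# Continuing the illustrative sketch: give "Angry" its own weight so it can be
# tuned independently of the other reactions.
MSI_WEIGHTS["angry"] = 5      # 2018 launch value, same as the other reactions
angry_post = {"like": 100, "angry": 40, "reshare": 20, "comment": 10}
print(msi_score(angry_post))  # 550

MSI_WEIGHTS["angry"] = 1.5    # reported early-2020 down-weighting
print(msi_score(angry_post))  # 410.0

MSI_WEIGHTS["angry"] = 0      # reported September 2020 value: Angry no longer boosts distribution
print(msi_score(angry_post))  # 350
```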

In February 2021, Facebook announced that it was beginning tests to reduce the distribution of political content in the News Feed for a small percentage of users in the U.S. and Canada. The program was expanded earlier this month to include other countries.
