How does Facebook 'force' political parties to go extreme?
CGTN

European political parties have complained that Facebook is "forcing" them to be more extreme, according to revelations from a whistle-blower, a former Facebook employee who left the company in anger.

Party extremism is hardly newsworthy anymore; it is taken as common sense. What makes this revelation special is that it confirms what people have long suspected: the trend toward party extremism may be driven by something we once thought was merely a medium.

To put it more bluntly, it may now be Facebook, not its users, that shapes democracy.

Skewing negative works better

Facebook CEO Mark Zuckerberg is surrounded by journalists as he testifies before the U.S. Senate Commerce, Science and Transportation Committee and Senate Judiciary Committee over data privacy, Washington, April 10, 2018. /VCG Photo

In fact, research published in 2020 found that while the two major American political parties have grown increasingly polarized since World War II, American voters themselves have remained just as moderate as they always were.

If people using Facebook are not getting more extreme, how does Facebook "force" political parties to change this way? The leaked internal report reveals that Facebook leaves some political parties with little choice but to "skew negative" in their communications.

Skewing negative in communications means that, instead of positively marketing their own policy agendas, political parties turn to spreading critical, toxic and fake content to attack their opponents.

Parties choose this strategy because they have to admit that it works better. The secret lies in Facebook's algorithm.

Algorithm encourages extremism

A thumbs up logo at Facebook headquarters in California, U.S., April 14, 2020. /CFP

Facebook's algorithm weighs reshared material heavily in its News Feed.

And according to the leaked internal memos, "misinformation, toxicity, and violent content are inordinately prevalent among reshares."

Thus, to make their voices louder on Facebook, parties' messages have to be more negative.
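To see why weighting reshares rewards negativity, consider a minimal, hypothetical sketch of reshare-weighted feed ranking in Python. The scoring function, weights and post fields below are illustrative assumptions, not Facebook's actual News Feed formula; the point is only that once reshares dominate the score, an outrage-driven attack post can outrank a calmer policy post that earns far more likes.

```python
# A minimal, hypothetical sketch of reshare-weighted feed ranking.
# The weights and post fields are illustrative assumptions, not
# Facebook's actual News Feed formula.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    reshares: int

def rank_score(post: Post,
               w_like: float = 1.0,
               w_comment: float = 5.0,
               w_reshare: float = 30.0) -> float:
    """Score a post for feed ranking; by assumption, reshares dominate."""
    return (w_like * post.likes
            + w_comment * post.comments
            + w_reshare * post.reshares)

# An outrage-driven attack post with many reshares outranks a calmer
# policy post that has more likes but few reshares.
attack = Post("Opponent's plan will ruin you!", likes=200, comments=80, reshares=150)
policy = Post("Our five-point infrastructure plan", likes=900, comments=40, reshares=10)

for p in sorted([attack, policy], key=rank_score, reverse=True):
    print(f"{rank_score(p):>7.0f}  {p.text}")
```

Under these assumed weights, the attack post scores 5,100 against the policy post's 1,400, so the feed surfaces the attack first even though far fewer people liked it.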

A 2019 internal report shows that one political party estimated that, after the algorithm was updated, the mix of messages it posted shifted from half positive and half negative to 80 percent negative, according to the Wall Street Journal.

When negative attacks on competitors become the mainstream of communication, the middle ground of mutual understanding shrinks. With exposure as the bait, Facebook's algorithm is pushing political parties away from moderation and toward extreme antagonism.

Little possibility of persuasion

Facebook logo is seen displayed on a phone screen in this illustration photo taken in Poland, November 29, 2020. /CFP

Extremism, in turn, makes persuasion almost impossible.

When their parties are constantly under attack, users only become more resistant to the other side. Communication is no longer about understanding each other but about drawing boundaries.

And political parties have had to accept it. One expert estimated that 75 percent of Facebook spending in the 2020 American elections went to fundraising, with far less spent on persuasion.

Even as European political parties raise serious protests, they know they will have to keep skewing negative as long as Facebook does not overhaul its algorithm.

After a series of operations by Facebook, such as screening, amplification and presentation, raw voices are rewritten into what we now see as online public opinion. Unfortunately, what we see is probably far from reality.
