Screenshots of the Parler app from the App Store are seen on an iPhone in this photo illustration in Warsaw, Poland, January 10, 2021. /Getty
Editor's note: Bradley Blankenship is a Prague-based American journalist, political analyst and freelance reporter. The article reflects the author's opinions and not necessarily the views of CGTN.
After the events of January 6, when radicalized supporters of former U.S. President Donald Trump stormed the U.S. Capitol in an attempt to overturn the results of the November presidential election over blatantly false allegations of fraud, social media platforms have had to make tough decisions on how to manage their content to ensure public safety.
Twitter, for example, moved to delete tens of thousands of radical right-wing conspiracy accounts and even banned Trump himself while he was still president. This exposed the sheer power that social media giants have and called into question whether they should be allowed to exert that power unilaterally, especially when it's used in some instances and not others.
Top government bodies around the world, from the U.S. Senate to the European Commission, have sounded the alarm on the dangers social media and its business model pose to fundamental institutions. This has even been the subject of widely consumed books and documentaries for the general public. With so much already said and experienced, there is now a near-unanimous bipartisan consensus in the U.S. and a strong international multilateral foundation to rein in Big Tech over its real-world consequences.
As was already seen with Twitter's controversial moves against Trump and his supporters, social media giants are opting instead to self-regulate over fears of government interference and a blow to their bottom line.
In Facebook's fourth-quarter earnings call, its CEO Mark Zuckerberg said that the social media giant would begin reducing the amount of political content on its platform, going so far as to say that there were some groups he didn't want on Facebook "even if they don't violate [Facebook's] policies."
Zuckerberg added that "one of the top pieces of feedback that we are hearing from our community right now is that people don't want politics and fighting to take over their experience on our services."
Though Facebook has publicly stated that other platforms besides Facebook were largely responsible for hosting the Capitol mob's conversations to organize, it's clear from widely seen public posts – including livestreams – that this is inaccurate. Facebook has long profited from mass engagement with questionable content and top executives even shut down studies aimed at reducing divisiveness on the platform, according to a May report by the Wall Street Journal.
While it's clear to everyone that something needs to be done, is the complete opposite direction, i.e. total depoliticization, really the answer?
Mark Zuckerberg, Chief Executive Officer of Facebook, testifies remotely during the Senate Judiciary Committee hearing on "Breaking the News: Censorship, Suppression, and the 2020 Election," in Washington, D.C., November 17, 2020. /Getty
First of all, Zuckerberg has already given away exactly how this would play out at Facebook by essentially saying that the platform's policies wouldn't matter in deciding what content to pull. This makes clear that his personal preferences about what is and what is not political would become the new norm for content, effectively making him one of the most powerful political deciders in the world.
Facebook, like Google's parent company Alphabet, has a dual-class share structure that allows company executives access to "Class B" stocks that each carry 10 votes. Zuckerberg alone owns almost 90 percent of those shares, giving him effective control of the entire company and its editorial decisions.
While it would be great to believe that Zuckerberg or any other tech executive might know what's best for the world, this is simply impossible and even trying to put such power in the hands of a few people would be antithetical to Western liberal values.
The internet has its dark side, to be sure, but it has led to an inherently democratic feature of modern society, namely the fact that it has created unprecedented accountability for governments and corporations. It is now extremely difficult for corruption or official missteps to go unnoticed as people across the world become more and more connected. The internet has been, for example, one of the greatest drivers of the Black Lives Matter movement.
People are also coming into direct contact, through social media, with first-hand information about places they might never otherwise have encountered, which alleviates some of the problems associated with editorial controls in traditional media. For example, it would be much harder to have another Cold War scenario in the 21st century when global netizens can freely interact and share experiences whenever they choose to.
Basically, in principle, there is serious potential for average people to advance society by using social media. Equating this to the disinformation campaigns and hateful content that drive right-wing violence is simply a false comparison and would throw the baby out with the bath water. We cannot allow the likes of QAnon and genuine intellectual exchange to be lumped into the same "politics" basket just to save Big Tech from very much deserved regulation.
(If you want to contribute and have specific expertise, please contact us at opinions@cgtn.com.)