Fake News Is A Symptom

The problems of our modern public sphere are not the inevitable results of progress; addressing the “post-truth” crisis requires reckoning with the politics of technology.

Orwell feared that the truth would be concealed from us. Huxley feared that the truth would be drowned in a sea of irrelevance.

This line from Neil Postman’s work Amusing Ourselves to Death is as good a summary as any of the predicaments of our modern public sphere. 1984’s ruthless imposition of ideology may have been a good foil for Cold War fear mongering, but Brave New World contended that the far more effective approach would be to flood society with information and distraction. And today, we have an app for that. Or rather, a handful of multi-billion-dollar monopolistic tech platforms have apps for that.

Fake News Whack-a-Mole Is Not Enough

Research published last year at MIT found that false news stories on Twitter (defined according to independent fact-checking organisations) were 70% more likely to be retweeted than true stories, and that true stories took about six times as long as false stories to reach 1,500 people. A team at the Oxford Computational Propaganda Project observed that, for the first time ever, false news made up a greater percentage of the information circulating on Facebook than professional news—25% to 19%—in the lead-up to the U.S. midterm elections last fall. On YouTube, forget an academic study: you can just click the sidebar a few times to hit red-pill enlightenment. (Though if you want a report, here you go.) Further research has shown that while the average person may not encounter an overwhelming amount of false news directly, social media platforms enable powerful propaganda feedback loops between far-right extremists, mainstream media, and politicians. Research by sociologist Jessie Daniels has looked specifically at how algorithm-driven platforms “amplify and systematically move white supremacist talking points into the mainstream.”

The #FakeNews moment has led many a thought leader to wring their hands anxiously about an encroaching “post-truth” crisis, and there is merit to concerns that today’s digital platforms function as epistemological smoothie blenders set on high. The result may be a public divided not merely by different preferences or even beliefs, but by vastly polarized ways of coming to know things at all—an inability to form knowledge as a society.

This epistemological collapse is the broken scaffolding on which President Trump can give his State of the Union Address and claim “Wealthy politicians and donors push for open borders while living their lives behind walls and gates and guards. Meanwhile, working-class Americans are left to pay the price for mass illegal migration — reduced jobs, lower wages, overburdened schools and hospitals, increased crime, and a depleted social safety net.” A fantastically impressive reconfiguration of reality and the President’s own actions, and yet just one of an endless ocean of lies. But outrage only drives the believers further afield.

Perhaps predictably, a lot of effort has gone towards countering “fake news” itself, as if the phenomenon could be isolated to a solvable equation. Mark Zuckerberg routinely promises to shore things up and plug more data-leaking holes in his modern-day Titanic. Jack Dorsey gives the same winding platitudes about how tricky it is to balance free speech and also make so much damn money. Google put links to Wikipedia on certain conspiratorial YouTube videos (without paying Wikipedia), as if Wikipedia itself weren’t an informational conflict zone. Dozens of startups and initiatives have cropped up to label fake news or categorise news by ideological bias, because people who are sceptical of authority really like being told what’s true. Or blockchain! Blockchain will save us all.

Look, I don’t mean to be so glib. While many of these efforts amount to large-scale games of misinformation whack-a-mole, right now it’s better than nothing. It’s the “clean-coal” approach to social media. It’s what you do when you want to change things without really changing things.

But take the case of Myanmar — people on the ground were reaching out to Facebook about the use of the platform to incite violence as far back as 2014. Since then, hundreds of thousands of Rohingya people have had their lives thrown into chaos, fleeing mass violence and horrific sexual assault as their homes were destroyed by the military. “A textbook example of ethnic cleansing”, according to Zeid Ra’ad al-Hussein, the United Nations high commissioner for human rights. While it would be ridiculous to say Facebook caused the decades-long persecution of the Rohingya people, Facebook’s aggressive tactics to expand in the region have made it the primary news and communication source in the country: the de facto public sphere. As such, the platform rapidly became an amplification hub for misinformation, hate speech, and the organization of militant terrorism. Millions of people joined via Facebook’s “Free Basics” service, which provides free access on mobile devices without a data plan, and for many, Facebook became synonymous with the Internet itself.

At the very least, Mark Zuckerberg should have been urgently reaching out to experts and deploying staff to make sure Facebook’s growth wasn’t fueling genocide. Instead, just over a year ago there were reports that the platform was censoring posts by journalists and civilians trying to document the violence for the wider world to see, and in just the past few weeks Facebook’s efforts to address the issue have been seen by some experts in Myanmar as still demonstrating a failure to understand the situation and the company’s role in it. With no end to the violence in sight, there is also growing concern about what role Facebook will play in Myanmar’s 2020 elections.

It is self-evident that leaders like Zuckerberg lack the interest or the moral conscience to grapple with these problems in a serious way. Serious solutions would require reckoning with the complex sociology of our media and politics, issues that also demand humility. The most charitable reading possible is that they don’t grasp the gravity of what they are messing with, nor care to learn.

This lack of comprehension was again made clear in Zuckerberg’s public post earlier this month celebrating Facebook’s fifteenth anniversary, which paints Facebook as a tool of progress that has toppled the hierarchical ordering of society, as if Facebook were a decentralised conduit of a progressive force rather than itself a hierarchical company with a single, all-powerful leader. He writes: “there is a tendency of some people to lament this change, to overly emphasize the negative, and in some cases to go so far as saying the shift to empowering people in the ways the internet and these networks do is mostly harmful to society and democracy.” Y’know, genocide happens, change is hard.

I highly recommend reading Siva Vaidhyanathan’s open letter rebuttal to the post, but at the heart of Zuckerberg’s narrative is the claim that all this change is just an inevitability of technological progress, a story of technological determinism. In Zuckerberg’s framing, “Facebook” and “The Internet” are indeed essentially the same, and Facebook’s problems are just what happens when society transitions to bits and bytes.

The False Inevitability Narrative of Technology, And The Left’s Need to Re-write It

In a sense, Zuckerberg has a point. At the scale of two billion users, it is technically impossible to monitor everything, to intercept every bad actor and assess every image or phrase and understand the infinite contexts in which they emerge, even with billions of dollars to throw at the problem. The idea that we should actually want Facebook to do so is also an ethical dead end, as it amounts to ceding society’s epistemological challenges to a falsely objective set of algorithms. Objectivity was never real in the mainstream legacy media, though the BBC and CNN would wish it so, and it will never be possible in algorithms either. To claim “objectivity” has always been a tactic for claiming a kind of authority over knowledge itself, a power struggle like any other. One does not need to believe there are no truths, only to recognize that all truths as we know them are at some level constructed, discovered with all kinds of limited tools and biases at play.

Rather, Zuckerberg’s case for inevitability is itself a failure to understand the problem. First off, as sociologist Zeynep Tufekci has pointed out, “Security isn’t just about who has more Cray supercomputers and cryptography experts but about understanding how attention, information overload, and social bonding work in the digital era.” Truly, the domains of social science have a bright future cleaning up social media’s mess. But more importantly, “Even the free-for-all environment in which these digital platforms have operated for so long can be seen as a symptom of the broader problem, a world in which the powerful have few restraints on their actions while everyone else gets squeezed.”

The fact that we are dealing with these problems is the downstream consequence of political and societal decisions, as Wendy Liu has also argued in her piece Abolish Silicon Valley: a politics that allows an extractive system the scale of Facebook’s holdings to exist in the first place. A system of chaos capitalism, as coined by journalist Paul Mason, a Brave New World cyclone of commodification.

Just take the sheer ubiquity of surveillance capitalism. This is not an externality of tech companies doing business, it is the business. An exciting and lucrative business model for everyone from your credit card company to your favorite seemingly innocuous mobile game laced with banner ads. A model able to exist in a society that has falsely imagined that the market left to its own devices will work out for the best. The same is true of the algorithmic amplification of extremism and hate on these platforms — they are designed to be good at this, optimized to serve a digital advertising market free of interventions and overgrown in the absence of an updated antitrust framework. They might be the capstone to a system, but certainly not the foundation. Likewise, we as a public were primed for epistemological disintegration by a long history of anti-intellectual, anti-media rhetoric among conservative leaders that’s made trust in “the media” itself an issue of political identity, a right-wing media that revels in stoking anti-media rhetoric without a drop of irony to be found. And yes, sure, for our part at least, America was always a little irrational, a nation steeped in a long habit of rejecting rationality for the religion of personal liberty — no one can tell me what to do or what to believe.

And yet, to believe the inevitability narrative is to give in to a free-market nihilism afloat in a void of any serious social thought. One of the most powerful things about today’s digital technology is its power to obfuscate complexity, or in computer science terms, achieve higher levels of abstraction. An architecture that displaces direct human involvement as all the routine processing and decision-making disappears into an algorithmic black box, a system riddled with such complexity that no single person can understand how it all fits together. It’s a tempting allegory for understanding society itself — particularly if you build algorithms. “People do what they do, it’s too complicated to really understand why.” No need for your Conflict Theory to explain things; society is data, and the data is ineffable. In this worldview, all we can do is react to the inevitable march of progress. And don’t bother asking “progress for whom?”, it’s the apotheosis of man-and-machine; you’ll know it when you see it. And hey, change is hard.

In his book The Master Switch, Tim Wu demonstrates how powerful systems of technology and the ways they operate have always been inextricably tied to the rest of society and our politics. In 1876, a decade after the end of the U.S. Civil War, the Western Union company held an exclusive monopoly over telegraph technology in the U.S., operating lines that had been built by the Union army following the defeat of the Confederate South. Rutherford B. Hayes was running for President and was favored by Western Union, as he would likely enable their continued monopoly. Hayes was a weak candidate, however, and when it became clear he was likely to lose, Western Union sought to throw the election into question. At the time, the Associated Press wire news service was carried exclusively on Western Union telegraph lines. The two colluded to send instructions to Governors across the country to call the election into question, even though Hayes’ opponent Samuel Tilden decisively won the popular vote. The electoral vote was disputed in what remains one of the most contentious election outcomes in U.S. history, and in the end a deal was struck: the Compromise of 1877, which granted Hayes the presidency and, in return, ceded power back to the Southern states. The South proceeded to disenfranchise black voters, giving way to the era of Jim Crow terrorism against black citizens.

Technologies are specific; they are shaped by society, as we are shaped by them. There is no “dual-use” for an intercontinental ballistic missile, and the QWERTY keyboard isn’t the inevitable “most efficient” keyboard layout; it’s a frozen accident of consumer market demand. Likewise, at the risk of stating the obvious, technologies usually have to be designed for a specific purpose in order to do that thing really well. The fact that Facebook was not designed for mutual understanding or productive discourse is clear if you’ve ever, y’know, used Facebook. And there’s no reason to expect it ever will be designed to achieve these things so long as it is controlled by an ideology that dismisses as “overly negative” the assessments of academics and journalists, whose grievances are obviously just thinly-veiled angst at being left in the dust.

This is not the inherent ideology of technology. There are alternative paths; there have always been alternative visions. Or just take your pick of fully automated luxury gay space communism memes. The unfortunate reality is that Facebook plays a bigger role in today’s information ecosystem, our modern public sphere, than any single entity ever has in history. As such, at a strictly pragmatic level, emancipatory politics are harder, much harder, so long as misinformation and disinformation run unchecked on the dominant communication systems as they exist today. The left has used these platforms rather effectively to change hearts and minds, but so has the far right. I don’t see that changing any time soon, so we might as well demand the platforms at the very least hem in their role in textbook cases of ethnic cleansing while we create the preconditions for technologies with far greater social value and potential.

And if I’ve spent too much time picking on Facebook rather than its compatriots or the political leaders who have enabled their abuses, then I’ll freely admit my own bias: a knee-jerk reaction whenever Mark Zuckerberg insists on sticking his neck out to offer up deluded Silicon Valley propaganda. Or maybe I’m still not being charitable enough to the genuine truth in Zuckerberg’s belief — change is hard. Real change, the kind that requires a company executive to hand power over to the company’s workers, establish new hiring practices, and commit to solving the local problems of the community in its own backyard. Or possibly, to be dismantled altogether. The story that these technologies and their problems are inevitable is not just a misleading trope advanced by those who built them; it is a mandate for those of us on the left to state loudly that it is false, and to write the alternative, more complicated story. Ultimately, Brave New World is a fiction, not fact.

Last fall, a small crowd of developers and entrepreneurs gathered at a startup hub in downtown Austin, Texas, for the inaugural CredCon, an event geared toward jumpstarting prototype technologies for fighting fake news. The event’s kickoff speaker declared, “Your kids and your grandkids are going to ask you what you did to fight misinformation — and this is not hyperbole.” Unironic startup hyperbole notwithstanding, maybe that’s true. But I hope my kids and grandkids are smarter. I hope they’ll ask, “What did you do to fight those who held power and refused to relinquish it?”


Erik Martin (@erik_nikolaus)

Erik is a grad student studying Social Science of the Internet at Oxford University. He was previously a Policy Advisor at the White House Office of Science and Technology Policy during the Obama Administration, and managed education programs at the software company Unity.