Narrative Flash Crashes

As a civilization, we entertain plenty of myths about the way we never were. One of the most attractive is that we were born into a world unmediated by technology, a pure state of nature, before we gradually enhanced or corrupted that world with our inventions over the millennia that followed. From the simplest tools onward, of course, we’ve never been without technology: A lightbulb, a lit match, and a painting on the wall of a cave are all “technology,” extending our bodies, changing our brains, and remaking the environments we inhabit.

If we never had a clear, pure image of the world around us, that image was at least relatively stable even recently, if still distorted, shaped by all of the technologies that Marshall McLuhan called media: books, newspapers, radio, television, music, religious ceremony, language itself. In our hypothetical state of nature we may have apprehended what mattered directly—an approaching storm or a source of food—but that perception has since been wrapped in increasingly complex layers of language, imagery, and context—narrative—most of which was created by other people and transmitted to us, usually with the goal of influencing us somehow. We are narrative-loving animals, and we’re without narratives less often than we’re without clothing.

Like clothing, we’re also much more comfortable with narratives than without them (at least outside the house). We are terrible at apprehending reality directly, so we constantly grasp for some simplifying context wherever the possibility arises. Narratives are thus immensely powerful: Everyone needs them, and the people who create them program minds, to various degrees. Narratives are also extremely dangerous: Every destructive mass movement in history—Nazism, Stalinism, Maoism—has fueled itself with them.

Fortunately for all of us, narratives are difficult to wield individually. First, there’s a huge capital requirement: Every narrative requires a distribution channel. Do you own a media conglomerate or have a million Twitter followers? Second, and more importantly, our environments are full of information, much of which doesn’t fit a given narrative and some of which may even contradict it outright. If you read that all dogs are brown in the morning paper and see a white dog as soon as you leave the house, that’s a problem for that narrative. Good thing your environment doesn’t respond to your morning news, or even worse, collude with it.

But what if it did? What if the newspaper and the street you stepped out of your front door into were facets of the same underlying system, a system that generated its output in both domains using the same logic and rules? In this scenario, the propaganda you just finished reading in the paper populates your surroundings as soon as you finish. Now all dogs are in fact brown.

This depraved Matrix is our emergent reality, and Facebook is its foundation. Facebook’s long-standing fake news problem has risen to urgent prominence since the presidential election, thanks to its apparent role in stoking pro-Trump sentiment. If Facebook were just a “news website,” a digital version of some mid-century media form, there would be no problem: It would have lost credibility as a legitimate source of truth and repelled the users who recognized the unreliable nature of its content. But Facebook is an entire environment, a platform where many people live surprisingly huge portions of their lives, a public space as well as a newspaper, radio station, message board, and plenty else, condensing the multiplicative power of all of those media into its firehose blast (and in this broad sense Facebook is correct to argue that it’s not a media company). Ben Thompson characterized Facebook’s power as follows: “The use of mobile devices occupies all of the available time around intent. It is only when we’re doing something specific that we aren’t using our phones, and the empty spaces of our lives are far greater than anyone imagined.” By bundling together as many things as possible that the internet can offer, Facebook has ensured that we’ll come for something we care about and stay for the incidental bullshit, for hours on end.

[Image: a contemporary human habitat]

Again, Facebook is an environment in every sense but the physical, so its relationship to narrative truth and falsehood is more complex and powerful than anything fitting within our narrower definition of “media.” We immerse ourselves in Facebook to a fault, check in with it throughout the day, and even fall asleep with it. To compare it to an agora or neighborhood street is to underrate it, because it’s so much more of a public space than those physical places usually are today. Even worse, at least for the goal of curtailing fake news, Facebook responds to us, giving us tools to amplify the content we find emotionally resonant. Because we are narrative animals, this tends to be content that aligns with the existing narratives our brains are primed to accept. We like and share first and verify later, if ever. Rich narratives get richer and poor narratives get poorer, withering away unnoticed. That narrative snowball effect parallels the financial transformations of recent decades, also enabled by unrestrained digital technology, and the 2016 election was its trillion-dollar flash crash.

The fundamental problem with Facebook, then, is that narratives of all kinds get traction more quickly in digital terrain. The physical world is simply hard for individuals to control, especially where information and culture are concerned, but the digital is optimized for focusing and channeling and manipulating information—by definition. We’re not used to the power that can be exercised over narratives in these new spaces. The merging of media content and environment is another phase of what McLuhan called the retribalization of mankind, and we’re discovering that the negative connotations of that euphemism are as true as the positive.

Many have suggested solutions to Facebook’s fake news problem in the past week: internal changes to company policy or externally imposed rules. It’s possible these solutions would enmesh us in additional layers of algorithmic opacity, with new imperfect algorithms deployed to fix the flaws wrought by today’s imperfect algorithms. Andy Warhol defined art as whatever you can get away with, and any platform enabling its users to create and share their own content will be rife with experiments in what they can get away with. Fake news is likely a feature of these platforms rather than a bug. McLuhan’s most famous dictum was that the medium is the message; perhaps the blurring of fact and fiction is inherent to Facebook, and we can best negotiate that by actively reducing the platform’s grip on our personal and public lives.
