Between Futurism and Dark Euphoria

“Dark Euphoria is what the twenty-teens feels like. Things are just falling apart, you can’t believe the possibilities, it’s like anything is possible, but you never realized you’re going to have to dread it so much.”

-Bruce Sterling

The short twentieth century began in 1914 but the spiritual twentieth century started six years earlier when Filippo Marinetti lost control of his Fiat and plunged it into a muddy ditch outside of Milan, forging the Futurist manifesto’s introductory myth and launching the movement of the same name. Futurism, a pivotal moment in design modernism, celebrated the raw power of the car that Marinetti crashed and the many other fruits of the industrialism then transforming Italy, its adherents eager to discard the past and all its limitations. As Reyner Banham described Futurism, “the poet, painter, intellectual, was no longer a passive recipient of technological experience, but could create it for himself,” and Boccioni later proclaimed their cohort “the primitives of a sensibility that has been completely overhauled.” A new type of person was being born amid the roaring engines and factories.

Futurism, for all its shortcomings, was an admirable moment of raw exuberance about the potential of technology to make the world more interesting. The futurists celebrated machinery for its romantic possibilities and “universal dynamism” and rightly saw that it could be beautiful, not merely functional. You can imagine the ways this attitude (and the technology it celebrated) can and did eventually go wrong, but the Futurists dared to demand more from machines than economic production or convenience and in that sense their movement was a human triumph and an example to future generations.

work_173.jpg

       Boccioni, The City Rises (link)

A century later, the affluent, well-educated, city-dwelling beneficiaries of the information economy—now more easily grouped by the recent election’s fault lines—enjoy a more cautious, productive optimism about the technological progress of the internet age. Or did until recently. That broad optimism, so solid even a year ago, is one of the reliable constants that 2016 has upended. The list of reasons for growing wariness about today’s digital landscape includes almost nothing new: Twitter has become a snakepit of harassment and unchecked hate speech. Facebook blurs the line between biased truth and outright falsehood. Almost anything connected to the internet can and will be hacked. The platforms and apps we use the most have evolved into the most effective surveillance infrastructure imaginable, perfectly gift-wrapped instruments for a totalitarian regime. Most of our jobs will be automated out of existence just as America dismantles its remaining safety nets. And even the exponentially-growing internet has an insatiable appetite for fossil fuels that could make everything else irrelevant by submerging us all underwater.

Somehow, the past year—the election in particular—revealed how each of those risks could blow up. And some of them did. The internet doesn’t feel as fun today as it did when we first applied an Instagram filter or read a @Horse_ebooks tweet. To those of us too old to get it, Snapchat looks like fiddling while Rome burns rather than another platform for unbridled 2011-style internet frivolity. The technological progress that’s still capturing popular enthusiasm is largely either the residue of mid-century speculation (space travel, self-driving cars) or the consumer-facing tips of foreboding icebergs like AI and automation (Alexa, Amazon Prime).

The digital foundation onto which civilization has migrated, we’re finally acknowledging, is more fragile than we thought, though it continues to bring us countless real benefits. Even Jane Jacobs predicted an imminent dark age, and as we survey our recent cultural shift and its causes, it’s getting easier to see how formerly unthinkable dark ages set in. Again, the election solidified the narratives and camps: Some were already in revolt against the so-called cutting edge of technology and its eagerness to automate everything. Today more people are, for many of the above reasons. We thought we’d arrived at the end of history but now worry we’re just at another inflection point of its eternal cycle. In a year when something called a phishing attack is one of the major stories from a presidential election, it’s at least certain that things are getting weirder.

So we’ve trapped ourselves in a technological stack that unsettles us, but just as bad, we’ve built one that’s boring and prosaic and even cowardly, one that becomes most exciting when it fails. I came across a perfect statement of where we’ve ended up, by Geoff Manaugh, in the comments of a Kazys Varnelis blog post:

“(T)he people today most concerned with building flexible, just-in-time, climate-controlled interiors in which You Can Do Anything™ are less often swinging nightclub owners and far more likely Big Box retailers, with their clip-on ornaments and infinitely rearrangeable modular shelves and their themes of the week. There are already Christmas decorations up at Ikea. Similarly, the people building instant cities today aren’t the Balkan ravers of the 1990s (at least no more); it’s Camp Bondsteel and the logistics support teams of Bechtel. Or, for that matter, it’s the ‘megaslums.’ Either way, it’s not a leisure class of hi-fi-owning Jimi Hendrix aficionados.”

If anything, the Silicon Valley version of technology and progress is too bland and too conservative, because it needs to scale, optimize, add value—buzzwords as dull as the results they produce. There’s a tech industry aphorism that companies ship their org chart (a version of Conway’s Law), a more concrete way of saying people recreate the milieu in which they live and work—which in this domain involves a lot of Excel spreadsheets and PowerPoints. We’re thus experts at extracting convenience and consumer value from technology but worse than we should be at using it to explore unknown territory, have an adventure, understand ourselves, or even throw a better party.

David Graeber famously complained that his generation was promised flying cars, as well as force fields, teleportation pads, Mars colonies, and every other exciting sci-fi trapping, but instead got bureaucratic tedium and screens to stare at. A popular counterargument is that iPhones and the internet are, in fact, more amazing than flying cars, but that rebuttal sidesteps Graeber’s broader point that poetic technologies, “the use of rational and technical means to bring wild fantasies to reality” (the Futurists’ bread and butter) are becoming increasingly difficult to pursue. Not because we don’t know how, but due to a failure of culture and will. The Steve Jobs dictum that customers don’t know what they want until you show them reflects a belief that individuals don’t achieve their full potential until they become customers. One of Apple’s most masterful accomplishments was concealing that bait-and-switch.

Our society continues to lose the thread: Not only are we not building flying cars, we’re instead building hardware and software that increase our fragility, anxiety, and dissolution, even if they do streamline shopping and get our laundry picked up faster. We could use a new Futurism, a recalibration of the ultimate purpose of all this work we’re doing, something better than convenience and efficiency. Give the Balkan ravers a shot. We’re more technically capable than ever and can build whatever we want, supposedly, so if we’re going to keep trying let’s build something that’s beautiful or weird or something that increases our collective freedom.

One example of such an alternate reality, already 50 years old, is New Babylon, the Situationist utopian city/megastructure. It’s the kind of model we should keep creating to remind ourselves of the futures we could be pursuing but aren’t. Acts of imagination like this, fantastical as they may be, represent a step toward realizing the poetic technology we’re currently missing, as McKenzie Wark writes:

“Owning property affords someone a house in which to be at home, at the price of being homeless in the world. Dispense with property, dispense with separation, and the feeling of being merely thrown into the world goes with them. Our species-being can give vent to its wanderlust, at home in a house-like world. Constant thought modernity was already accelerating a return to a nomadic existence. New Babylon is nomadic life fully realized. It is an architecture of duration, of thresholds, of collaborative place-making, writ large. Freed from the fixity and uniformity of property, space could again have its qualities. A short trip in New Babylon should offer more variety than the most interminable journeys through the concentrated city of spectacular society. ‘Life is an endless journey across a world that is changing so rapidly that it seems forever another.’ The New Babylonians could wander over the whole surface of a world that was in flux. ‘New Babylon ends nowhere (the earth is round).’”

Flying cars are probably a bad idea, and megacities cantilevered above the earth probably are too, but there are a thousand other desirable and already-possible technological outcomes that we’re failing to imagine or seek, beyond the pale of what we already have.


Cars, Clothes & Battlesuits

Cars belong near the top of a long list of reasons why America is the way it is, but one American quality I’ve never heard attributed to cars is our increasing casualness of dress, which seems to have mirrored our impulse to drive during the past century. There’s no obvious connection between the two phenomena, yet whenever I leave New York for a more American part of America I remember that the city where we drive the least is also our least casual and I wonder if cars are somehow the cause.

Sweatpants, flip flops, sneakers, t-shirts, and baseball hats have pervaded nearly every realm of life except for weddings and funerals, slowly conquering former bastions of formality like the workplace. Technology has been a factor: Watches are mere jewelry now thanks to the digital clocks that accompany us in our phones, but more broadly than that specific side effect, all technology, in a sense, clothes us, augmenting our natural faculties and our bodies. Clothing itself is a technology—Marshall McLuhan called it an extension of our skin that stores and channels energy, an increasingly tactile shell that (especially in America) overthrew the more visually-oriented attire of aristocracy.

For McLuhan, clothing and housing are two different layers of our technological exoskeletons. The city is yet another layer. If the human body moves through the world encased within a protective stack including these components, surely the car has a place in that stack as well, somewhere between the clothing and city layers. Furthermore, we should observe significant cultural differences where any of the layers is minimized or absent altogether, with adjacent layers intensifying to compensate. This is my theory of clothing’s amplified role in New York, where the car layer is anemic compared to elsewhere. When I visited my hometown of Indianapolis over Thanksgiving I didn’t bring a coat with me, knowing that I’d spend all my time in my house or in a car, save for short walks across parking lots. “My car is my coat” was a dumb joke I made. I found myself wondering why anyone who has a car would bring their coat on most errands, or even own an expensive one for daily life.

walking city archigram.png

   Archigram’s “Walking City” (source)

The more layers that encase us, the less is demanded of our bodies themselves. To follow this dynamic to its logical conclusion is to end up with inert humans hibernating in the fluid-filled pods of The Matrix, naked and fully immersed in an advanced technological stack, wrapped in the multiple layers of protection it offers. Wearing sweatpants and spending whole days surfing the internet is not entirely different from that extreme scenario, while traditional urban fabric seems anachronistic by comparison: Walking outdoors on the streets of dense cities, we’re vulnerable and suboptimized, clad in boots and coats rather than temperature-controlled pods of the automotive or Matrix variety.

If cities are an outer protective layer in this ecosystem, on the other hand, maybe we’re not so vulnerable after all—see Matt Jones’ 2009 blog post “The City is a Battlesuit for Surviving the Future,” arguing that the outer urban shell is the most important layer of all. The post’s title refers to Jack Hawksmoor, the protagonist of Warren Ellis’s comic book The Authority, who wraps himself in the city of Tokyo to fight a sentient, time-traveling version of 73rd-century Cleveland. Jones observes that we increasingly wrap ourselves in the city as a defense against all the forces of nature that have assailed humanity throughout history, and in the networked present and future, this can become more true than ever. Jones writes, “The city of the future increases its role as an actor in our lives.” A stronger outer shell, the city, might then enable a more humane life within, while a compromised city layer shifts its functions onto the house and car layers, dividing its inhabitants into more atomized enclaves.

The networked city Jones describes is a different animal than its forebears, and is somehow more tactile than even McLuhan anticipated. We now inhabit the meatspace city, whose previous functions of information exchange have increasingly migrated to digital channels (which are, of course, embedded in its physical fabric). The features of traditional urbanism most likely to intensify under this new regime, to the extent that it continues spreading, are the most difficult to encode: eating, drinking, shopping for specialized merchandise, and the more precious types of human interaction. The meatspace city is a construction of affluence and is far from ubiquitous—it might never be—but even it presents a more comfortable and convivial interior than the car in the suburban wilderness. We’ve lost some of the optimism about networked urbanism that we enjoyed in 2009 when Jones wrote his battlesuit piece, but many of the reasons for that loss are pervasive, merely most visible in cities, which are still better armor than most of the substitutes we’ve tried.


Narrative Flash Crashes

As a civilization, we entertain plenty of myths about the way we never were. One of the most attractive is that we were born into a world unmediated by technology, a pure state of nature, before we gradually enhanced or corrupted that world with our inventions over the millennia that followed. From the simplest tools onward, of course, we’ve never been without technology: A lightbulb, a lit match, and a painting on the wall of a cave are all “technology,” extending our bodies, changing our brains, and remaking the environments we inhabit.

If we never had a clear, pure image of the world around us, that image was at least relatively stable even recently, if still distorted, shaped by all of the technologies that Marshall McLuhan called media: books, newspapers, radio, television, music, religious ceremony, language itself. In our hypothetical state of nature we may have apprehended what mattered directly—an approaching storm or a source of food—but that perception has since been wrapped in increasingly complex layers of language, imagery, and context—narrative—most of which was created by other people and transmitted to us, usually with the goal of influencing us somehow. We are narrative-loving animals, and we’re without narratives less often than we’re without clothing.

As with clothing, we’re much more comfortable with narratives than without them (at least outside the house). We are terrible at apprehending reality directly, so we constantly grasp for some simplifying context anywhere the possibility arises. Narratives are thus immensely powerful: Everyone needs them, and the people who create them program minds, to various degrees. Narratives are also extremely dangerous: Every destructive mass movement in history—Nazism, Stalinism, Maoism—has fueled itself with them.

Fortunately for all of us, narratives are difficult to wield individually. First, there’s a huge capital requirement: Every narrative requires a distribution channel. Do you own a media conglomerate or have a million Twitter followers? Second, and more importantly, our environments are full of information, much of which doesn’t fit a given narrative and some of which may even contradict it outright. If the morning paper tells you that all dogs are brown and you see a white dog as soon as you leave the house, that’s a problem for the narrative. Good thing your environment doesn’t respond to your morning news, or even worse, collude with it.

But what if it did? What if the newspaper and the street you stepped out of your front door into were facets of the same underlying system, a system that generated its output in both domains using the same logic and rules? In this scenario, the propaganda you just finished reading in the paper populates your surroundings as soon as you finish. Now all dogs are in fact brown.

This depraved Matrix is our emergent reality and Facebook is its foundation. Facebook’s long-standing fake news problem has taken on new urgency since the presidential election, thanks to its apparent role in stoking pro-Trump sentiment. If Facebook were just a “news website,” a digital version of some mid-century media form, there would be no problem: It would have lost credibility as a legitimate source of truth and repelled the users who recognized the unreliable nature of its content. But Facebook is an entire environment, a platform where many people live surprisingly huge portions of their lives, a public space as well as a newspaper, radio station, message board, and plenty else, condensing the multiplicative power of all of those media into its firehose blast (and in this broad sense Facebook is correct to argue that it’s not a media company). Ben Thompson characterized Facebook’s power as follows: “The use of mobile devices occupies all of the available time around intent. It is only when we’re doing something specific that we aren’t using our phones, and the empty spaces of our lives are far greater than anyone imagined.” By bundling together as many things as possible that the internet can offer, Facebook has ensured that we’ll come for something we care about and stay for the incidental bullshit, for hours on end.

computer-setup-wall-mounts.jpg

  A contemporary human habitat (source)

Again, Facebook is an environment in every sense but the physical, so its relationship to narrative truth and falsehood is more complex and powerful than anything fitting within our narrower definition of “media.” We immerse ourselves in Facebook to a fault, check in with it throughout the day, and even fall asleep with it. To compare it to an agora or neighborhood street is to underrate it, because it’s so much more of a public space than those physical places usually are today. Even worse, at least for the goal of curtailing fake news, Facebook responds to us, giving us tools to amplify the content we find emotionally resonant. Because we are narrative animals, this tends to be content that aligns with the existing narratives our brains are primed to accept. We like and share first and verify later, if ever. Rich narratives get richer and poor narratives get poorer, withering away unnoticed. That narrative snowball effect parallels the financial transformations of recent decades, also enabled by unrestrained digital technology, and the 2016 election was its trillion-dollar flash crash.
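The snowball has a simple mechanical core, and a toy simulation makes it visible. The sketch below is illustrative only (it resembles a Pólya urn, not Facebook’s actual ranking system, and every name in it is invented): a feed that shows stories in proportion to the engagement they have already received.

```python
import random

def simulate_feed(n_stories=10, rounds=10_000):
    """Toy rich-get-richer feed: exposure is proportional to past shares.

    A Polya-urn-style illustration, not any real platform's algorithm.
    """
    shares = [1] * n_stories  # every story starts with one share
    for _ in range(rounds):
        # A story's chance of being shown grows with its share count...
        story = random.choices(range(n_stories), weights=shares)[0]
        # ...and being shown earns it another share, and more exposure.
        shares[story] += 1
    return sorted(shares, reverse=True)

# Typical run: a few stories pull far ahead of the rest, and at no point
# does the model consult anything resembling their truth value.
print(simulate_feed())
```

Nothing in the loop knows or cares whether a story is true; accumulated engagement is the only input, which is the whole problem.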

The fundamental problem with Facebook, then, is that narratives of all kinds get traction more quickly in digital terrain. The physical world is simply hard for individuals to control, especially where information and culture are concerned, but the digital is optimized for focusing and channeling and manipulating information—by definition. We’re not used to the power that can be exercised over narratives in these new spaces. The merging of media content and environment is another phase of what McLuhan called the retribalization of mankind, and we’re discovering that the negative connotations of that euphemism are as true as the positive.

Many have suggested solutions to Facebook’s fake news problem in the past week: internal changes to company policy or externally-imposed rules. It’s possible these solutions would enmesh us in additional layers of algorithmic opacity, with new imperfect algorithms deployed to fix the flaws wrought by today’s imperfect algorithms. Andy Warhol defined art as whatever you can get away with, and any platform enabling its users to create and share their own content will be rife with experiments in what they can get away with. Fake news is likely a feature of these platforms rather than a bug. McLuhan’s most famous dictum was that the medium is the message; perhaps the blurring of fact and fiction is inherent to Facebook, and we can best negotiate that by actively reducing the platform’s grip on our personal and public lives.


Bored by Randomness

The PlayStation game No Man’s Sky promised a revolution for its medium before its release two months ago, getting attention from gamers and non-gamers alike for its “procedurally-generated universe” in which a single 64-bit seed number could randomly create 18 quintillion distinct planets via algorithmic logic, each replete with its own weird flora and fauna. The space explorers playing the game would effectively create each planet upon discovery: Arriving somewhere undiscovered would spur the procedural generation of a random, new, and hopefully fascinating world. It was going to be a major step toward humans getting out of computers’ way in yet another domain, after giving the machines sufficient instructions to make 18 quintillion of something that other people would actually want.
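The pitch is easy to sketch in code. Below is a minimal toy version of seeded procedural generation (not No Man’s Sky’s actual algorithm; every name and attribute is invented for illustration), showing how a single 64-bit number can deterministically imply quintillions of distinct worlds, none of which need to be stored anywhere:

```python
import random

UNIVERSE_SEED = 0x5F3759DF  # one number stands in for the whole universe

def generate_planet(planet_id):
    """Derive a planet's attributes deterministically from the universe seed.

    The same (seed, planet_id) pair always yields the same planet, so a
    world exists only at the moment someone arrives to observe it.
    """
    rng = random.Random((UNIVERSE_SEED << 64) ^ planet_id)
    return {
        "radius_km": round(rng.uniform(1_000, 12_000)),
        "atmosphere": rng.choice(["none", "thin", "toxic", "breathable"]),
        "terrain": rng.choice(["ocean", "desert", "jungle", "tundra"]),
        "fauna_species": rng.randint(0, 40),
    }

# Any 64-bit planet_id addresses one of ~18 quintillion possible worlds.
print(generate_planet(8_675_309))
```

Even in a toy like this the limits are visible: every world is a fresh draw from the same few distributions, so the possibility space is astronomically wide but uniformly shallow.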

Not surprisingly, No Man’s Sky was boring. Its beautiful graphics couldn’t overcome the fact that, on one planet randomly selected from the infinite possibilities in the procedurally generated universe, nothing was happening. The variety among these planets was shallow and nominal, the 99.999% virgin territory untouched by any hand that could form it into something interesting. As one reviewer wrote, “There are no grand civilizations sequestered somewhere in this galaxy with Turing Test-passable aliens waiting to wow us with riveting conversation.” The procedural generation process, additionally, means the only parts of the universe that exist are the ones you see—a solipsistic vision of reality that is, again, boring.

Maybe some of the No Man’s Sky planets did end up with compelling advanced civilizations and weird creatures, but they’re too hard to find. Every baseball season has roughly 2,430 games and if you watch a random handful of these you’ll find them boring as well (unless you go to the games and sit in the sun and eat hot dogs and do everything but watch). Each baseball game is somewhere on a bell curve of expected outcomes so a single randomly chosen game probably won’t yield a no-hitter or two grand slams by one player in the same inning.

The randomness of baseball is more interesting than that of No Man’s Sky though, because it’s wrapped in the context of an existing culture and infused with meaning from that culture. There are also other ways to make baseball (slightly) more interesting for yourself: Become a fan of one team whose 162 games will excite you more than the other 2,268. The 70 to 90 games that your team wins will especially excite you, and the ones with lots of home runs even more so. Or you could adopt a different strategy and only watch the postseason, in which every game matters and is inherently nonrandom.

1-650x366.jpg

   What a procedurally generated planet looks like

What a random universe really needs is editing, in other words. 18 quintillion of something is great for advertising copy but terrible for experience. Some work by other people is still required to make a massive procedurally-generated universe interesting, to put some meaning into it—a map or search mechanism to guide players toward the best parts; a cultural context that imposes meaning on the existing randomness; or a few planets created by human hands. Most games that painstakingly create one world are better than this game with its quintillions of mass-produced worlds.

The 55 snapshots of imaginary cities in Italo Calvino’s 1972 novel Invisible Cities are the opposite of the ennui in No Man’s Sky. Each fantastical city that Calvino’s Marco Polo recalls from his world travels, while also invented (and generated by Calvino’s “rules”), is rich with the meaning and magic that No Man’s Sky promised but couldn’t produce. Calvino’s own work—the brilliant imagination that enabled him to craft each city’s description as well as the editing that removed the meaningless noise in between those vignettes—is why his 55 worlds will outlive the 18 quintillion in No Man’s Sky. 55, it turns out, is plenty.


Five Things

Here’s an assortment of the best things I’ve read on the internet lately, none of which were published in the past week:

Walking to the mall: Ten years ago Tom Chiarella published this essay in Esquire about, well, walking to the mall—in Indianapolis, where walking to the mall makes even less sense than elsewhere. He imbues the journey with an epic quality, stretching it out in time, without forgetting the absurdity of what he’s doing or the nondescriptness of shoulders and embankments meant to be seen at 40+ miles per hour, if at all. In short: You’re not supposed to walk to the mall. A perfect description of the suburban carscape and a nonfiction companion to Ballard’s Concrete Island.

Wardriving: Some family in rural Kansas is being terrorized by strangers because their farm occupies the default geographic coordinates for IP addresses with unknown locations (more of what I described as digital NIMBYism). Companies like MaxMind apparently compile computers’ locations in online databases for sale to advertisers. One of the techniques for collecting that data caught my attention: wardriving, or sending cars driving around to physically collect IP addresses from open wi-fi networks. I keep picturing armadas of vehicles roaming small towns in middle America like the darknet version of Google’s Street View cars. In case you were wondering, the term wardriving is a reference to the “wardialing” done by Matthew Broderick’s character in the movie WarGames.

The Nostalgic Comfort of Normcore Dining: In the search-don’t-sort augmented reality that Google/Yelp/Foursquare ushered in, being ordinary is the only way to hide. I didn’t understand what normcore was until I read this.

Untitled-Freeway-Crash-20-001.jpg

   Concrete island (source)

Our Brand Could be Your Crisis: One of the best pieces I’ve come across in a while. I still haven’t seen the Zac Efron movie, We Are Your Friends, that Ayesha Siddiqi reviews here (I’m going to!) but she accomplishes the impossible, writing a thoughtful—brilliant, really—essay about that dreaded topic, the millennial. Required reading for anyone wanting a better grip on the current zeitgeist, the one you and I are too old to understand, just like Snapchat. Siddiqi also sees in the film the cultural evidence of our slow, ongoing economic collapse, which manifests itself in such subtle ways (what Bret Easton Ellis calls post-empire). “We can invent an app, start a blog, sell things online” could become a mantra for all of us. I’ve been looking for a way to build a longer post off of ideas embedded in this essay, but until then I’m stashing it here. Read it. 

Walmart: Last month Bloomberg ran this darker companion to the above essay. This is a truly dystopian look at how much crime happens on Walmart’s properties and the problems that crime creates for the local police forces that have to deal with it all, not to mention the cities from which the megastores have carved out a big privatized chunk. Corporate commercial space is not public space. Enjoy your weekend!


Occupy Twitter

It’s increasingly obvious that Twitter as we know it will stop existing before long. Maybe Facebook will buy and dismantle it; maybe it will successfully turn itself into the profitable ad-friendly platform that all of its users dread (it won’t); or maybe it will just disappear, bleeding away its remaining users as it’s already been doing until there’s nothing left but bots and clueless self-promoters and hateful egg avatars with ten followers each.

Twitter has already embraced the algorithmic feed, which is as shitty as expected, and it will further relax the 140-character tweet limit next week. Having shed its two defining features, Twitter will become a worse Facebook timeline, recognizable only by its inability to curtail trolls and harassment.

flint-demo.jpg

  The traditional shrinking city (source)

I wrote in February that Twitter was a shrinking city but now it’s a city in full collapse. The parallels abound: a growing presence of unchecked dysfunction; an exodus of permanent citizens along with their economic contributions; the creeping presence of opportunists who hope to buy up its valuable parts and trash the rest; the sense that it was a better place back in the day.

One way or another, you (if you still use Twitter) and I will probably have to leave Twitter eventually. This is a true tragedy—many of us only talk to one another on Twitter and could never have formed certain communities without it. Like every collapsing community, Twitter is sure to further debase itself before finally forcing us all out, ensuring a messy exodus.

We should all keep in touch. Let’s decide now where we’re meeting up after Twitter dies. I suggest we meet in Zuccotti Park. If we’re lucky @dril will show up.

We should meet in Zuccotti Park because the internet isn’t the free outlet or the escape from physical constraints that it once was. Occupy Wall Street celebrates its fifth anniversary this week, and five years is a long time. In 2011, Twitter was cool—cool enough that it could function as a support system for a movement like Occupy. Now, Twitter is dying because it can’t survive in an ecosystem that requires it to grow profitably, and the internet is no longer an escape from overprogrammed, corporate urban space but more and more a mirror of that space, which forces out the weird and the unmonetized.

More than it did five years ago, a place like the Zuccotti Park of Occupy Wall Street now feels like a haven from the internet’s panopticon: maybe still a place to make a noise, but not a noise that the internet would reliably amplify. If Twitter continues its decline, there will be few digital spaces left that do what it did in its prime, but maybe physical space can again.


Invisible Maps, Beautiful Numbers

“Where we’re going, we don’t need roads,” Doc Brown announces at the end of Back to the Future. Revisiting that line today, roads are a sure thing in the foreseeable future—it’s more likely that where we’re going roads won’t need us.

For maps more than roads, the future is uncertain. Maps are as important as ever but somehow vanishing from their familiar haunts and reappearing everywhere else. A little more than a decade ago, a map was something you found printed on paper that helped you fumble toward a new destination, an item you packed for a road trip. Now maps show up anywhere, used in daily life passively and actively, guiding us through the familiar as well as the unknown, appearing in NY Times blogs, on TV screens in the middle of transatlantic flights, and in almost every video game. Throughout the spectrum between pure entertainment and pure utility there are maps everywhere.

Maps got more important after iPhones ensured we’d have them at our fingertips all the time. Tools began emerging that used maps in more sophisticated ways—searching for what’s nearby (Yelp, Tinder), tagging locations (Facebook, Instagram), or augmenting geographical reality (Pokémon Go). Driving, of course, expanded the everyday need for maps decades before GPS devices could narrate directions in real time, if not eventually drive the cars themselves.

In most of the examples just listed, there’s no fundamental need to read a map or even see one. The map just works in the background, another invisible algorithm that frames reality. Navigational maps are going “under the interface,” as I’ve written before. Nicholas Carr observes, “It would seem to be the golden age of maps and map-reading. And yet, even as the map is becoming omnipresent, the map is fading in importance.”

I wouldn’t put it that way, but we should decide what we mean by “map.” To Carr, a map is visual—a geographical diagram that you look at as you work out how to get where you’re going. But if a map is defined more broadly, a kind of logic, a protocol for navigation, then maps are certainly not losing importance—they are just becoming invisible as they disseminate everywhere (and invisibility is the destiny of so much advanced technology in the digital age).
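To make the distinction concrete, here is a minimal sketch of the map-as-protocol (the places and coordinates are invented, and no real service’s API appears): a “what’s nearby” search that computes great-circle distances with the haversine formula and ranks the results without ever drawing anything a person would recognize as a map.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearby(lat, lon, places, radius_km=1.0):
    """Return the names of places within radius_km, nearest first."""
    scored = [(haversine_km(lat, lon, p["lat"], p["lon"]), p["name"]) for p in places]
    return [name for dist, name in sorted(scored) if dist <= radius_km]

# Hypothetical places near Union Square, Manhattan.
places = [
    {"name": "coffee shop", "lat": 40.7359, "lon": -73.9911},
    {"name": "bookstore", "lat": 40.7333, "lon": -73.9910},
    {"name": "hardware store", "lat": 40.7484, "lon": -73.9857},
]
print(nearby(40.7359, -73.9905, places))  # ['coffee shop', 'bookstore']
```

The output is a ranked list, maybe a pin or two on a screen; the map itself, the lattice of coordinates and distances, never surfaces.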

Carr describes how the look of Google Maps has evolved over time to show less detail and less text (“as a cartographic tool, Google Maps has gone to hell”) while becoming more aesthetically pleasing. This shift may be Google’s effort to optimize its maps for the smaller screens of smartphones, but that doesn’t quite explain it.

The real reason, Carr suggests, is that pictorial maps themselves don’t need to do much—when it’s time to actually navigate, the user enters a destination and gets an optimal route with turn-by-turn directions on a separate screen: “As a navigation aid, the map is becoming a vestigial organ. So why not get rid of the useful details and start to think of the map as merely a picture or an image, or a canvas for advertisements?”

Maps are being unbundled—split into their functional and aesthetic components. Is reading a map something humans are even meant to do?

Recently, after having to memorize a number for a reason I’ve already forgotten, the following thought popped into my head: Whenever a person is dealing directly with a number, that’s a task that a computer will eventually do.

The modern world has been saturated with numbers long enough to make them feel organic, but humans and numbers mix like oil and water. We are always trying to turn numbers into narratives because we hate numbers and love narratives, and we have animal brains that deal better in generalities than precise calculations. Thus, we buy lottery tickets and feel that flying is more dangerous than driving. Even memorizing numbers is hard for us, but we used to carry countless seven- or ten-digit phone numbers around in our heads because we had no choice.

DP236148.jpg

          Charles Demuth; the future of numbers (source)

By any rational standard, people cannot be trusted with numbers. We got much better at using numbers to our advantage once computers came around to do all the hard work for us. By now, we don’t have to remember many numbers (or many other things), much less do anything with those numbers. We’ve even outsourced remembering birthdays to Facebook. Increasingly we can embrace our natural narrative inclination. The world has become more data-driven because there’s more data being captured, but people are not becoming more data-driven; the tools we operate are.

As numbers go under the interface, like maps, they’ve disappeared from their usual places in the visual landscape. You see a phone number once—when you add it to your address book—and then it becomes a name, forever mapped to a person you know. Online banking eliminated the arithmetic of balancing one’s checkbook. ID numbers and passwords get saved in digital notes or email drafts and copy/pasted (a practice certain to be replaced by a more elegant solution). The list goes on and on. Many still deal with numbers in more sophisticated ways at their jobs, but such work stands at the frontier of what will be eaten by software eventually.

The reason for computers’ widespread adoption is not that they further immerse us all in the world of numbers, the machines’ native language—it’s that they help us escape from numbers and go back to what we do best, which is almost everything else.

In the 1960s, Marshall McLuhan wrote that “the horse has lost its role in transportation but has made a strong comeback in entertainment.” Numbers and maps are undergoing a similar transition now. Both have the same future in the human-readable landscape: aesthetic symbolism. The numbers that matter outside of software aren’t for memorization, addition, or multiplication, but for cultural signification: infographics, athletes’ jersey numbers, famous addresses (1600 Pennsylvania Avenue), the numbers in social media handles. Maps are better than ever as data visualization tools, but a map seen by a human is the last stop for that data, its sublimation into the realm of the irrational.

Steve Jobs understood this future of digital information and created its look. Apple’s sleek devices and operating systems became the dominant aesthetic of the digital age and did for numbers what Google and others would, in a different way, do for maps, hiding them beneath smooth aluminum surfaces, uncluttered interfaces, and rectangular icons with rounded corners. When we see maps or numbers now, we expect them to look good. Numbers still appear on iPhones where they absolutely must, but these exceptions prove the rule: The red notification badges communicate most of their information through color and shape, not the digit in the middle of each circle.

With fewer maps and fewer numbers to process ourselves, we glimpse a surprising future of algorithmic premodernism. McLuhan said that television, radio, and phones were “retribalizing” mankind by circumventing the role of the printed word and returning us to the mental and social patterns of primitive oral cultures. Apple and Google, in their own way, are completing that retribalization process, freeing us from a few more bulwarks of this rationally-biased era and synthesizing the machine age with ancient tendencies our brains still haven’t outgrown. If maps still look good to us after that synthesis, we’ll decorate the walls with them.