In YouTube’s earliest days, when people were still blowing up Diet Coke with Mentos, responses to the site’s curious videos tended to fall into two categories: astonishment and skepticism. First came the huzzahs. “Better than Hendrix!” one commenter wrote in 2006, praising a shred guitar solo that was YouTube’s most-discussed video for a time. The commenter surrendered to the spectacle. Then came the forensics. “Look at 1:14,” another wrote of the same video. “It’s not in sync. He’s not really playing.” In 2007, this magazine published a piece called “Why I Hate YouTube/Google Video,” which took aim at an innocuous video showing a man counting to 1,000.

This “disgusting mouth breather … will make you want to choke the life from him,” argued the piece, a virtuoso expression of fearful resistance to YouTube, which at the time seemed poised to choke the life from the whole internet. The same anxiety-turned-contempt attends much of today’s social media, notably Twitter and Snapchat, where the sheen of fatuousness, cryptic UX, and clubhouse jargon appears designed to humiliate and enfeeble.

Anxiety is much more than a rookie response to internet-borne humiliation and weakness; sometimes it seems like the animating principle of the entire commercial web. That’s part of the reason for our decade-old retreat to apps, where McModern design and the illusion of walls seem like a hedge against the malware and rabble of the original web metropolis.

But leave standard consumer software aside and you’ll find that straight panic haunts the latest phase of digitization. Virtual reality, AI, the blockchain, drones, cyberwarfare—these things spike the cortisol in everyone but power users of GitHub and people with PGP session keys in their Twitter bios. The recent obsession with whether Barack Obama and James Comey did the right thing when confronted with evidence of Russian cyberattacks in 2016 misses the point: No one—no world leader, no FBI director, no masterful subredditor—knows exactly what to do about cyberattacks. The word alone is destabilizing.

Closer to home, I can’t even think about bitcoin lately without suffering the vapors. If only I had taken the time to listen to that bitcoin ranter on Acela seven years ago and bought $5 worth of bitcoin, I’d have $4 million today.

Five dollars! Clearly I’ve been excessively cautious with cryptocurrency, but I’ve also been spectacularly undisciplined in other areas of my digital life, leaving my phone and laptop eminently hackable. I have too many apps running in my brain’s background, overheating it. What other opportunities am I missing, then—or flanks am I leaving unguarded?

The cycle of doubt and self-doubt—bitcoin sounds fishy; I’m an idiot for thinking bitcoin sounded fishy—can turn palpable, somatic. And that’s when it starts to seem clear that what we’re doing with software is not just interacting with machines, something our emotionally detached left brains can deal with. What we’re doing still, after all these years, is seeking serviceable metaphors that will make sense of the digital onslaught, trying to match its many facets, in scale and tenor, with traditional human experiences.

Of course, every metaphor carries its own baggage. For example, maybe bitcoin is “money.” Money surfaces all the emotional chaos surrounding credit, debt, thrift, riches, banks, bankruptcy. Or maybe bitcoin is a weapon, or cult esoterica. Maybe it’s the dark internet or benign nonsense. With any of those hypotheses comes a set of associations, aversions, even attractions. At the same time, metaphors are poor things that never adequately illuminate the things they stand in for. We’re practiced in the old connections from glittery gold to paper dollars to all that money connotes, but connecting bitcoin (which I defy any reader to clearly picture) to good old coins, the minted ingots used by our grandpas and ancient Romans alike, is a taxing mental operation.

To put it simply: Much of digital technology seems to be, in the words of our YouTube debunker, not in sync. It doesn’t quite track. Twitter emotion doesn’t rise and fall the way human emotions do. Similarly, death, final by definition, is not final in Super Mario Odyssey. GPS tech is not true to the temperature and texture of physical landscapes. Alexa of Amazon’s Echo sometimes seems bright, sometimes moronic, but of course she’s neither; she’s not even a she, and it’s a constant category error to consider her one.

Living in the flicker of that error—interacting with a bot as if its sentiments were sentiments—is to take up residence in the so-called uncanny valley, home to that repulsion we feel from robots that look a lot, but not exactly, like us, a phenomenon identified nearly 50 years ago by robotics professor Masahiro Mori. When something gets close to looking human but just misses the mark—like that CGI creep in The Polar Express—it induces fear and loathing, the exact opposite of affection.

Mori used the notion of the uncanny valley to describe a restrictive aesthetic response to robots. But the internet, by aiming to represent a monstrous range of human experiences that includes everything from courtship and commerce to finance and war, introduces a near-constant dysphoria. An uncanny experience registers like a bad note to someone with perfect pitch. And bad notes are everywhere on the internet. Queasy-making GIFs, nonsense autocorrect, memes that suggest broken minds. The digital artifacts produced on Facebook, Instagram, and Spotify are identifiable as conversation, bodies, and guitars, and yet they don’t sync with those things in the three-dimensional world. Our bodies absorb the dissonance, and our brains work overtime to harmonize it or explain it away.

Consider my stock response to a friend’s honey-colored Instagram photos that show her on a yacht in Corsica. Is that what life is supposed to look like? Why does my own life by this loud municipal swimming pool look sort of—but not really—like that? We rightly call this jealousy, but the comparison of one’s multisensory experience to a heavily staged photo, passing for existence, entails cognitive discomfort too.

My poolside afternoon changes millisecond to millisecond. It also has a horizon line; robust unsweetened audio (yelps of “Marco” and “Polo” in my daughter’s pool-glee voice); a start-and-stop breeze; the scent of a nearby grill; ever-changing and infinitesimal shades that elude pixelation and even suggest hues outside the visible spectrum, like the ultraviolet my sunblock (scented to evoke the tropics) is meant to guard against. What’s more, because it’s my experience, this scene is also inflected by proprioception, the sense of my own swim-suited body present in space. Compared to this robust and fertile experience as a mammal on Earth, isn’t it the Instagram image that’s thin, dry, and inert? “The imagined object lacks the vivacity and vitality of the perceived one,” philosopher Elaine Scarry wrote in Dreaming by the Book, her 1999 manifesto on literature and the imagination.

And yet. There’s that tile-sized cluster of pixels on my phone. The heightened portrait there, let’s call it Woman in Corsica, makes my own present moment—real life—seem like the impoverished thing. I’m unaccountably afraid. At root the anxiety is: Who is the human here, and who the simulacrum?

The good news is that the anxiety of the uncanny is nothing new or unique to digital experience. Every single realist form, the ones that claim to hold a mirror to nature, has made beholders panic—and worse. In the fifth century BC, the Greek artist Zeuxis is said to have painted voluminous grapes that looked so much like the real thing that birds pecked themselves to death trying to eat them. Novels, which were intended to show unfiltered middle-class life in everyday prose instead of fakey verse, drove women to promiscuity by representing their feelings so exactly. And, of course, there was the famous stampede in Paris in 1896, when audiences watching an early movie, Arrival of a Train at La Ciotat, retreated to escape the train hurtling toward them from the screen.

A Snopes search for these stories turns up nothing; all of them are now considered folklore. But they’re useful. We like stories that suggest that experiences with art and entertainment we now take for granted—realist paintings, novels, the movies—once overwhelmed our ancestors. As a species, we must have learned something: how to stimulate ourselves with movies without being duped. And if we learned it then, we can learn it now. Because while the gap between the real and the replica can seem nauseatingly narrow, we do have a brilliant mechanism for telling reality from artifice. It’s literacy.

Skillful readers of novels recognize language as a symbolic order with rules that set it apart from the disorder of real life. Musicians recognize sound signals in OGG format as decent representations of the sounds produced by their tubas and vocal cords, but not the music itself. Similarly, the Instagram image of Corsica is not life itself. It’s not even Corsica. It’s software. To read novels, hear recorded music, or scroll through Instagram is not to experience the world. It’s to read it.

But we forget this, over and over. Our eyes are still adjusting to the augmented reality of everyday life mediated by texts and images on phones. The oceanic internet has grown far too fast, with the highest aspirations to realism, for anyone to have developed guidelines for reading it without getting subsumed. Our phones now interpose themselves between us and the world. As in last year’s Pokémon Go, virtual artifacts seem to embellish everything. And the enchantment they cast is potent: We will, it seems, drive off the road rather than resist the text. Literacy entails knowing when not to read.

As David Kessler has written about mental illness, thoughts, ideologies, and persistent images of past or future can “capture” a person and stall their mental freedom. If this is hard to grasp in the abstract, look at the captivating quality of sexting, doctored photos, or something as silly and fanciful as Twitter, with its birdies and secret codes. As artificial and stylized as Twitter is, the excitement there rarely feels like comic opera to its users. Encounter a troll, or a godawful doxer, and it’s not like watching a sitcom—it’s a bruising personal affront. “You’re a fool,” tweeted by @willywombat4 along with your home address, makes the face flush and the heart pound every bit as much as if a thug cornered you in a dark alley. Sometimes more.

But you don’t cool your anxiety by staying off the internet. Instead, you refine your disposition. Looking at a screen is not living. It’s a concentrated decoding operation that requires the keen, exhausting vision of a predator and not the soft focus that allows all doors of perception to swing open. At the same time, mindful readers stop reading during a doxing siege—and call the police to preempt the word being made flesh. They don’t turn quixotic and mix themselves up with their various avatars, or confuse the ritualized drama of social media with mortal conflicts on battlefields. The trick is to read technology instead of being captured by it—to maintain the whip hand.

Paradoxically, framing the internet as a text to be read, not a life to be led, tends to break, without effort, its spell. Conscious reading, after all, is a demanding ocular and mental activity that satisfies specific intellectual reward centers. And it’s also a workout; at the right time, brain sated, a reader tends to become starved for the sensory, bodily, three-dimensional experience of mortality, nature, textures, and sounds—and flees the thin gruel of text.

The key to subduing anxiety is remembering the second wave of YouTube commenters: the doubters. Keep skepticism alive. We can climb out of the uncanny valley by recognizing that the perceivable gap between reality and internet representations of reality is not small. It’s vast. Remember how the body recoils from near-perfect replicas but is comforted by impressionistic representations, like Monets and stuffed animals?

So imagine: Twitter does not resemble a real mob any more than a teddy bear resembles a grizzly. If you really go nuts and nuzzle up to a teddy, I guess you could swallow a button eye, but you’re not going to get mauled. Tell this to your poor rattled central nervous system as many times a day as you can remember. Make it your mantra, and throw away the benzos. Nothing on your phone alone can hurt you more than a teddy bear can.

Virginia Heffernan (@page88) is the author of Magic and Loss: The Internet as Art.

This article appears in the September issue.

Illustrations by Zohar Lazar. Lettering by Braulio Amado.