We Need to Have an Honest Talk About Our Data

VR pioneer Jaron Lanier talks with WIRED about why the original architecture of the internet forced us into a kind of information trickery, and how we can fix it—to everyone’s benefit.

More than two decades ago, WIRED ran its first profile of VR pioneer and author Jaron Lanier. We wrote "Yea, though he has walked through the valley of silicon, he fears no evil. His music and his software comfort him, and having survived reasonably intact he can only revel in the exquisite wonder of it all." Since then, Lanier has been a fierce critic of Silicon Valley and of where technology has gone, and it always comes back to music and spirituality. We sat down with him at our WIRED 25 festival in October.

(This interview has been condensed and edited.)

Nicholas Thompson: Jaron, for you it's not about product, it's not about efficiency, it's never about money. It always comes down to music and spirituality. And reading some of your recent writings, it seems like you're actually a little worried about what technology has done to our spiritual health as humans. So why don't we start with your critique of social media.

Jaron Lanier: To me, criticism and optimism are the same thing. When you criticize things, it's because you think they can be improved. It's the complacent person or the fanatic who's the true pessimist, because they feel they already have the answer. It's the people who think that things are open-ended, that things can still be changed through thought, through creativity; those are the true optimists. So I worry, sure, but it's optimistic worry.

Also, to me, a sense of the world being open-ended is absolutely core to being a good scientist, a good technologist, a good writer, a good artist, or just a good human being. We're surrounded by a sea of mystery.

I used to imagine this tightrope that you have to walk: On one side you fall into a kind of excessive, nerd-supremacy reductionism, and then everything becomes kind of meaningless, because you've made yourself blind with some abstraction that you think explains everything. And on the other side is superstition, where you start to say, "Well, we don't really understand how quantum field theory and general relativity connect, so it must mean that my mind can talk to plants." And it’s about finding this point in between. There is mystery, and the way to address that mystery is with rigor. It's with self-doubt, with intellectual modesty, where you don't assume narratives that are really beyond your reach. But at the same time, you believe in a destination and a quest for meaning that's totally beyond your reach. And you quest for it incrementally. That tightrope, I think, is where technology can improve. It's where beauty can happen and where relationships can be real.

NT: Well, let's talk about the technology part of walking that tightrope, and particularly about social media platforms. What is the role that they should play in keeping society at the right point as we progress on the tightrope?

JL: That’s an interesting way to phrase it. It’s a top-down assumption that they have a role in keeping society a certain way.

NT: Well, they do. They have a role in influencing where we are.

JL: I've always believed that people connecting using information technology can and should be beautiful and even essential. It’s actually a matter of survival, because we couldn't even understand what the climate's doing without connected devices on an internet.

So it's not even a question that the internet is something we need. When we use the term "social media," what we tend to mean is these giant platforms that have effectively taken over the internet for almost everybody, almost all the time. And they do so using this weird business model where, any time two people connect, it's financed by a third person whose only motive is to manipulate those two in a sneaky way. So this whole architecture is on every level based on sneakiness and manipulation, often using weird behaviorist, hypnotic, unacknowledged techniques to get people more and more engaged or addicted and persuaded, or to get them into compulsive behavior patterns that aren't necessarily in their own interest. That's the thing I criticize.

NT: Was it inevitable that social media systems would end up the way they did?

JL: Not at all. In fact the earliest ones were different. They weren't necessarily perfect, but they were certainly better. I think we made a series of mistakes. And the mistakes weren't driven by a lack of consideration, but rather by a firm ideology that happens to have backfired. So for instance, there was this very strong culture in the 1980s and '90s demanding that everything be free.

But the problem is, at the same time there was also this worship of Silicon Valley entrepreneurship. As Steve Jobs put it, "You're denting the universe."

It's this Nietzschean contract with the future where these magical, special, elevated people can change the course of events through their brilliance. And if you want to have hero entrepreneurs, and also everything is supposed to be free, there aren't too many ways to reconcile those. So you land in this "Finance through third parties who are sneaky" place.

So it was a case of two ideologies that each, by themselves, made sense, but combined created this third outcome that was horrible.

NT: If Facebook, instead of being ad-supported, had had a business model based on subscriptions or sales or commerce or individual payments, would it be harder for the Russians to hack? Would it have truly evolved in a different way? Would Twitter have evolved in different ways?

JL: Absolutely. Economic incentives are ultimately the most powerful elements in any system that has a market.

NT: What if Sheryl Sandberg were to wake up and say, “You know what? Enough ads. Let's make it payments and subscriptions.” What happens?

JL: Sheryl doesn't have the power. Someone else has to wake up.

NT: Sheryl and Mark, they both wake up one morning …

JL: Now we're talking. I think you can change it. And in fact, because I am an optimist, I'm convinced we will. I don't know exactly how soon, and I don't think it'll be totally smooth, but I think once it's done, shareholders will be happy. Everybody will be happier. Vladimir Putin might not be happy. That's OK.

An example that can serve as inspiration is Netflix. At first the business was "We'll send you discs by mail.” A very common worry about that idea at the time was that you could just get all the streaming content you wanted for free. Which was true, but the response to that is, well, (A) we can make an overall experience that's still worth the money, because it'll just be easier, less hassle or less risk. And (B) we can expand our value proposition so it's actually saving you money versus cable. So if you look at Netflix’s ability to start a subscription business, I think it gives you hope that business models can change, and if people are used to free things, they actually can be persuaded that a paid model makes the world better.

Every day there's more sort of radiation from Facebook that they're about to announce some sort of thing in this space, so let's wait and see what they say. I hope it's creative and bold.

I think people who are on Facebook need to be able to earn money directly through it. It can't just be "Give us money." It also has to be "You can earn money."

NT: What are the other choices in the architecture that put it in the wrong direction?

JL: From the '80s into the '90s, for those who don't remember, we had the packet-switch idea that's at the core of the internet. It predated what we call the internet, but it was a bunch of incompatible different packet-switch networks. People were persuaded through government bribery, basically, which was put together by a senator named Al Gore, to become interoperable. And out of that we got the internet.

The original idea was to make the internet just super bare bones. So the initial internet had no representation of people. There was no membership concept. There was no identity concept. There was no sense of authentication. There was certainly no implementation of commerce solutions. There was nothing. It was just very, very raw. And in that spirit of keeping everything as minimal as possible, the web protocol committed a primal sin of not having backlinks.

Something could point at something else to get at that thing's data. But the thing that was pointed at didn't know it was being pointed at, and that created this web where there was no provenance for data. No way of knowing what was real. No way of knowing where it had come from, and therefore no way for people to accumulate personal achievement.
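Lanier's point is easier to see in miniature. Here is an illustrative sketch, in Python, of the web's one-way links versus a hypothetical two-way design; the `Page` class and its URLs are invented for this example and are not part of any real web protocol.

```python
# Illustrative sketch only: contrasting one-way web links with a
# hypothetical two-way design. Names (Page, link_to) are invented.

class Page:
    def __init__(self, url):
        self.url = url
        self.links = []      # outgoing links: what the web actually records
        self.backlinks = []  # incoming links: what the web protocol left out

    def link_to(self, other):
        self.links.append(other)
        # In a two-way design, the pointed-at page learns who points at it,
        # which is what gives data its provenance.
        other.backlinks.append(self)

essay = Page("https://example.org/essay")
commentary = Page("https://example.org/commentary")
commentary.link_to(essay)

# With backlinks, the essay knows where attention is coming from:
print([p.url for p in essay.backlinks])  # ['https://example.org/commentary']
```

Because the real web only stores the outgoing half of this structure, reconstructing the incoming half became a business: that reconstruction is, as Lanier notes next, essentially what Google's original crawler did.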

I was part of this early community. I was a chief scientist at Internet2 for a while, which was the academic consortium that figured out how to scale this thing in the '90s. And we talked about it—we knew that we were making gifts of hundreds of billions of dollars to persons unknown to fill these missing gaps.

Turns out Google filled in the backlinks. That's essentially Google's core function. Or it was at the start. Who's going to create these accounts? Well, initially, firms like MySpace, but ultimately Facebook. And so all the things we left out deliberately turned into these giant monopolistic companies.

NT: One of the great problems in the internet today is that you don't own your data. My data stays in a Facebook server, and I can access it as I travel around the web, but it should actually stay with me as I travel around the web. How do you architect an internet from the beginning so that the data stays with the person?

JL: The architectural problem of keeping your own data is a solved problem. Tim Berners-Lee's new thing is called Solid, and it does just that. We should have done it before. The technology is not a mystery for how to do that. But … can I share something about the economics?

NT: Please!

JL: Take language translators. So for years and years my mentor, Marvin Minsky, tried to figure out a way to translate between natural languages like English and Spanish, and it never worked. Then in the '90s, some researchers at IBM figured out you could do it with big data, by having massive statistical correlations with pre-existing corpora that had been translated.

So then companies like Google and Microsoft started offering free services, and that has had the effect of reducing the employment prospects for professional translators to a tenth of what they were.

If you look at this on the surface you might say, well, “Too bad. They're buggy whips. Their economic niche has been made obsolete by automation.” Except if you look a little deeper, you discover that language changes every day. There are new public events, new pop culture, new memes, new slang, and so we have to scrape or steal from these people tens of millions of new phrase translations every day just to keep the translator current.
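The mechanism Lanier describes can be sketched as a toy phrase table: a lookup built from previously translated pairs. All phrases below are invented examples; real statistical systems learn millions of aligned phrase pairs from large corpora, but the dependence on fresh human-translated data is the same.

```python
# Toy sketch of corpus-driven translation: a phrase table built from
# pairs that humans have already translated. Phrases here are invented.

phrase_table = {
    "good morning": "buenos días",
    "thank you": "gracias",
}

def translate(phrase, table):
    """Look up a phrase; return None if the corpus has never seen it."""
    return table.get(phrase.lower())

print(translate("Thank you", phrase_table))  # gracias
print(translate("on fleek", phrase_table))   # None: new slang, no data yet

# Keeping the translator current means constantly adding fresh pairs,
# which in practice come from working human translators.
phrase_table["on fleek"] = "en su punto"  # hypothetical new entry
```

The last line is the hidden labor Lanier is pointing at: the "automated" translator only stays current because someone, somewhere, keeps supplying entries like it.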

So from one side of our mouth we're telling them, “You're obsolete. You don't get paid, the robot is doing your job.” But through the other side of our mouth we're saying, “Oh, but we sure better be able to steal data from you in order to create that illusion.” And it's just fundamentally dishonest and twisted.

And this becomes crucial, because another of the big questions in tech is whether robots will put people out of work, and whether we need to all go on universal basic income. But in this case, if we could just be more honest about the provenance of data and the way things work, we could transition people to new jobs in the data space instead of telling them they're obsolete.

NT: So you believe there are jobs we're getting rid of, because of religious devotion to data, that would actually make the data better if we kept them.

JL: Well it's like a phase transition right now—a paradigm shift. Except it’s a paradigm shaft, where we have this kind of fake situation. We can't tell people we need their data, so we have to trick them into giving it to us. But it would make much more sense to just tell them, “Hey, this is the data we need. We love you. We'd like you to thrive.”

If we could actually be honest about what data we need and how data is used, where it comes from, we could actually offer people more dignity and have an expanded economy and better-working technology. But the transition is hard because we're so ingrained in this sort of fallacy.

NT: Speaking of free, what about open source?

JL: What we basically did by making code free is, we made data into the super power center. So now we're in this bizarre situation where companies like Facebook or Google have Apache stacks of open source code, but it's hidden away in secret data centers, with all your data, running algorithms that run the world, and they're hyper-secret.

And then if you look at the open source community, what tends to happen when you make everything free is not that you impoverish everyone, but that you take what had been a bell curve and turn it into a Zipf curve.

If you have an open-market society, you should see results that are kind of like a bell curve, where most people end up with middle results, and there are a few people who are super-high performers and a few people who fall on the other end. But when you have control from a central hub that seized control in the way that a Facebook or Google has, you end up with a Zipf curve. So a few open source developers end up doing pretty well through consulting contracts or whatever. And then if you look at what might be called the long tail, you see a lot of kind of impoverished people who contributed fundamental code that keeps the internet running every day. So it has created this absolutely untenable extreme of reward and lack of reward in society. I mean, totally aside from whether it's fair, it's just not sustainable.
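The bell-versus-Zipf contrast can be made concrete with a quick simulation. This is a minimal numerical sketch; the population size and the 1/rank reward rule are arbitrary choices for illustration, not data about real developer incomes.

```python
# Rough sketch of the two reward distributions Lanier contrasts.
# Parameters (n, mean, the 1/rank rule) are arbitrary illustrations.
import random
import statistics

random.seed(0)
n = 10_000

# Bell curve: an open market where most people land near the middle.
bell = [random.gauss(100, 15) for _ in range(n)]
print(round(statistics.median(bell)))  # the median sits near the mean of 100

# Zipf curve: reward proportional to 1/rank, so a few winners dominate.
zipf = [1.0 / rank for rank in range(1, n + 1)]
top_share = sum(zipf[:10]) / sum(zipf)
print(f"top 10 of {n:,} take {top_share:.0%} of the total")  # roughly 30%
```

In the bell-curve world, the typical participant's outcome is close to the average; in the Zipf world, ten people out of ten thousand capture nearly a third of everything, which is Lanier's "untenable extreme of reward and lack of reward."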

NT: I want to end on something beautiful. So what does it mean to improvise in virtual reality?

JL: One of my thoughts about it is that someday there might be a way to share through virtual reality that transcends communication as we know it. That is no longer about sharing symbols, as we do with words and language, but about directly co-improvising a shared world, directly making stuff that's experienced without necessarily predefining a symbolic context for those things.

You can think of the cortex of the brain as being like a planet with undiscovered continents. A huge part, which is the motor cortex, kind of runs along the middle from front to back where a Mohawk would go. And there's a thing called the homunculus, which is a mapping of the body to it. We know that if people explore abstract computation through that, they have powers of speed that they don't have through other modalities; a jazz pianist figures out what notes to play and solves difficult harmonic problems spontaneously much faster than they can any other way. And so part of the idea was to try to leverage this underused part of the brain for creative purposes by creating musical instruments within virtuality, with which you could improvise and create this sort of shared world. It's not a dream I've given up on. To this day I'm chasing it.

