Tristan Harris: Tech Is ‘Downgrading Humans.’ It’s Time to Fight Back
The creator of the “time well spent” movement disappeared for a year, but now he’s come back with a new phrase and a plan to stop technology from destroying free will, creating social anomie, and wrecking democracy.
Published in Wired by Nicholas Thompson
A YEAR AGO, you couldn’t go anywhere in Silicon Valley without being reminded in some way of Tristan Harris.
The former Googler was giving talks, appearing on podcasts, counseling Congress, sitting on panels, posing for photographers. The central argument of his evangelism—that the digital revolution had gone from expanding our minds to hijacking them—had hit the zeitgeist, and maybe even helped create it.
We were addicted to likes, retweets, and reshares, and our addiction was making us distracted and depressed. Democracy itself was faltering. Harris coined a series of phrases that became so popular they morphed into cliché. “The people behind the screen have a lot more power than the people in front of the screen,” he said, pithily explaining the power of engineers. He talked about the ability of technology to manipulate our basest instincts through “the race to the bottom of the brain stem.” The problem was “the attention economy.”
And most significantly, he popularized a three-word phrase—time well spent—that became a rallying cry for people arguing that we needed to look up from our damn phones from time to time. In February 2018, Harris founded an organization called the Center for Humane Technology. But then, oddly, he seemed to disappear.
What happened? It turns out he snuck off into seclusion to write on his walls. Today he’s rejoining the public conversation, and he’s doing it in part because he believes the old phrases—the words themselves—were tepid and insufficient. Talking about the attention economy or time well spent didn’t capture the true ability of modern technology to dismantle free will and create social anomie. The words didn’t say anything about the risks that increase as AI improves and as deepfakes proliferate.
Harris says language shapes reality, but in his estimation the language describing the real impact of technology wasn’t sufficient to illustrate the ever-darkening storms. So, months ago, he draped his office with white sheets of paper and began attacking them with dark markers.
He jotted phrases, made doodles, put things in all caps. He was looking for the right combination of words, a conceptual framework that could help reverse the trends tearing society apart. “There’s this sort of cacophony of grievances and scandals in the tech industry,” he says. “But there’s no coherent agenda about what specifically is wrong that we’re agreeing on, and what specifically we need to do, and what specifically we want.”
His brainstorming was almost manic: part Don Draper, part Carrie Mathison, and part John Nash as portrayed by Russell Crowe. He and his colleague Aza Raskin went down to the Esalen Institute in Big Sur, California, and covered the walls of their room with paper.
[…]
As he struggled with the words, he had a few eureka moments. One was when he realized that the danger for humans isn’t when technology surpasses our strengths, like when machines powered by AI can make creative decisions and write symphonies better than Beethoven. The danger point is when computers can overpower our weaknesses—when algorithms can sense our emotional vulnerabilities and exploit them for profit.
Another breakthrough came in a meeting when he blurted out, “There’s Hurricane Cambridge Analytica, Hurricane Fake News, and there’s Hurricane Tech Addiction. And no one’s asking the question ‘Why are we getting all these hurricanes?’” He hadn’t thought of the problem that way until he said it, so he made a note. He fixated on E. O. Wilson’s concise observation that humans have “paleolithic emotions, medieval institutions, and god-like technology.”
[…]
Finally, in February, he got it. He and Raskin had been spending time with someone whom Harris won’t identify, except to note that the mysterious friend consulted on the famous “story of stuff” video. In any case, the three of them were brainstorming, kicking around the concept of downgrading. “It feels like a downgrading of humans, a downgrading of humanity,” he remembers them saying, “a downgrading of our relationships, a downgrading of our attention, a downgrading of democracy, a downgrading of our sense of decency.”
[…]
The phrase he found after months of thought: human downgrading.
Google, Harris realizes, has unparalleled power over a vast swath of humanity, and it makes him question whether all that power is going into making people’s lives richer. So he writes a manifesto in February 2013—“A Call to Minimize Distraction & Respect Users’ Attention”—that he sends to 10 friends.
Those friends send it to their own 10 friends, who then send it to their friends and so on. It’s Harris’ first experience with the delightfully recursive phenomenon of going viral by attacking virality. Google gives Harris the slightly awkward title of “design ethicist” but doesn’t actually change the onslaught of notifications it sends to users.
[…]
Steve Jobs talked about technology as a “bicycle for the mind.” Harris’ argument last year was that the bicycle was now taking us places we didn’t want to go. His argument this year is that the tires are flat, the handlebars are broken, and there’s a truck coming right at us. And that’s why he decided to try to come up with something new for act four.
[…]
His actual choice—starting an organization called the Center for Humane Technology and then obsessing over language—seems, in one way, remarkably unambitious. If you really believe that the most powerful companies in the world are destroying the human species, shouldn’t you counterattack with more than a Sharpie and a thesaurus?
Harris doesn’t buy that argument. For one thing, he’s good at language, so why shouldn’t he focus on it? More importantly, he believes in its power. Pressed on this point, he recalls a moment back at Stanford. “We studied the power of language semiotics and Alfred Korzybski and people like him,” he says. “And they have this concept that something doesn’t exist until there’s a language symbol for it. I used to think of that as a kind of a poetic thing. But I’m really convinced now that language actually does create things, and it creates momentum and pressure. That’s why we’re focused on it.”
He believes this phenomenon was at play over the past two years. Yes, the world began to question Silicon Valley in part because of the election of Donald Trump and in part because our addiction to our devices seemed so obvious. But the sense of alarm also spiked because people had language that allowed them to name the icky feelings they had about what their phones were doing to them and to society. They had language that helped them center their thoughts and thus their critique.
[…]
“It’s not like I want to get up and think about words,” he adds. “I want to get up and actually see these problems go away. But if we don’t have the language as a lever it’s not going to happen.” And so, with that, Tristan Harris is back.