GLENN: Tristan Harris, the founder of Time Well Spent. He was a magician as a kid, and a Google design ethicist. He has a great blog post on Medium: How Technology Is Hijacking Your Mind.
He says, I'm an expert on how technology hijacks our psychological vulnerabilities. It's why I spent the last three years as a design ethicist at Google, caring about how to design things in a way that defends a billion people's minds from getting hijacked.
We use technology, and we focus optimistically on all the things it does. But I want to show you that it can also do the exact opposite: it can hijack and exploit your mind's weaknesses.
He's here to talk to us a little bit about the way the Russians did that and what's happening on Capitol Hill today. Tristan, welcome to the program. How are you?
TRISTAN: I'm great. Thank you, Glenn, for having me back.
GLENN: You bet.
Great conversation last time. Let's continue it. Let's start Facebook, Google, Twitter testifying on Capitol Hill today. And this makes me really nervous, for some reason. And I'm not sure exactly why.
(chuckling)
TRISTAN: Well, yeah. You know, they're testifying on Capitol Hill. And the question Americans really need to be asking them is, what was their role in enabling what's essentially been discovered to be totalizing Russian propaganda? It went from them first saying it was about $100,000 in ads, which is a very small amount of ads. Not a big deal, right? But, really, that hid the bigger picture, which is that there were 470 Facebook groups and pages that the Russians created, whose content was then shared organically -- meaning by all of us, by Americans, without even knowing it. And it ultimately reached 126 million Americans, which is about 90 percent of the US voter base that voted in the last election.
So, you know, I think the real question we have to ask is: Given that the business model of these platforms is spreading engaging information, the Russians figured out basically how to manufacture deliberately polarizing content. I mean, they created groups around veterans' rights, around immigration. They created pro-police groups, pro-Black Lives Matter groups. They created groups on both sides. And they did it because they wanted us to not be able to talk to each other. And they were able to do that with Facebook.
GLENN: So, Tristan, this is something that we've been warning about since before Donald Trump came down that escalator and said he was going to be president. We had been warning for years that the Russians are deliberately trying to infiltrate and control the conversation and split us apart as a nation.
TRISTAN: Right.
GLENN: Nobody really wanted to pay attention. Everybody denied it. And I think there are still a lot of people who will listen to what he just said and say, "Oh, yeah, big deal." No, it really was a big deal.
TRISTAN: Yeah.
GLENN: However, how do we want Google and Facebook to start controlling or deciding who gets to speak and who doesn't?
TRISTAN: Yeah. Well, this is an incredibly difficult area. Because essentially what we've created is systems that have exponential impact, right? Apparently -- as of yesterday, we found out in a Judiciary Committee hearing -- there are five million advertisers on Facebook. So what if some of them are, say, China or North Korea? Or, you know, Russia. How would we know? You can't vet five million advertisers. Right?
So we have this problem where, by creating exponential impact, one advertiser can send a message to ten people in a very specific zip code. And there's no way, given all those different ad buys happening literally 100 million times a second -- when you load a page on Facebook, you know, in that snap of your fingers, there's an instant auction, and advertisers are competing for your attention -- that Facebook can look with human beings at every single one and say, is that Russia? Is that North Korea?
So we have this real problem on our hands, where we've basically created a kind of runaway artificially intelligent system. Except instead of the Terminator, it's basically asking, given this goal: What can I show this human being that will capture their attention?
And it works really, really well. But it's not aligned with our democracy. Because what's good for capturing your individual attention is not the same as what's good for society. So it works like a paper shredder: it takes whole societies as input and spits out a shredded society, where everyone only listens to their own information, as output.
So what we really need to do is change the structure of Facebook, in terms of who is paying. Because if we're the product -- which we are; our eyeballs are sold to advertisers -- then their business model is basically to keep us addicted, so they can keep selling our eyeballs to advertisers. You know, with that arrangement, we're kind of screwed, unless we change who is paying whom.
You know, one option is to have people pay Facebook. But we're not going to be very happy about that, because we've been getting it for free. And another option is to have governments pay Facebook. But that's not -- that doesn't feel right either.
The challenge is, we find ourselves locked into a situation where, you know, we don't want to regulate free speech, but we also don't like the status quo. Because honestly -- Glenn, I really believe we can't survive when the business model is catering to each individual's attention. The most important thing for a society is that we have to be able to talk to each other, to have open minds and say, "Well, what do you believe, and what do I believe?" And Facebook basically shreds that process, because we don't even listen to the same information anymore.
GLENN: I'm also concerned that, you know, the government has pretty much stayed out of Silicon Valley for a long time, mainly because they're a bunch of dolts that don't even understand technology. I mean, I've talked to people in Washington, and their eyes glaze over, the minute you start talking about anything, I mean, at my level. And they just don't understand it. And you're like, "Oh, dear God, we're in trouble." But, you know, now they're starting to pay attention because it affects them. They see the power of -- of how it can affect people. And once the government gets involved, they will make sure that it helps them.
I mean, they have different goals. So what could -- what could Google or Facebook suggest, that would be good for the republic?
TRISTAN: Yeah. Well, I mean, you know, we have this challenge, right? We have thousands of people who go to work today at Facebook. And whatever their choices are, they are basically designing the information flows that affect 2 billion people. There are 2 billion people who use Facebook. As we said last time, that's more than the number of followers of Christianity, and 1.3 billion of those people use it every day. So when they're designing the information flows, by design, it's going to influence all of those people's thoughts, right? Because they decide, basically, whether the top of your news feed is your high school friend, or baby photos, or Donald Trump every day. Right?
And so, yeah. I mean, we have to have an honest conversation about a few things. One is, for example, bots.
What people don't realize is that, in the academic literature, they say up to 15 percent of users on Twitter are bots. Fifteen percent of its users are bots.
GLENN: Explain that for people who don't know what bots are.
TRISTAN: Yeah. Bots are basically fake accounts run by computers. When you click on a profile on Twitter -- you know, you see Glenn Beck or whatever, and it looks like it's you. It's got your photo. But you click on someone else, and it looks like they're, I don't know, an Asian-American living in Kansas or something like that. And they're actually not. It's a fake photo, and it's a fake profile. And the profile is run by a computer, which is called a bot. And the thing is that 15 percent of Facebook's -- or, excuse me, of Twitter's claimed users are actually bots.
Now, the problem is there's this ability to create manufactured consensus. So when you see, you know, someone tweet something, whether it's the president or it's someone else, you can have hundreds of thousands of accounts like it that are not people, but are actually bots. So you can manufacture the sense that certain messages are popular. You can also make conspiracies become trending. And if you make it trend, you make it true. So the reason I'm bringing this up is, one thing we can do is require total disclosure for bots. Just think of it like a Blade Runner law. I mean, if you've seen Blade Runner, which is out right now, it's all about: How do we know whether someone is a human or a bot, or a cyborg?
And what you want is that when you're on Twitter, everything that is a bot is labeled as such. I mean, why should our discourse be poisoned by bots, especially when, in this case, many of them were actually run by Russia? And Twitter has been crawling with bots. And the reason they don't shut them down is that their current stock price depends on telling Wall Street, hey, this is how many users we have. So they can't shut down all of these bots, because then their user counts drop, right?
GLENN: Holy cow.
TRISTAN: So that's why we have to have a conversation about why these companies won't really regulate themselves -- self-police themselves.
Now, I'm not a fan of regulation. I just want to make that really clear. I'm not trying to --
GLENN: Neither am I.
TRISTAN: I'm with you.
But the problem is, the status quo is also really not survivable. We need to find some way to make these companies do more. And given the fact that Facebook dug in its heels for the last, you know, six months -- why are we only finding out the day before the hearings today that 90 percent of American voters were affected by Russian propaganda?
Now, you may not believe that. But that's literally the truth from the mouth of Facebook. And they've got all the data.
GLENN: So, Tristan, I want to go back to one thing you just said. You said, "The status quo is not survivable."
TRISTAN: Yeah.
GLENN: That's not hyperbole coming from you. Can you explain?
TRISTAN: No, it's not hyperbole.
Well, you know, I think like you, I believe in free speech, and I believe in our need to be able to talk to each other and ask, "What is important for our society? And where do we want to go?" I mean, if you have kids, you want to ask, like, what do I want the world my kids are going to live in to be? Now, if we can't talk to each other, we can't make those decisions together.
And the problem with Facebook is that its business model is dividing societies -- not deliberately, but because it's more profitable to capture your attention by showing things that cater to your individual mind, right? Just your specific mind. By default, that means every person is only looking at a feed that's related to their own world.
So it's shredding society into these echo chambers where we only see our own beliefs. And I think the danger is that if we can't talk to each other, then there's violence. And I don't want to go there. But the point is, we need Facebook and these other companies -- instead of designing to shred our attention and capture it individually -- to be designing the most empowering, enhanced public square we've ever built. Because it is the new public square. It's not just a product we use. Given the scale of people who use these products, it is the public square.
Now, the question is, who is going to pay for that? And also, who is to say what the public square is? You know, do you want these young California guys at Facebook designing the public square for 2 billion people?
So it really raises huge questions about governance, about how you govern what somebody has called private superpowers. These companies don't have militaries. But they have more influence, certainly on people's daily thoughts, than any government in history that I know of.
GLENN: And Apple, at least, owns more Treasuries than most countries do. So they have more T-bills that they could dump if they wanted to get nasty as well. I mean, they're amassing enormous amounts of power.
Tristan -- go ahead.
TRISTAN: No, no. Go ahead.
GLENN: I just want to thank you for being on with us. And I hope we can continue our conversation. It's extremely important, what you're talking about and what you're doing. Thank you so much. Tristan Harris.
TRISTAN: Absolutely.
GLENN: Founder of Time Well Spent.