In this clip, Glenn shares an OUTRAGEOUS story about how Google recently banned an innocent father, deleting his account along with his documents and pictures. And this dad was SO innocent that not only did the San Francisco Police Department vouch for him, but so did the New York Times! But unfortunately, Glenn says, this story is not just about one dad. This story is about YOU. Because this kind of unjust online ban has the potential to stretch far beyond Big Tech. Glenn explains possible scenarios where something very similar could happen to YOU…but even WORSE.
Transcript
Below is a rush transcript that may contain errors
GLENN: All right. We have a really important story to share with you. It broke over the weekend. And it involves Google and a dad.
STU: Yes. So a dad in San Francisco -- this is February 2021. If you know anything about San Francisco, this was like mid-lockdown. They were still in full-fledged lockdown, right? So the dad, a stay-at-home dad, had his son. And his son is having some issues in a sensitive area, if you will: a rash of some sort, some redness, some swelling, breaking out.
GLENN: Monkeypox.
STU: Now, of course -- now, this wasn't monkeypox. This was the pre-monkeypox era. This child is having some discomfort. You're of course not allowed to go outside for some reason. So they're doing a virtual doctor's visit. While they're doing this virtual doctor's visit, the doctor requests photos to understand what's going on.
GLENN: Look, this isn't going anywhere really dark with the doctor, is it?
STU: No.
GLENN: Good. I'm just asking for the listener.
STU: Well, if you survived the monkeypox update, I think you already are here.
GLENN: Right. This one is a lot more tame.
STU: Yes. This is more tame. So he takes some photos of his child's affected area to give to the doctor. Sends the photos. The doctor recognizes what the rash is, what the issue is, sends antibiotics. Gets it knocked out immediately. Everybody is happy.
GLENN: Got it. So, I mean, I want to recap this story. It's during COVID lockdown. Dad is locked in the house. The doctor has these virtual visits. The doctor, a good guy, asks the father, a good guy, to take a picture of the sensitive areas of the son, who is a good guy, so the doctor can diagnose and give the right prescription.
STU: Which he does, and it works. Everybody is happy. Apparently not. Not according to the people over at Google, who have an algorithm running over all of his photos that are in the cloud.
GLENN: Oh.
STU: And this photo, apparently, was uploaded automatically to the cloud.
GLENN: Uh-huh.
STU: It sets off some alerts that say, this could be child porn. Now, of course --
GLENN: But it's not child porn.
STU: It's not child porn.
GLENN: Now, was Google monitoring this guy, because they suspected child porn?
STU: No. This is an automated algorithm that is scanning the photos of every single person who uses Google's cloud.
GLENN: Everyone. Uh-huh. Okay.
STU: Now, you might say there's some utility to this. If it were child porn, it would probably be really good that this was flagged. And maybe some child could have been protected from some horrible, horrible incident.
GLENN: Sure. So maybe they should have reached out to the doctor. Well, but the doctor could have been -- he was on the receiving end, so to speak.
STU: Right. But what needs to happen here? The algorithm sets off these alarms, and then it goes to a human. And the human would have to make a determination at some level. So this happens, apparently. It's deemed egregious enough for the people at Google to alert the police and shut down his entire account: shutting down his access to his email, deleting all of his photos from the beginning of his child's life all the way through, deleting all of his documents.
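To make the pipeline Stu is describing more concrete, here is a minimal, hypothetical sketch in Python. Everything in it is an illustrative assumption, not Google's actual implementation: real systems reportedly match uploads against databases of known illegal images using perceptual hashes (which survive resizing and recompression), plus ML classifiers; a plain cryptographic hash stands in for that here.

```python
# Hypothetical sketch of an automated photo-scanning pipeline.
# Names and flow are assumptions for illustration, not Google's system.

import hashlib
from dataclasses import dataclass

# Stand-in for a database of fingerprints of known illegal images.
# Real systems use perceptual hashes, not SHA-256.
KNOWN_BAD_HASHES = {"d2a84f4b8b650937ec8f73cd8be2c74a"}

@dataclass
class Photo:
    owner: str
    data: bytes

def fingerprint(photo: Photo) -> str:
    # Cryptographic hash as a placeholder for a perceptual hash.
    return hashlib.sha256(photo.data).hexdigest()

def scan(photo: Photo) -> bool:
    """Return True if the photo should be escalated to human review."""
    return fingerprint(photo) in KNOWN_BAD_HASHES

def handle_upload(photo: Photo, review_queue: list) -> None:
    # Every upload is scanned automatically. A hit does not mean guilt;
    # it only queues the photo for a human reviewer, who then decides
    # whether to report to authorities and suspend the account.
    if scan(photo):
        review_queue.append(photo)

review_queue: list[Photo] = []
handle_upload(Photo(owner="dad@example.com", data=b"..."), review_queue)
print(f"{len(review_queue)} photo(s) awaiting human review")
```

The design point the story turns on: the algorithm only escalates, and a human reviewer decides what happens next -- which is where, in this case, everything went wrong.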
GLENN: Okay. So wait a minute. Hang on just a second. If you were trying to catch somebody who was involved in child porn, the last thing you would want to do is tip them off that the authorities are on to them. So Google just -- they call the police. Then they just delete everything?
STU: At least from his access point. So he cannot access any of his stuff. Now, of course, this means he can't access the photos to prove he's innocent. Because he no longer has access to the photos that he took, that were his.
GLENN: What happened to the doctor's office that received the photos?
STU: Well, nothing at this point.
GLENN: Okay.
STU: So this goes on. He goes to Google and appeals it. And says, look, my kid was sick, the doctor asked for these photos, I sent them. They reject his appeal. Then months later, he gets a letter from the San Francisco Police Department. The San Francisco Police Department has alerted him that they have begun an investigation and have looked at all these photos. He gets in touch with them and explains to the San Francisco Police Department: hey, look, this was the situation. The Police Department sees all the evidence and agrees with him. And says, okay, obviously, no crime here. He did not commit a crime here. This was not child porn. He was sending the photos to a doctor. So now you have the dad. You have the doctor. You have the police department, all saying the same thing.
GLENN: And the boy.
STU: And the boy. This is not a crime. There is no abuse here. This seems, by all appearances -- all the evidence that we have -- to be a good dad trying to help his son through a difficult moment in his life. The only standout here is Google. So now the story escalates to the New York Times. The Times comes in and documents all of this. The Times has apparently actually looked at the photos now and has also determined this is not child porn. Right? So we're sure on this one, it seems. Every point of evidence.
GLENN: I think people at the New York Times might be able to know what child porn looks like.
STU: They may very well be able to do that.
GLENN: So he wanted -- so they've gone through all of this. The dad wanted to sue Google, because, you know -- they've shut --
STU: They shut him out. They say no. Even with the word of the police department, they still said no. So he wanted to sue Google. He realized it was too expensive; he didn't have the money to do it. So he is basically now in a constant state of trying to get them to change their mind, even with all of this. The Times contacts Google and gets a comment on the record where they say, yeah, we're not reversing it. After all of this. The police department is on the record saying, we have a copy of all of his data on a jump drive. And they are saying they want to work with the dad to get him access to all of his information back. But at this point, Google is still denying it.
GLENN: Now, imagine when Google and the United States government are in bed with each other more than they already are. Imagine the ESG aspect of this. Dad is put on a list by Google. Google shares information with the government. The government shares information with the banks. Dad does not just lose all of his pictures, all of his contacts, and his Google phone. Dad would lose all phones. Dad would lose his banking. Dad would lose absolutely everything, because he would be too much of a risk. And who do you go to? Who do you go to? The New York Times? Who do you go to, to say, hey, I need to get my name off of this list?

Now, let me add one additional thing to this. I told you last week that the World Economic Forum has said that bullying and everything else online -- disinformation, misinformation, malinformation -- is too big of a problem globally. So they are now pushing for high tech and governments to endorse a system that would look at your tweet or whatever is in question, and the algorithm would decide whether or not that is good or bad. If it's bad, it then makes a tree of everything that you do. So it goes back, and it looks at: who is influencing you? And if those people -- it deletes you. Then it goes to all of your contacts, all of the people in your social media realm, and it looks for anyone else that is spreading that information. And for each of those people, a tree is made. And they lose their access. All the way down.

This is according to -- look it up at WorldEconomicForum.com. Or .org. Would you look up which one it is? But look it up at the main page of the World Economic Forum. It was there at least last week, where they were talking about making a tree that would -- I mean, seven degrees from Kevin Bacon. If this happened with this guy, I guarantee you, it's only a matter of time before they get my name or your name. Because it trees out. And the World Economic Forum says that it's not enough to get the problem that is manifesting itself on social media. They need to see where that idea originally came from. Because they now need to silence ideas before they get into the bloodstream of the population.

If that's not terrifying, especially coupled with this, which is actually happening. And you have a chance of stopping this. But you won't have that chance to stop these kinds of things. Look at how hard it is to get your name off of a No Fly List when you're on there by mistake. Look how long it took people to get their names off of No Fly Lists. You can't even find out from the government if you're on it or off it, what the status is, or why you're on it or off it.
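To picture what the "tree" Glenn describes would look like mechanically, here is a hypothetical sketch of transitive flag propagation over a toy social graph: start from one flagged account and walk outward, flagging every connected account that also spread the content. The graph, names, and rules are invented for illustration only; this is a sketch of the concept Glenn is describing, not anything the World Economic Forum has actually specified.

```python
# Hypothetical "tree" of transitive flagging over a toy follower graph.
# All data and rules here are illustrative assumptions.

from collections import deque

# Toy graph: who is connected to whom on the platform.
GRAPH = {
    "original_poster": ["sharer_a", "sharer_b"],
    "sharer_a": ["reader_1", "reader_2"],
    "sharer_b": ["reader_3"],
    "reader_1": [], "reader_2": [], "reader_3": [],
}

def flag_tree(start: str, spread_content: set[str]) -> set[str]:
    """Breadth-first walk from the flagged account, flagging any
    connected account that also spread the flagged content."""
    flagged = {start}
    queue = deque([start])
    while queue:
        user = queue.popleft()
        for neighbor in GRAPH.get(user, []):
            if neighbor in spread_content and neighbor not in flagged:
                flagged.add(neighbor)
                queue.append(neighbor)
    return flagged

# Everyone who reposted the flagged content, per the toy data,
# loses access -- the "tree" Glenn describes, several hops out.
print(flag_tree("original_poster", {"sharer_a", "reader_2", "reader_3"}))
```

The point of the sketch is the reach: because the walk is transitive, one flagged post can propagate consequences several hops away from the original poster, which is exactly why Glenn argues it would eventually reach "my name or your name."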
STU: Yeah. In fact, one of the things that came up in this investigation: they said, well, we've also flagged a video from six months ago.
GLENN: Yes.
STU: And we thought that was problematic too. He's like, well, what video? They're like, well, we're not going to give you access to it. So he can't even defend the video that he supposedly had on his phone.
GLENN: You can't -- you have a right to face your accuser.
STU: Hmm.
GLENN: But that's only in governmental law.