Hi, Katie. So, how would you describe your relationship with your smartphone?

Um, dependent? I've got a problem. It's bad. I would use the word obsessive to describe me.

Probably not as much as you, but me too. So that's the focus of today's show: technology, specifically our growing dependence on it, or should I say our addiction to it. Now, we asked all of you to write in and call us with tales from the tech trenches, and you had a lot to say. Angie Loved Joy on Twitter wrote, "I have fibromyalgia and know I've been overusing my phone when my thumb on the phone-holding hand starts to hurt. Even through the pain, I sometimes soldier through like it's my life's work to check social media and scroll through the news."

Ouch. I feel your pain. Well, I don't exactly feel your pain, Angie, but I understand where you're coming from.

And Shara Dupuis, which is a beautiful last name, told us on Twitter about the moment she knew she had a problem with tech, writing, "I realized it when my children were having to ask me questions multiple times because I was engrossed in my phone."

You know, we've all been there.

Yes. Nalini Sang on Twitter said, "I give myself tests to see how long I will go without looking at my phone, and I fail so quickly. I noticed it when a friend's dad forced me to turn off my phone during a dinner. We have a sickness, as I type this on my phone."

Listen, I agree with all of the tweets that we got about this subject. We are addicted, and we are unfortunately ignoring our children at times, ignoring our spouses or our friends. There's an actual term, Brian, for it. It's called phubbing, P-H-U-B-B-I-N-G. That means blowing someone off to look at your telephone. And it's so aggravating, and I'm sure it is for other people, because I'm sure I do it all the time.

Never.

But you know, it's maybe not entirely our fault, because in part our phones and these tech platforms are designed to addict us, which we talk about in today's show.

That's right. In an attention economy, that's definitely true. We also got this very interesting voicemail from a listener named Steve, who tried to return to a flip phone.

"Hi, my name is Steve Glenn. I'm an airline pilot, and for eight months last year I went back to a flip phone. I decided I was spending too much time on my smartphone, and so I got a Kyocera flip phone like they use in the military, and I really enjoyed unplugging a bit and not being as accessible all the time. But I finally had to go back and buy an iPhone, because as an airline captain I had to access the Internet and access company websites using my smartphone. And also, texting is going to a different type of technology, where people send pictures and people send files with texts, and you just can't do that with flip phones anymore. So my experiment was a failure. I had to go back to an iPhone. I try and limit my use of it more than I used to. But I thought you'd like to hear that. Thank you very much."

Well, Steve Glenn, a noble effort, we must say, going back to a flip phone. How Luddite-ish of you. But you can see, Brian, why that would be really hard, because so much of our day-to-day activity and interactions require, uh, you know, a smartphone and not a flip phone.

There's even an app, Katie, called Moment, which I've had for a very long time, and it tells you not only how much time you're spending on your phone, but what you're actually doing on your phone. So you kind of lie to yourself and you think, oh, I'm checking my email, and you're really on Instagram and Twitter, and then you get sucked into a vortex of crap.

Yeah, and self-loathing for being sucked into a vortex of crap.

But anyway, it's a vicious cycle.
Clearly, technology is eating up our time and attention and changing our way of life. Loyal listeners, you may recall when we talked with psychologist Jean Twenge about the costs and consequences of endless screen time, especially for kids and teens. I encourage you all to listen to that episode if you haven't; it's number thirty-six. But for this episode, we wanted to get into these topics from the perspective of a tech insider, and I can't think of many people better on this topic, Katie, than today's guest, Tristan Harris. In his twenties, he sold his tech company to Google and then worked there as an in-house design ethicist, of all things, which I think many people think is probably an oxymoron when it comes to big tech companies. But The Atlantic called Tristan the closest thing Silicon Valley has to a conscience.

His latest project is something called the Center for Humane Technology, which he started in February, bringing together many former insiders in the tech world who believe we need to design technology in a different, more humane way. To give you an idea, here's how the Center on its website refers to Snapchat, Instagram, Facebook, and YouTube. They say these are not neutral products; they are part of a system designed to addict us.

Now, I originally spoke with Tristan for my Nat Geo documentary series, available, gosh, on demand and, ironically, on YouTube, Facebook, and Hulu, but I wanted to continue our conversation. He's such a smart guy and an excellent communicator. So for today's episode, we talked with Tristan about how tech hooks us and what ethical design means to him. And to start things off, I asked him how he landed at Google and why he left.
Well, um, I was a tech entrepreneur when I was twenty-two, at Stanford. I started a small tech company, and, um, it's a long story, but after about five or six years we soft-landed the company at Google; we were acquired. And about a year into being at Google, I became kind of disenchanted with where things in the tech industry were heading. Instead of really building tools that were empowering people, it was more and more becoming this race between these different companies to get people's attention and exploit people's psychology. And I felt alarmed by this. I was working with the Gmail team, and Gmail has its own problems with people feeling addicted. And I made this, you know, this presentation about Google's moral responsibility in shaping a billion people's attention. It was sort of a slide deck, and it exploded. I mean, it went throughout the company; tens of thousands of people saw it. And, um, that led to becoming a design ethicist, where I basically was asking the question, which no one had asked before: you know, how do you ethically steer a billion people's attention?

We should pause you for a second, because you mentioned this slide show. You designed this hundred-and-forty-four-page Google Slides presentation, which was called "A Call to Minimize Distraction and Respect Users' Attention." Yeah, and that was in twenty thirteen; I mean, this is a long time ago that these concerns first rose up. And you did this as a Google employee, correct, intending to sort of share it within your team. But then I think Larry Page saw it, and other senior leaders within the company. And what was the reaction to the concerns that you were raising?

Yeah, I can't imagine they were super jiggy about that.

Well, you know, um, I was nervous about that presentation, because, I mean, I was ready at the time to leave the company. I kind of felt like this was the thing that was most concerning to me.
But before I left, I wanted to raise alarms about this issue. And I had a whole background, prior to this, in how people's minds are manipulated and influenced. When I was a kid, I was a magician. I studied at this persuasive technology lab, so I understood that technology could really manipulate people's minds, and that's why I made this presentation. So I sent it to ten people and said, hey, I want your feedback. It was just a slide deck. Hey, give me your feedback. I went home, I came back the next morning, I checked my email, and I had, like, you know, a hundred emails about this presentation. And I clicked on, you know, their links, and when you click on the link, it shows you, um, inside of Google Slides, the number of people who are looking at it at the same time, and there were something like a hundred and fifty people that morning. When I looked at it later that day, there were four hundred. And just through the next, you know, week, it was just exploding. And I had heard that Larry Page had two or three meetings that day where people brought it up in conversation with him. And so it became this kind of momentum. I don't want to overstate its influence; I mean, I don't think it changed the course of anything, but it definitely raised alarms and people started to talk about it.

But you wrote something that I thought was pretty smart and prescient, which was: never before in history have the decisions of a handful of designers, mostly men, white, living in San Francisco, aged thirty-five, working at three companies, that is, Google, Apple, and Facebook, had so much impact on how millions of people around the world spend their attention. We should feel an enormous responsibility to get this right. True now more than ever. Um, and so what did Google do in response to this, besides change your job?

Um, not much. I mean, I was given this space to kind of explore these topics, and I didn't get fired. I really just focused on understanding what it meant to hold that responsibility. It's not like we've ever had this before; there's no academic discipline, there's no university that could teach you how to handle the manipulation of a billion people's attention. That never existed before. So this was a new field, and so I did mostly a lot of research. Um, and in terms of things changing at Google, I tried bringing this up with some of the key products that I thought would be most important to change: Android, because that's the home screen and the phone notifications that billions of people live by; Chrome, the web browser people spend most of their time in; and Gmail. But it was really hard to get a concerted focus on saying, this is an enormously important topic, this is literally everything, and we need a whole different way of thinking about this. And I was unsuccessful at getting those teams to change inside Google.

Well, you must have been considered such a massive disruptor, Tristan. I mean, here you are saying, hey, hold the phone, everybody, in these companies that are growing like weeds, and obviously the concerns that you were raising were antithetical to their business model, right?

You know, it's funny, Katie, because there was never an explicit response of, we can't do that, you're asking us to make less money. No, I never got that response. There would just be kind of the smiling and nodding, but then no traction, no momentum. Right? So I'd talk to, you know, the Android team and say, well, what if we designed it this way to help people check their phone less, and there'd be this kind of smiling and nodding and, yeah, maybe we could do that, and there are some teams that are kind of working on that, but then nothing much would really happen. There was no concerted effort.
And this is why ultimately I did leave, because I realized that there needed to be a much bigger public conversation and public demand for this stuff. But I will say there's a difference with Android, you know, which is how your mobile phone works; that doesn't need to maximize how much time you spend on the phone. It's like, no one goes to work at Android saying, gosh, how do we just steal everyone's time? No one says that. YouTube, on the other hand, that is their goal. I mean, that is the business model: more time watching videos equals more money for YouTube.

That means more attention to the advertisements, right?

Exactly. That means more attention to the advertising. Advertising is the driving business model behind this addiction to stealing people's attention and time.

So, Tristan, let's just step back for a second and talk about how and why these tech companies addict us. It's gotten to the point where we check our phones more than a hundred and fifty times per day. Knowledge workers on average spend a third of their day just doing email. Um, as you said, it's sort of central to the business model of these tech platforms. So can you talk to us about, you know, how and why this happened?

Yeah, well, you know, it starts by, you know, someone building an app and saying, I've got to get you to use it. So what they want is they want people, you know, to register accounts. They want people to create new accounts and new users to show up, and then they want each of those accounts to come back every single day. They want you to be hooked, so they have to start finding reasons for how can I get you to come back tomorrow. So if you're Instagram, in the early days of Instagram (I know the guys who made it, they went to school with me, Mike and Kevin), you're thinking, okay, how do we keep people coming back to this photo-sharing app? And they didn't always have this feature called the number of followers you have. Right? Why would they add that? We think that that's just natural, that's just the world we live in. But the number of followers we have: showing that number to you is a good way to get you to come back tomorrow, because you want to know if that number went up. And you also want to know how many likes you get on each photo you post, and that's another reason to get you to come back.

And so think of these as, like, biological organisms sitting on a table that are mutating these new limbs, which are just things that are good at getting you to come back and to stay longer. And if that thing works, it keeps that limb and it continues to evolve, and so it's evolving all of these new ways: likes, um, messages, shares, filters. These are all ways to keep you coming back and hooked. But what I think we miss is these cultural externalities. This entire selfie culture, where, you know, teenagers and a lot of women take these selfies over and over again of basically their appearance. Let's say that Instagram didn't have a feature called the number of followers you have; let's say they never went down that road. Would we have an entire culture where everyone's taking photos of themselves and posting them on social media if we didn't have the notion of followers for ourselves? I mean, these design techniques are all being done because they're good at hooking people, but downstream they create these cultural externalities. You have people being more concerned, their self-worth is directly tied to how many likes they got on that last photo. And, um, I think we're confusing a whole generation of children into attaching their self-worth and belonging to the wrong place.

Not just children, Tristan. I mean, I am always checking my likes on Instagram and my followers, and has it gone up?
And, you know, what comments have I gotten? On the one hand, I kind of appreciate the community that Instagram allows you to have. You do feel like you're kind of relating to people who have similar interests, and I appreciate hearing from people. But the flip side of that, for me, is this kind of hunger and almost frantic feeling that, oh, I need to get more followers, I need to make sure they like what I'm posting. And I'm embarrassed to admit that. But I'm a sixty-one-year-old woman. Think about the impact this has on teenagers who are still developing their self-esteem and sense of worth.

Yeah. I think of this a lot like sugar. I mean, think of our human evolutionary instincts. Sugar tastes good to all human animals; we're built to love sugar. Right? There's a reason: it used to be really rare. It's just like social approval. I mean, there's a reason that it should feel good to get those likes and to get that social validation and approval, but we weren't built to get it at this level of frequency, dripping into our mind every, you know, five, ten minutes with a new batch of notifications on our phone.

Speaking of that, two things really struck me from Katie's Nat Geo hour on tech addiction, which is the physiological reaction to this stimulus: the stress hormone cortisol, which we're getting sort of overdosed on because we feel nervous about our texts and our emails and checking them, or not checking them. And then dopamine, which is this, um, pleasure neurotransmitter (neurotransmitter, you know this, I don't, um) that kind of gives us a high when we're getting social validation on these apps. And we're not meant, or we weren't designed, to receive these two things at the level we are, with the frequency we are.

That's right. And I think that's how we can think about how we fix this, which is to say, we need to turn the lens back at ourselves and say, what were we built for? You know, how are all of our evolutionary instincts tuned, and how do we respect those instincts? So a good example, a simple one that relates to phones, is trigger colors. Red is a trigger color. So every single time you look at your phone and you see that red dot with the number of notifications, it's triggering you into a little bit of a kind of alarm, or sort of, you know, grabbing your attention, like maybe there's something important there that I have to check into. Right? But do we want to be sounding the alarm inside of our minds every single time we check our phone with notifications? In the same way, social validation or social approval are really important things to care about. It was useful back on the savannah to know, what do my peers think of me? Otherwise, how else are you going to survive in the community or the tribe? But we weren't meant for a virtual community of tens of thousands of people around the world to be dosing us with little bits of social approval and dopamine every ten minutes. And so I think we have to ask, how do we go back to being in alignment with how our human evolutionary instincts work?

Before we even talk about that, Tristan, I want to dive in a little deeper on some of the techniques that we can be aware of, and ways we're being manipulated. You mentioned the red lettering for alerts. Can you help us understand other ways they do it, so we can be aware that we're being manipulated?

Oh man, there are just so many.

One thing that really struck me in your writing was the bottomless bowl. Can you explain what that is?

Sure. Um, you know, so, the bottomless bowl.
There's this study where you give people a bowl of soup. You know, we think we're in control and we choose how much food we eat. And you give a bunch of people sitting down at the table a set of bowls of soup, and some of the bowls of soup, um, are just regular, all the same size, but some of the other bowls of soup have a little, um, pipe at the bottom that's actually refilling the bowl of soup as you're drinking it. And the question was, would the people who are drinking from the bottomless bowl notice and stop eating? And the study basically showed that people don't notice the bottomless bowl. And that is how our technology works. The Instagram feed and the Facebook feed and the Twitter feed all scroll infinitely, right? They could choose to not do that. They could have stopping cues. Your mind is built for something called a stopping cue, where basically we expect that there's going to be a cue that says, okay, now this is done, and your mind kind of wakes up and figures out, what do I want to do next? But if I'm trying to hook your attention and keep you going for as long as possible, my job is to figure out how I can remove all of those stopping cues so I can keep you sucked in as long as possible.

What other things, other than the bottomless bowl? Uh, I was interested that President Trump's digital campaign advisor was able to plant these ads (and of course we're going to get into the political uses of these big tech companies in a minute), but that they could measure whether red or green or blue, the different backgrounds, and the kind of words seemed to be attracting more people, the design that would suck people in or kind of agitate them, and they were able to figure out what was the most effective way of, in essence, trashing Hillary Clinton.

Yeah. I mean, all of these systems are basically asking one question, which is, you know, how do we get your attention? And the best way to get your attention is what works on your evolutionary instincts. Does red work better on your evolutionary instincts, or on your mind? Does blue work better? Does this word work better for you? If I use this political message, does that work better for you? We've veered so far away from any authentic relationship between a person trying to just talk to you versus a person sitting there scanning all of the things you ever said on social media and trying to figure out, you know, whenever you talk about a concept like immigration, you always use these three adjectives, and let's say they're positive or they're negative. Now, if I'm a political advertiser, what do I want to do to manipulate you? Well, I want to use your own words back to you so you'll most agree with me. So I'm going to start repeating your own viewpoints, your own opinions, back at you, with your exact word choices, so you nod your head in agreement, like, man, that person really understands me.

And what we've created with social media and Facebook goes beyond the addiction layer. The addiction just sets up the kind of matrix everyone's jacked into, so now, the moment they wake up in the morning, their thoughts are being sort of influenced by these phones. The second layer is that we've sold that to the highest bidder because of advertising, and we've enabled anyone to go in there and say, I can target the precise messages that will resonate with your specific mind. And that's exactly what Cambridge Analytica was in the last election. They were trying to sell campaigns on the ability to specifically persuade people using the things that would be most persuasive per mind. So they'd have a profile that says your mind is influenced, um, you know, more by authority.
If I tell you that The New York Times said this was true, then you're really likely to believe it. Or if I told you that Fox News said this was true, then you're really likely to believe it. These are all different ways of influencing people. You know, I think we have to realize that all of our minds can be influenced. That's what I learned as a magician. It doesn't matter how smart you are, it doesn't matter what language you speak; magic and sleight of hand work on every single mind. And what we've just enabled is this, you know, arms race for anybody to go in there and do what they want.

It's time to take a quick break. We'll be back with Tristan Harris right after this.

And now, back to our conversation with Tristan Harris. Can you talk about how tech in general and social media in particular have had a corrosive impact on people's perceptions of the news and the issues, and basically all of these things we've spoken about since Donald Trump became a candidate?

Yeah. Well, oftentimes people think that with these feeds, you know, what shows up inside of your feed is just whatever your friends post, so it's just a neutral tool. But this is not true. When you open up that Facebook feed, there are actually thousands and thousands of things they could show you, and out of those thousands of things, they try to figure out what will be most likely to get you to click, to watch, or to share, or to like. And it turns out that outrage is really good at getting you to click and to share, because you want to tell other people, I can't believe the thing that these guys, these politicians, did, you know, today. And so those things enter at the top of everyone's feed. So now everybody's going to these feeds, and computers, not humans, are selecting, out of all the thousands of things, the outrageous things. And what that creates are these waves of outrage inside of all these human animals.
We have these, you know, human evolutionary instincts, and you turn your phone over in the morning, and instead of feeling calm and opening your eyes and taking a breath and asking, like, what are my dreams, or what are my hopes for today, or what am I grateful for, what do I want to do?

What am I going to have for breakfast?

What am I going to have for breakfast, you know? Instead, the first thing you do when you turn off your alarm, after your phone vibrates, is you open up one of these feeds, and suddenly your mind is, like, filled with outrage.

Screw those Democrats, you wake up to.

Exactly. And we have to ask, I mean, it's this totalizing thing, on both sides. That's the thing: everyone is influenced by this, which is why it should be a unifying issue, right? I think that no one wants this to happen to politics. I mean, once you sort of see where this is all going, we don't want to live in a society that's just triggered and filled with outrage, or that's manipulating kids from the first moment they wake up in the morning, when they see photo after photo after photo of their friends having fun without them. I mean, it's always true that people are having fun without us. But the question is, do I fill your day and your morning with evidence of that? Do I just make you believe that's the only thing that's going on? I mean, this is crazy.

I agree. And I think it's so corrosive for kids in general. You know, when you see that the suicide rate for teenage girls has, I think, tripled in the last ten years, and if you look at anxiety being the number one mental health disorder in this country, and isolation being one of the biggest problems, it's so ironic: in an era when we're all connected, we've never been so lonely.

Yeah.
Well, and I think Sherry Turkle was just so prescient with her book Alone Together about this: that, you know, having the virtual experience of connection, of seeing all these people, doesn't mean that, as an animal sitting there with a body and breathing, we don't need that physical presence. Right? It feels a lot better to be physically present with someone than to sit the entire day on a screen getting that virtual form of presence.

Can I ask you, Tristan, a little more about a firm like Cambridge Analytica? Because I don't think people fully understand how these firms, or even tech companies, are able to kind of gather and absorb this highly personalized information and then spew stuff out at you. So if I'm talking on my phone with a friend about the fact that I really want to buy a new winter coat, can they pick that up and then start sending me ads for winter coats? Because that's happened to a couple of my friends. They've experimented. They talked about Emmy Rossum, just for fun, to see if suddenly they'd get all this stuff about Emmy Rossum (who I really love, and I think she's a great person, no dis on Emmy), but she started getting all this information about Emmy Rossum. So that, to me, is super creepy. I mean, how do they find out all this stuff about us?

Well, I mean, especially Facebook, I mean, their business. I mean, how much have you paid for your Facebook account?

Zero dollars. But we're paying in our attention, obviously.

Well, that's interesting, right? So we don't pay for Facebook. So the question is, who is paying them? And it's the advertiser, which means that all those people who go to work at Facebook, as much as they say, hey, we want to make the world more open and connected, I'm sorry, but if your business model is serving not people but advertisers, then guess what: all of that information, over time, is going to be used more and more to make the advertisers more successful. Otherwise those advertisers aren't going to spend their money on Facebook. So in the long run, Facebook has to be, you know, helping their advertisers be successful. And what that's going to mean is enabling them to access more and more personal information when they target, uh, you know, ads to you. And so when you send a message to someone else on Facebook, you know, it'll pick up those keywords, and that will be part of the way that it starts enabling advertisers to target you over time.

Well, let's talk about Facebook in particular for a second. Facebook and Instagram. Um, we sometimes use the euphemism of social media companies, different technology platforms, when so many of them are really just Facebook.

Yeah, I think we're really in large part just talking about Facebook.

So the question is, is Facebook the worst offender here? Are they doing things that are qualitatively different and more harmful than Google, for example?

Um, yeah. I think the challenge is, let's just ground, for a second, just how many people use Facebook. There are more than two billion people on Facebook. That's more than a quarter of the world's population. Um, that's about the number of notional followers of Christianity, who every day are jacked into this system where they start looking at a feed and their thoughts, you know, start flowing into them from this one company in California, you know, with a handful of engineers who make the design and algorithm decisions. And so the reason I say that is that no matter what Facebook does, it's creating exponentially complex consequences for two billion people. So, you know, how many engineers at Facebook speak Burmese? I mean, I don't know, one? Zero, maybe? And yet Facebook is the number one way for people in Burma to access the Internet. Facebook is the Internet if you're in Burma.
And it started amplifying 547 00:28:21,560 --> 00:28:24,399 Speaker 1: genocides in Burma because it was amplifying this fake news 548 00:28:24,600 --> 00:28:27,360 Speaker 1: about a specific minority group. So the point is that 549 00:28:27,680 --> 00:28:30,120 Speaker 1: they can't reign in this machine that they've created. They've 550 00:28:30,160 --> 00:28:33,680 Speaker 1: created this automated machine that, because there's no humans, they're 551 00:28:33,720 --> 00:28:35,720 Speaker 1: figuring out which thoughts should we put in people's minds. 552 00:28:35,720 --> 00:28:37,639 Speaker 1: It's just the machines calculating what it should put in 553 00:28:37,680 --> 00:28:41,160 Speaker 1: people's minds based on what's most engaging, and it's starting 554 00:28:41,200 --> 00:28:44,160 Speaker 1: to push thoughts into people's minds and elections and democracies 555 00:28:44,200 --> 00:28:47,920 Speaker 1: around the world in languages that the engineers at California 556 00:28:47,960 --> 00:28:50,880 Speaker 1: and Menlo Park they don't even speak. And so they've 557 00:28:50,920 --> 00:28:54,040 Speaker 1: created this kind of monster that they no longer control. 558 00:28:54,320 --> 00:28:56,400 Speaker 1: So now what's going on is they're trying to go 559 00:28:56,480 --> 00:28:58,480 Speaker 1: back and say, how do we reigin this thing in 560 00:28:58,560 --> 00:29:01,000 Speaker 1: How do we quickly, you know, throw all the firefighters 561 00:29:01,000 --> 00:29:02,959 Speaker 1: at this thing and try to save it as much 562 00:29:03,000 --> 00:29:06,720 Speaker 1: as possible. But I think the challenges we trusted them 563 00:29:06,840 --> 00:29:09,720 Speaker 1: to try and be thoughtful about this whole thing, and 564 00:29:09,760 --> 00:29:11,760 Speaker 1: they didn't see from the very beginning that this is 565 00:29:11,760 --> 00:29:13,560 Speaker 1: the level of influence they had. That they aren't just 566 00:29:13,920 --> 00:29:15,640 Speaker 1: you know, helping us keep in touch with our friends. 567 00:29:15,640 --> 00:29:19,440 Speaker 1: They're a political actor. They're steering elections, They're creating addictions, 568 00:29:19,480 --> 00:29:22,480 Speaker 1: they're making people lonely. I know that some activists and 569 00:29:22,560 --> 00:29:25,800 Speaker 1: Mimar wrote a letter to Mark Zuckerberg about this, about 570 00:29:25,840 --> 00:29:28,160 Speaker 1: the very thing that you were describing in Burma, and 571 00:29:28,200 --> 00:29:32,880 Speaker 1: he wrote them back personally. But you know, to your point, 572 00:29:32,920 --> 00:29:36,560 Speaker 1: all the people in the world, it's impossible to monitor 573 00:29:37,360 --> 00:29:40,240 Speaker 1: these things, isn't it. So before we talk about what 574 00:29:40,320 --> 00:29:43,440 Speaker 1: Facebook can and cannot do, let me ask you about 575 00:29:43,480 --> 00:29:46,640 Speaker 1: Mark Zuckerberg's testimony, because I'm sure you watched it with 576 00:29:46,840 --> 00:29:50,080 Speaker 1: a great amount of interest. Um, what did you think 577 00:29:50,160 --> 00:29:53,040 Speaker 1: about it? And did you think he was questioned vigorously 578 00:29:53,240 --> 00:29:56,640 Speaker 1: enough by members of Congress, you know, some of whom 579 00:29:56,680 --> 00:30:00,800 Speaker 1: still have flip phones. Well, I think that is the issue. Fundamentally. 
580 00:30:00,840 --> 00:30:05,200 Speaker 1: It is clear that, you know, the modern realities of 581 00:30:05,240 --> 00:30:10,920 Speaker 1: how technology companies work have far outpaced the governance capacity 582 00:30:10,960 --> 00:30:14,200 Speaker 1: of really probably any government to stay in touch with, 583 00:30:14,880 --> 00:30:17,080 Speaker 1: you know, all of the ways that these things are 584 00:30:17,120 --> 00:30:20,360 Speaker 1: working and evolving in the business model. And you know, 585 00:30:20,480 --> 00:30:23,960 Speaker 1: we were involved in the November first hearings, um, myself 586 00:30:23,960 --> 00:30:27,200 Speaker 1: and a few other people, in briefing major Congress members. Unfortunately, 587 00:30:27,240 --> 00:30:29,520 Speaker 1: there wasn't as much time for these latest hearings, and 588 00:30:29,560 --> 00:30:32,400 Speaker 1: I think it came across as, you know, a whole 589 00:30:32,400 --> 00:30:35,120 Speaker 1: hodgepodge of issues. Some people are asking about privacy, some 590 00:30:35,160 --> 00:30:38,000 Speaker 1: people are asking about housing discrimination ads, some people are 591 00:30:38,080 --> 00:30:42,120 Speaker 1: asking about election integrity, data breaches, Cambridge Analytica. So to 592 00:30:42,200 --> 00:30:44,480 Speaker 1: the average person, it feels like, oh man, this is 593 00:30:44,520 --> 00:30:47,640 Speaker 1: about just a bunch of unrelated things. But I actually 594 00:30:47,640 --> 00:30:50,440 Speaker 1: want to reframe that. The reason why it felt like 595 00:30:50,480 --> 00:30:53,520 Speaker 1: it's about a bunch of unrelated things, and that these 596 00:30:53,560 --> 00:30:57,480 Speaker 1: harms are showing up everywhere, is because Facebook affects every 597 00:30:57,600 --> 00:31:03,240 Speaker 1: part of society. It affects election campaign pricing, it affects 598 00:31:03,440 --> 00:31:07,400 Speaker 1: housing discrimination ads, where some groups get discriminated against over others. 599 00:31:07,440 --> 00:31:11,480 Speaker 1: It affects elections, it affects people's privacy. And I think 600 00:31:11,520 --> 00:31:13,640 Speaker 1: the problem is, if you think about it, Facebook is almost 601 00:31:13,640 --> 00:31:15,760 Speaker 1: like a global government. I mean, how much power does 602 00:31:15,800 --> 00:31:18,960 Speaker 1: Mark Zuckerberg have, even over someone like President Trump, at 603 00:31:18,960 --> 00:31:23,160 Speaker 1: controlling people's thoughts and actions, right? I mean, you could 604 00:31:23,240 --> 00:31:26,920 Speaker 1: argue that he has more power that's completely unaccountable, except 605 00:31:26,960 --> 00:31:29,880 Speaker 1: to him, since he's the major shareholder, at doing whatever 606 00:31:29,920 --> 00:31:32,880 Speaker 1: he wants, and we're sort of left to, you know, 607 00:31:33,000 --> 00:31:35,400 Speaker 1: his moral compass and whatever happens, you know, to be 608 00:31:35,480 --> 00:31:38,080 Speaker 1: running between his eyes and ears as the way 609 00:31:38,120 --> 00:31:40,080 Speaker 1: he's thinking about this. Well, what did you think of 610 00:31:40,120 --> 00:31:43,719 Speaker 1: his testimony? Well, you know, I thought he was dodging 611 00:31:43,720 --> 00:31:48,440 Speaker 1: the fundamental issue, which is that the business model of 612 00:31:48,840 --> 00:31:53,040 Speaker 1: advertising and keeping people engaged is the problem.
All of 613 00:31:53,080 --> 00:31:57,240 Speaker 1: these issues come down to that one issue, and so 614 00:31:57,400 --> 00:32:01,120 Speaker 1: he was trying to distract Congress from that core issue: 615 00:32:01,560 --> 00:32:05,280 Speaker 1: that the business model is what incentivizes them to offer 616 00:32:05,360 --> 00:32:08,520 Speaker 1: better ways for advertisers to take people's personal information and 617 00:32:08,560 --> 00:32:11,400 Speaker 1: target against it, and offer better and better ways to 618 00:32:11,480 --> 00:32:13,760 Speaker 1: keep people hooked on Facebook for as long as possible. 619 00:32:13,920 --> 00:32:15,920 Speaker 1: And it's also a little misleading when he said that 620 00:32:15,960 --> 00:32:19,480 Speaker 1: they don't sell data to anyone, because what they actually 621 00:32:19,520 --> 00:32:24,000 Speaker 1: do is they allow advertisers to micro target people based 622 00:32:24,000 --> 00:32:27,360 Speaker 1: on their data. So even though the advertisers themselves don't 623 00:32:27,360 --> 00:32:29,680 Speaker 1: see the data, they get all the benefits of that data. 624 00:32:29,720 --> 00:32:32,040 Speaker 1: That's right, and Facebook is quick to sort of, 625 00:32:32,200 --> 00:32:34,720 Speaker 1: you know, smirk when Congress asks them, why do you 626 00:32:34,760 --> 00:32:37,240 Speaker 1: sell people's data, because they don't do that. But the 627 00:32:37,280 --> 00:32:40,600 Speaker 1: point is that it is equivalently the same, because advertisers 628 00:32:40,600 --> 00:32:44,000 Speaker 1: are paying Facebook not to own and access your data, 629 00:32:44,040 --> 00:32:46,480 Speaker 1: not to use it and then spread it somewhere else, but 630 00:32:46,520 --> 00:32:50,280 Speaker 1: they're paying to target ads specifically to those people. You 631 00:32:50,320 --> 00:32:54,280 Speaker 1: can go on Facebook and you can target, um, conspiracy 632 00:32:54,320 --> 00:32:57,360 Speaker 1: theorists by knowing what keywords they tend to identify with. 633 00:32:57,720 --> 00:33:01,120 Speaker 1: So are you saying that until and unless Facebook changes 634 00:33:01,320 --> 00:33:06,480 Speaker 1: its fundamental business model, the problem can't be solved? That's 635 00:33:06,640 --> 00:33:09,720 Speaker 1: right. And the thing is that we can't change 636 00:33:09,720 --> 00:33:12,280 Speaker 1: their business model overnight, and they can't change their business 637 00:33:12,280 --> 00:33:15,120 Speaker 1: model overnight, even if they actually saw these issues. So 638 00:33:15,200 --> 00:33:16,880 Speaker 1: I know that there's a lot of work, and we 639 00:33:16,880 --> 00:33:19,160 Speaker 1: should celebrate the work that they're trying to do right 640 00:33:19,200 --> 00:33:22,000 Speaker 1: now to try and peel away the problem against their 641 00:33:22,000 --> 00:33:24,840 Speaker 1: own financial interests. And I think that there are some authentic 642 00:33:24,920 --> 00:33:27,320 Speaker 1: things that they're doing there. But the question is, there's 643 00:33:27,320 --> 00:33:29,840 Speaker 1: an upper bound to that. And do they see 644 00:33:30,120 --> 00:33:33,000 Speaker 1: that the fundamental heart of all of these problems comes 645 00:33:33,040 --> 00:33:36,200 Speaker 1: down to their business model? They're not on our team. 646 00:33:36,240 --> 00:33:39,479 Speaker 1: They're not on democracy's team to help strengthen the fabric 647 00:33:39,480 --> 00:33:41,640 Speaker 1: of society.
So long as the people who pay them 648 00:33:42,000 --> 00:33:44,120 Speaker 1: are the advertisers, they're not going to be able 649 00:33:44,160 --> 00:33:46,920 Speaker 1: to solve these problems. So if you were Mark Zuckerberg 650 00:33:47,320 --> 00:33:49,600 Speaker 1: and you were in charge of Facebook, what would you 651 00:33:49,640 --> 00:33:52,840 Speaker 1: be doing to put Facebook back on the side of 652 00:33:52,960 --> 00:33:57,000 Speaker 1: democracy and the American people and everything that's good and 653 00:33:57,080 --> 00:33:59,320 Speaker 1: just in the world? The first thing I would do 654 00:34:00,080 --> 00:34:03,520 Speaker 1: isn't just say, I'm sorry, and there's this one bad 655 00:34:03,560 --> 00:34:06,280 Speaker 1: actor way over there in the corner called Cambridge Analytica, 656 00:34:06,280 --> 00:34:09,600 Speaker 1: but Facebook is fine. I would tell the world, I'm 657 00:34:09,719 --> 00:34:12,880 Speaker 1: sorry that I didn't see that our business model was 658 00:34:12,920 --> 00:34:17,160 Speaker 1: so corrosive, and I feel bad about that. And now 659 00:34:17,200 --> 00:34:19,920 Speaker 1: what we're gonna do is start a transition plan to 660 00:34:19,960 --> 00:34:23,160 Speaker 1: get off of this business model. And here's how we're 661 00:34:23,200 --> 00:34:25,000 Speaker 1: gonna do that, and here's how much time it's gonna take. 662 00:34:25,000 --> 00:34:27,160 Speaker 1: It's not gonna happen instantly, but here's how we're gonna 663 00:34:27,160 --> 00:34:30,160 Speaker 1: work on that. Here's why, and hey, shareholders, here's why 664 00:34:30,160 --> 00:34:32,239 Speaker 1: we're going to be regulated if we don't do this 665 00:34:32,280 --> 00:34:35,399 Speaker 1: in the long run anyway. And I would start with that. 666 00:34:35,640 --> 00:34:37,760 Speaker 1: I was going to say trust. Now, won't another 667 00:34:37,840 --> 00:34:41,719 Speaker 1: behemoth just take Facebook's place? I mean, I hate to 668 00:34:41,760 --> 00:34:45,520 Speaker 1: say it, but when it comes to profits versus ethics, 669 00:34:45,640 --> 00:34:49,279 Speaker 1: profits usually win, don't they? That's right. But that's 670 00:34:49,280 --> 00:34:52,960 Speaker 1: all based on consumer demand. So right now, there's essentially 671 00:34:52,960 --> 00:34:55,440 Speaker 1: Facebook as a monopoly. Um, and it's a new species 672 00:34:55,440 --> 00:34:58,200 Speaker 1: of monopoly, because all of our antitrust, anti-monopoly 673 00:34:58,280 --> 00:35:03,000 Speaker 1: laws, um, uh, they don't handle zero-price monopolies. 674 00:35:03,120 --> 00:35:05,359 Speaker 1: Usually with a monopoly, they price discriminate so they can 675 00:35:05,400 --> 00:35:07,320 Speaker 1: control pricing. They can offer one price to you and 676 00:35:07,440 --> 00:35:09,600 Speaker 1: a different price to someone else, and no one can 677 00:35:09,600 --> 00:35:12,720 Speaker 1: stop them because, you know, they're a monopoly. The challenge 678 00:35:12,719 --> 00:35:15,840 Speaker 1: here is it's a monopoly that's free. So all of 679 00:35:15,840 --> 00:35:19,320 Speaker 1: our antitrust law that's normally about regulating these things doesn't 680 00:35:19,360 --> 00:35:22,960 Speaker 1: handle free monopolies.
And so I think if we change 681 00:35:23,000 --> 00:35:26,080 Speaker 1: consumer demand and people realize that Facebook doesn't have our 682 00:35:26,120 --> 00:35:28,839 Speaker 1: best interests at heart, and it never will unless they 683 00:35:28,920 --> 00:35:31,839 Speaker 1: change their business model, then we're starting to see this 684 00:35:31,880 --> 00:35:34,759 Speaker 1: movement where people do want an alternative. We're not there yet, 685 00:35:34,840 --> 00:35:37,000 Speaker 1: but I think slowly that's what's happening. I don't know, 686 00:35:37,120 --> 00:35:39,720 Speaker 1: you know, sorry to play devil's advocate for a moment, 687 00:35:39,760 --> 00:35:42,000 Speaker 1: but it seems to me, Tristan, that a lot of 688 00:35:42,040 --> 00:35:47,359 Speaker 1: this depends on consumer demand, or lack thereof. And there 689 00:35:47,440 --> 00:35:50,040 Speaker 1: was a big thing, oh, you know, people were going 690 00:35:50,080 --> 00:35:53,920 Speaker 1: to quit Facebook. Well, to quote Brokeback Mountain, I just 691 00:35:54,000 --> 00:35:56,880 Speaker 1: can't quit you, Facebook. I mean, it's really, really hard 692 00:35:57,080 --> 00:36:01,799 Speaker 1: because people are too addicted to live without it. And furthermore, 693 00:36:01,960 --> 00:36:04,080 Speaker 1: there was an interesting piece in The New York Times 694 00:36:04,200 --> 00:36:06,759 Speaker 1: by my friend Andrew Ross Sorkin, and the headline was, 695 00:36:06,840 --> 00:36:10,440 Speaker 1: our privacy has eroded, and we're okay with that. I mean, 696 00:36:10,560 --> 00:36:12,600 Speaker 1: don't you think that people... It's sort of like being 697 00:36:12,640 --> 00:36:16,000 Speaker 1: a frog in a slowly boiling pot of water. People 698 00:36:16,120 --> 00:36:19,480 Speaker 1: have become, you know, complacent and inured to it 699 00:36:19,520 --> 00:36:23,160 Speaker 1: and have pretty much accepted this is the new normal. Well, 700 00:36:23,239 --> 00:36:26,040 Speaker 1: I would frame it a little bit differently. I would 701 00:36:26,120 --> 00:36:29,719 Speaker 1: say that it's a testament to how much of a fabric, 702 00:36:30,280 --> 00:36:33,080 Speaker 1: a basic fabric of our lives, it's become that we can't 703 00:36:33,120 --> 00:36:34,840 Speaker 1: just delete Facebook. So when I say this, I'm not 704 00:36:34,880 --> 00:36:37,279 Speaker 1: saying that consumer demand is going to change and everyone's just 705 00:36:37,280 --> 00:36:39,840 Speaker 1: gonna quit, because, you know, in fact, people I know 706 00:36:39,880 --> 00:36:42,080 Speaker 1: at Facebook say, with a smirk on their face, well, 707 00:36:42,680 --> 00:36:45,280 Speaker 1: if you don't like the product, just use a different product. 708 00:36:45,400 --> 00:36:47,279 Speaker 1: It's like, okay, great, so I'll just switch to that 709 00:36:47,360 --> 00:36:49,920 Speaker 1: other two billion person social network that's sitting right 710 00:36:49,960 --> 00:36:52,400 Speaker 1: there to my right. You know, it doesn't exist. And 711 00:36:52,480 --> 00:36:54,799 Speaker 1: right now, what we need to do is change the 712 00:36:54,880 --> 00:36:58,520 Speaker 1: kind of the government and consumer context so that those 713 00:36:58,560 --> 00:37:01,560 Speaker 1: alternatives can start to exist, because right now it's harder and 714 00:37:01,600 --> 00:37:03,759 Speaker 1: harder for them to innovate.
And we need to make it 715 00:37:03,800 --> 00:37:05,960 Speaker 1: easier for the alternatives to exist, because we're not going 716 00:37:06,040 --> 00:37:09,040 Speaker 1: to quit. I mean, I still use Facebook every day 717 00:37:09,680 --> 00:37:11,440 Speaker 1: because it's the best. It's the only way I can 718 00:37:11,480 --> 00:37:14,080 Speaker 1: get my ideas out to a large enough audience. And 719 00:37:14,120 --> 00:37:15,840 Speaker 1: that speaks to the amount of power that they have, 720 00:37:16,040 --> 00:37:18,040 Speaker 1: right? And the key thing here is also 721 00:37:18,120 --> 00:37:20,720 Speaker 1: that it's not just an addiction. They've really taken 722 00:37:20,800 --> 00:37:24,080 Speaker 1: over the fundamental communications fabric and the fabric of staying 723 00:37:24,080 --> 00:37:26,160 Speaker 1: in touch with the people that matter to us. We 724 00:37:26,200 --> 00:37:28,520 Speaker 1: don't have an alternative, and that's why people aren't just 725 00:37:28,520 --> 00:37:30,680 Speaker 1: going to delete it. But that doesn't mean people don't 726 00:37:30,719 --> 00:37:32,279 Speaker 1: have a problem with it. But aren't there some good 727 00:37:32,280 --> 00:37:35,280 Speaker 1: things about it? I mean about social media writ large, 728 00:37:35,360 --> 00:37:39,200 Speaker 1: not just Facebook, but all these different modalities, in terms 729 00:37:39,239 --> 00:37:42,760 Speaker 1: of, like, galvanizing people, getting like-minded people to stand 730 00:37:42,840 --> 00:37:45,680 Speaker 1: up and want more sensible gun laws, for example, or, 731 00:37:46,080 --> 00:37:49,759 Speaker 1: you know, stopping genocide in different places because there's 732 00:37:49,800 --> 00:37:53,799 Speaker 1: kind of a grassroots uprising. Uh, the fact that, yes, 733 00:37:53,840 --> 00:37:56,520 Speaker 1: we're lonely, but it is nice to be able to 734 00:37:56,920 --> 00:38:00,000 Speaker 1: look at video of your grandchild if you live far away. 735 00:38:00,040 --> 00:38:03,040 Speaker 1: I mean, I can think of countless examples of the 736 00:38:03,120 --> 00:38:06,360 Speaker 1: positive impact this is having. I mean, are you such a 737 00:38:06,400 --> 00:38:10,399 Speaker 1: Debbie Downer, or is there anything that... Well, I think 738 00:38:10,400 --> 00:38:12,480 Speaker 1: this is such a critical thing, because this 739 00:38:12,480 --> 00:38:15,400 Speaker 1: conversation often makes it seem as if there's this 740 00:38:15,440 --> 00:38:17,960 Speaker 1: all-or-nothing choice: like, either we have Facebook and 741 00:38:18,040 --> 00:38:20,640 Speaker 1: we have these benefits and these costs, or 742 00:38:20,680 --> 00:38:23,080 Speaker 1: we just don't use Facebook at all. And the question 743 00:38:23,160 --> 00:38:25,000 Speaker 1: is, is there a middle way? And I said 744 00:38:25,040 --> 00:38:27,040 Speaker 1: this in my TED talk, that it's like, you know, 745 00:38:27,120 --> 00:38:29,960 Speaker 1: we can have social movements that take off with positive 746 00:38:29,960 --> 00:38:34,839 Speaker 1: messages without, most of the time, creating viral outrage. 747 00:38:34,880 --> 00:38:37,120 Speaker 1: I mean, these positive social movements are 748 00:38:37,239 --> 00:38:40,520 Speaker 1: very, very rare in comparison to the daily outrage that 749 00:38:40,600 --> 00:38:43,799 Speaker 1: we experience on these things.
The experience of knowing that 750 00:38:43,840 --> 00:38:45,680 Speaker 1: a friend you haven't seen for ten years is 751 00:38:45,719 --> 00:38:48,520 Speaker 1: also visiting a city, um, something you find out about 752 00:38:48,520 --> 00:38:51,160 Speaker 1: only because you have a Facebook account, that experience 753 00:38:51,280 --> 00:38:54,279 Speaker 1: is great, but it's very, very, very rare compared to 754 00:38:54,320 --> 00:38:57,080 Speaker 1: all the time people spend sort of just mindlessly browsing, 755 00:38:57,080 --> 00:39:00,080 Speaker 1: which they self-report regretting. And I think there's a 756 00:39:00,080 --> 00:39:03,080 Speaker 1: different way to design all of this stuff. And that's, 757 00:39:03,080 --> 00:39:04,560 Speaker 1: by the way, what we're trying to do with our 758 00:39:04,640 --> 00:39:06,799 Speaker 1: work at the Center for Humane Technology: show 759 00:39:06,880 --> 00:39:08,839 Speaker 1: that there actually is a different way to design these 760 00:39:08,840 --> 00:39:11,520 Speaker 1: products that's not an all-or-nothing choice. So what 761 00:39:11,600 --> 00:39:14,080 Speaker 1: does a good Facebook look like? Is it a subscription 762 00:39:14,160 --> 00:39:18,000 Speaker 1: model where advertisers wouldn't be on the platform, where people 763 00:39:18,000 --> 00:39:21,080 Speaker 1: would pay to use it, and therefore the business of 764 00:39:21,160 --> 00:39:24,040 Speaker 1: social media isn't about capturing as much of our attention 765 00:39:24,080 --> 00:39:28,080 Speaker 1: as possible? Yeah, think of it like a utility. I mean, um, 766 00:39:28,120 --> 00:39:31,680 Speaker 1: you know, um, we would pay to have 767 00:39:31,840 --> 00:39:35,600 Speaker 1: access to a service that's all about benefiting our lives. 768 00:39:35,600 --> 00:39:37,880 Speaker 1: So what I mean by that is, right now, if 769 00:39:37,880 --> 00:39:39,880 Speaker 1: you walked into Facebook today 770 00:39:39,920 --> 00:39:42,040 Speaker 1: and you just interviewed a random sample of people and 771 00:39:42,080 --> 00:39:43,840 Speaker 1: you said, what are you doing with your time today, 772 00:39:43,880 --> 00:39:46,160 Speaker 1: you as an engineer at Facebook, you would find 773 00:39:46,280 --> 00:39:50,160 Speaker 1: every engineer is basically focused on one goal, which is, am 774 00:39:50,160 --> 00:39:54,480 Speaker 1: I keeping people engaged? Scrolling, clicking, liking, hooked or not. 775 00:39:54,840 --> 00:39:57,439 Speaker 1: They're only concerned with that goal. And instead, imagine we 776 00:39:57,560 --> 00:40:00,200 Speaker 1: paid for Facebook. All those engineers go to work and 777 00:40:00,239 --> 00:40:03,040 Speaker 1: they ask, how do we generate positive benefits in society? 778 00:40:03,040 --> 00:40:05,600 Speaker 1: How do we make this a resource? And Mark Zuckerberg 779 00:40:05,680 --> 00:40:08,800 Speaker 1: himself, back in two thousand five, used to talk about 780 00:40:08,880 --> 00:40:13,239 Speaker 1: Facebook as a social utility. He didn't try to defend 781 00:40:13,280 --> 00:40:16,160 Speaker 1: this stuff about news feeds and content and all this 782 00:40:16,239 --> 00:40:19,080 Speaker 1: kind of stuff, which came later, after they hinged 783 00:40:19,120 --> 00:40:22,239 Speaker 1: their success on the advertising model. Before that, it was 784 00:40:22,280 --> 00:40:24,839 Speaker 1: a utility. It was an address book.
It was something 785 00:40:24,880 --> 00:40:27,960 Speaker 1: I could use to make things happen in my life. 786 00:40:28,640 --> 00:40:31,000 Speaker 1: And I think, if we paid for Facebook, you wouldn't 787 00:40:31,040 --> 00:40:33,680 Speaker 1: just see news feeds without the ads. I'm talking about 788 00:40:33,719 --> 00:40:37,200 Speaker 1: a radically different service that is entirely built as a 789 00:40:37,320 --> 00:40:41,040 Speaker 1: utility to empower us to make new social choices together 790 00:40:41,239 --> 00:40:43,200 Speaker 1: that we wouldn't be able to make without Facebook. Things 791 00:40:43,239 --> 00:40:45,279 Speaker 1: like, oh, I can find out when my best friend 792 00:40:45,320 --> 00:40:48,239 Speaker 1: whom I haven't seen in ten years is visiting town. Um, 793 00:40:48,280 --> 00:40:50,680 Speaker 1: or strangers who, when they move to a 794 00:40:50,719 --> 00:40:52,520 Speaker 1: new city, can quickly find the groups that they 795 00:40:52,520 --> 00:40:54,520 Speaker 1: can be in touch with. But it wouldn't be built 796 00:40:54,560 --> 00:40:57,160 Speaker 1: around news feeds and mindless consumption, which is what it's 797 00:40:57,160 --> 00:41:00,000 Speaker 1: built around now. I want to ask about fake news, 798 00:41:00,160 --> 00:41:04,000 Speaker 1: because, listen, these seem to be long-term goals, Tristan, 799 00:41:04,160 --> 00:41:07,200 Speaker 1: but you know, we have important midterm elections coming 800 00:41:07,320 --> 00:41:11,920 Speaker 1: up in November, a hugely important presidential election coming up 801 00:41:11,960 --> 00:41:16,439 Speaker 1: after that, um, and so what, immediately, can be done 802 00:41:16,480 --> 00:41:21,719 Speaker 1: about the proliferation of fake news, these bots, Russian influence, 803 00:41:21,840 --> 00:41:25,520 Speaker 1: and what we're seeing, and general chaos when it comes 804 00:41:25,560 --> 00:41:29,640 Speaker 1: to this platform in general? Yeah, the thing that people 805 00:41:29,680 --> 00:41:33,000 Speaker 1: should understand is, you know, if I ask you the question, 806 00:41:33,400 --> 00:41:36,799 Speaker 1: how much more confident do you feel today, if an 807 00:41:36,840 --> 00:41:40,000 Speaker 1: election was to be held, that it is less vulnerable 808 00:41:40,040 --> 00:41:44,120 Speaker 1: to outside manipulation and influence than it was last time? Like, 809 00:41:44,280 --> 00:41:49,600 Speaker 1: zero point zero. Correct. These platforms are still highly, highly 810 00:41:49,680 --> 00:41:53,399 Speaker 1: vulnerable to not just Russia, but any state and even 811 00:41:53,440 --> 00:41:57,360 Speaker 1: non-state actors, China, North Korea. The playbook 812 00:41:57,400 --> 00:42:00,200 Speaker 1: is now out there, and now anyone can spend money 813 00:42:00,239 --> 00:42:03,359 Speaker 1: on Facebook to target information to the audiences that they want. 814 00:42:03,360 --> 00:42:04,919 Speaker 1: And it's not just the ads, by the way. There's 815 00:42:05,280 --> 00:42:07,640 Speaker 1: loads of other techniques. I work with ex-intelligence people 816 00:42:07,640 --> 00:42:11,160 Speaker 1: that know about those techniques, and it's still totally possible 817 00:42:11,200 --> 00:42:15,160 Speaker 1: to create impersonation accounts, to create fake images of things that 818 00:42:15,239 --> 00:42:17,959 Speaker 1: don't exist, these deep fakes where you can fake 819 00:42:18,280 --> 00:42:21,120 Speaker 1: videos of people saying things they didn't say.
820 00:42:21,320 --> 00:42:23,239 Speaker 1: All this is going to get worse, which is why 821 00:42:23,440 --> 00:42:25,319 Speaker 1: we're doing this work. In the short term, as 822 00:42:25,320 --> 00:42:28,200 Speaker 1: you said, Katie, if you care about the integrity 823 00:42:28,280 --> 00:42:30,840 Speaker 1: of our elections and our democracy, then you ought to 824 00:42:30,880 --> 00:42:34,200 Speaker 1: care about how these platforms reform their practices to better 825 00:42:34,239 --> 00:42:37,640 Speaker 1: protect them from manipulation. Is awareness part of the solution, though? 826 00:42:37,680 --> 00:42:39,360 Speaker 1: I feel like, you know, I had a good friend, 827 00:42:39,480 --> 00:42:41,960 Speaker 1: I've mentioned this a couple of times on our podcast, 828 00:42:42,000 --> 00:42:44,680 Speaker 1: who got this video and sent it to me about 829 00:42:44,760 --> 00:42:47,640 Speaker 1: Huma Abedin and her connections to the Muslim Brotherhood, and 830 00:42:47,640 --> 00:42:49,719 Speaker 1: she was appalled: have you seen this? And I 831 00:42:49,719 --> 00:42:52,440 Speaker 1: said, this is bullshit, this is not true stuff, 832 00:42:52,640 --> 00:42:55,399 Speaker 1: but it was so professionally done. And do you think 833 00:42:55,440 --> 00:42:58,239 Speaker 1: that people are at least more skeptical and are kind 834 00:42:58,280 --> 00:43:01,880 Speaker 1: of more educated in terms of their consumption of 835 00:43:01,960 --> 00:43:04,480 Speaker 1: this kind of thing, or am I just a 836 00:43:04,520 --> 00:43:09,319 Speaker 1: little more sophisticated than your average consumer? Well, I'm very 837 00:43:09,320 --> 00:43:11,920 Speaker 1: worried about this, because often we have this feeling that, well, 838 00:43:11,960 --> 00:43:14,799 Speaker 1: I'm the smart one, and it's only those very persuadable 839 00:43:14,840 --> 00:43:17,200 Speaker 1: people way over there that were influenced by Russia, but 840 00:43:17,239 --> 00:43:21,200 Speaker 1: not me, right? Look, as a magician as a kid, 841 00:43:21,239 --> 00:43:23,239 Speaker 1: you realize that everyone feels that way. Like, no, no, no, 842 00:43:23,320 --> 00:43:25,400 Speaker 1: you know, magic is only gonna work on 843 00:43:25,440 --> 00:43:27,440 Speaker 1: those dumb people, you know, who are uneducated. 844 00:43:27,440 --> 00:43:29,880 Speaker 1: But I have a PhD, so therefore I'm not manipulated. 845 00:43:30,040 --> 00:43:32,719 Speaker 1: It's actually the opposite. Usually, the people who are most 846 00:43:32,760 --> 00:43:35,239 Speaker 1: confident are the ones who are much easier to manipulate. 847 00:43:35,719 --> 00:43:38,359 Speaker 1: And, you know, we didn't always see society that way. 848 00:43:38,360 --> 00:43:41,440 Speaker 1: Back in the nineteen forties, um, the United States government 849 00:43:41,760 --> 00:43:44,360 Speaker 1: uh created, you know, the Committee for National Morale, uh, 850 00:43:44,400 --> 00:43:46,680 Speaker 1: and there was also the Institute for Propaganda Analysis, to 851 00:43:46,719 --> 00:43:51,719 Speaker 1: protect the American psyche from foreign influence. We recognized fundamentally 852 00:43:51,760 --> 00:43:54,920 Speaker 1: how vulnerable our population was to outside influence, and we 853 00:43:54,920 --> 00:43:57,960 Speaker 1: actually made it a funded government campaign to make 854 00:43:57,960 --> 00:44:00,359 Speaker 1: sure there were large public awareness campaigns.
And I think 855 00:44:00,680 --> 00:44:03,319 Speaker 1: Facebook and other companies need to do much better and 856 00:44:03,360 --> 00:44:05,200 Speaker 1: step up their job and spend millions and millions of 857 00:44:05,280 --> 00:44:08,520 Speaker 1: dollars on making sure people are aware of these conspiracy 858 00:44:08,520 --> 00:44:11,960 Speaker 1: theories. It needs deliberate campaigns, because we're very vulnerable. So 859 00:44:12,000 --> 00:44:14,439 Speaker 1: we need some more awareness, for sure, but we also 860 00:44:14,520 --> 00:44:16,640 Speaker 1: need the platforms to crack down on the ways that 861 00:44:16,719 --> 00:44:20,080 Speaker 1: they continue to be vulnerable to outside influence. So what 862 00:44:20,080 --> 00:44:22,719 Speaker 1: you're saying is that, for all the rhetoric, Facebook really 863 00:44:22,760 --> 00:44:26,240 Speaker 1: hasn't cleaned up the problem of fakes and foreign influence, 864 00:44:26,239 --> 00:44:28,560 Speaker 1: and it could have as profound an effect this year 865 00:44:28,600 --> 00:44:31,640 Speaker 1: as it did before? It could totally have as profound 866 00:44:31,640 --> 00:44:34,000 Speaker 1: an effect this year as it did then. Now, I do want 867 00:44:34,040 --> 00:44:36,640 Speaker 1: to say there's a lot of people working really hard, 868 00:44:36,760 --> 00:44:39,520 Speaker 1: I'm sure, at Facebook and at Twitter to try and 869 00:44:39,560 --> 00:44:42,279 Speaker 1: clean this up. My point is just that they're not 870 00:44:42,400 --> 00:44:45,600 Speaker 1: nearly close enough to make us feel confident about our 871 00:44:45,640 --> 00:44:48,840 Speaker 1: elections not being manipulated. So to close the gap, I 872 00:44:48,840 --> 00:44:50,840 Speaker 1: think we need to be highly aware and spread this 873 00:44:50,880 --> 00:44:53,759 Speaker 1: message to everybody we know that these conspiracy theories are 874 00:44:53,760 --> 00:44:56,160 Speaker 1: going to be very compelling and look very true, and 875 00:44:56,200 --> 00:44:58,560 Speaker 1: they'll spread really quickly. But, you know, 876 00:44:58,560 --> 00:45:01,000 Speaker 1: there's a reason we have a five-second delay on television. 877 00:45:01,440 --> 00:45:04,560 Speaker 1: You know, we don't want to just automatically broadcast everything 878 00:45:04,640 --> 00:45:07,360 Speaker 1: instantly to millions of people and have it reshared that 879 00:45:07,520 --> 00:45:11,640 Speaker 1: fast across networks. Would Donald Trump have been elected president without Facebook? 880 00:45:13,520 --> 00:45:16,560 Speaker 1: And I'm not saying that with a specific political bias. 881 00:45:16,600 --> 00:45:19,400 Speaker 1: I can just tell you that, from the perspective of 882 00:45:19,840 --> 00:45:24,960 Speaker 1: how media works, social media was critical to his election. 883 00:45:25,719 --> 00:45:29,239 Speaker 1: That's pretty chilling. What about artificial intelligence? Just how 884 00:45:29,239 --> 00:45:32,759 Speaker 1: concerned are you about that, and this trend to give 885 00:45:32,800 --> 00:45:37,120 Speaker 1: more and more decision-making authority to robots or to, 886 00:45:38,000 --> 00:45:42,400 Speaker 1: you know, algorithms or to technology in general, versus having 887 00:45:42,480 --> 00:45:46,240 Speaker 1: some kind of human judgment involved? Yeah, this is really 888 00:45:46,239 --> 00:45:49,600 Speaker 1: the critical thing. When we think of artificial intelligence,
people think 889 00:45:49,640 --> 00:45:53,239 Speaker 1: of the Terminator movies and Arnold Schwarzenegger and, you know, 890 00:45:53,400 --> 00:45:55,960 Speaker 1: death bots or something like that, and they think it's 891 00:45:56,000 --> 00:45:58,080 Speaker 1: all about the future. Oh, in the future we'll worry 892 00:45:58,080 --> 00:46:00,359 Speaker 1: the AI is going to kill everybody or something like that. 893 00:46:00,719 --> 00:46:04,400 Speaker 1: And what this misses is that we already live inside 894 00:46:04,400 --> 00:46:07,760 Speaker 1: of a system that is governed and run by artificial 895 00:46:07,800 --> 00:46:10,560 Speaker 1: intelligence algorithms. And that's because when you open up a 896 00:46:10,560 --> 00:46:13,439 Speaker 1: Facebook news feed, that's an AI. It's trying to figure 897 00:46:13,480 --> 00:46:15,440 Speaker 1: out what can I show you that's going to keep 898 00:46:15,440 --> 00:46:17,759 Speaker 1: you hooked. When you open up YouTube and you wake 899 00:46:17,840 --> 00:46:19,840 Speaker 1: up two hours later saying, what the hell just happened 900 00:46:19,840 --> 00:46:23,400 Speaker 1: with my time? The reason was because 901 00:46:23,440 --> 00:46:26,000 Speaker 1: it was an AI that was playing chess against your mind. 902 00:46:26,000 --> 00:46:28,279 Speaker 1: Think of your mind as a chessboard. 903 00:46:28,360 --> 00:46:30,000 Speaker 1: It's sitting there, and it thinks it knows what it 904 00:46:30,040 --> 00:46:32,279 Speaker 1: wants to do and what goals it has. But 905 00:46:32,480 --> 00:46:34,799 Speaker 1: YouTube is sitting there trying to play chess against your mind, 906 00:46:34,880 --> 00:46:37,920 Speaker 1: asking, what are the videos I can put on AutoPlay 907 00:46:38,000 --> 00:46:40,200 Speaker 1: next that will keep you on here for as long 908 00:46:40,280 --> 00:46:43,520 Speaker 1: as possible? And just like when Garry Kasparov played the 909 00:46:43,560 --> 00:46:46,719 Speaker 1: AI at chess and Garry lost, it's 910 00:46:46,760 --> 00:46:49,520 Speaker 1: because the AI was seeing way more moves ahead on 911 00:46:49,560 --> 00:46:52,760 Speaker 1: the chessboard than Garry could see. And at some point 912 00:46:52,800 --> 00:46:55,440 Speaker 1: it can see so many moves ahead that it's done. 913 00:46:55,560 --> 00:46:58,279 Speaker 1: It's checkmate against humanity. And the issue that we 914 00:46:58,360 --> 00:47:01,080 Speaker 1: have right now is that when you land on YouTube 915 00:47:01,360 --> 00:47:03,920 Speaker 1: and it sees that many more moves ahead, it's not 916 00:47:03,960 --> 00:47:07,520 Speaker 1: aligned with our goals. And YouTube also drove fifteen billion 917 00:47:07,640 --> 00:47:11,399 Speaker 1: views to Alex Jones' conspiracy theory videos on its own 918 00:47:11,560 --> 00:47:15,160 Speaker 1: using AI. So we already have an AI problem right now, 919 00:47:15,200 --> 00:47:17,239 Speaker 1: which is why it's so critical to bring awareness to 920 00:47:17,280 --> 00:47:21,359 Speaker 1: these issues. In closing, Tristan, what are steps that all 921 00:47:21,400 --> 00:47:25,200 Speaker 1: of us can take to protect ourselves from being manipulated 922 00:47:25,239 --> 00:47:30,239 Speaker 1: by tech, and to keep our kids from getting addicted to it? Exactly, 923 00:47:30,320 --> 00:47:33,759 Speaker 1: until and unless these companies fix themselves.
What can we 924 00:47:33,840 --> 00:47:37,120 Speaker 1: do in the interim to protect ourselves as much 925 00:47:37,160 --> 00:47:41,040 Speaker 1: as we can? Absolutely. I mean, obviously we don't want 926 00:47:41,040 --> 00:47:44,400 Speaker 1: to wait five years for tech companies to finally come around, 927 00:47:44,400 --> 00:47:46,520 Speaker 1: so we have to have choices we can make right now. 928 00:47:46,560 --> 00:47:48,600 Speaker 1: And the good news is we can. Um, a bunch 929 00:47:48,640 --> 00:47:50,880 Speaker 1: of these, by the way, are on our website, Humane 930 00:47:50,920 --> 00:47:53,239 Speaker 1: tech dot com. Um. But you know, you can do 931 00:47:53,320 --> 00:47:56,440 Speaker 1: things like turn off all notifications. Most people don't realize 932 00:47:56,440 --> 00:48:00,640 Speaker 1: that notifications on your phone are mostly invented by machines 933 00:48:01,120 --> 00:48:03,399 Speaker 1: trying to find a way to get you coming back, saying, oh, 934 00:48:03,520 --> 00:48:05,600 Speaker 1: ten new friends posted some likes over here. Don't you 935 00:48:05,600 --> 00:48:07,839 Speaker 1: want to see what those are? Just turn all that 936 00:48:07,880 --> 00:48:10,440 Speaker 1: stuff off. Try to turn off as many notifications on your 937 00:48:10,440 --> 00:48:13,319 Speaker 1: phone as possible. Um, when I say this, you'll 938 00:48:13,360 --> 00:48:15,280 Speaker 1: hear it, but you probably won't do it. So actually 939 00:48:15,280 --> 00:48:18,480 Speaker 1: consider it, you know. Can I turn off notifications? Am 940 00:48:18,480 --> 00:48:21,320 Speaker 1: I willing to take that step? I really recommend it. Another 941 00:48:21,360 --> 00:48:23,600 Speaker 1: thing is setting your phone to grayscale. If you 942 00:48:23,600 --> 00:48:26,040 Speaker 1: feel really addicted, every time you turn your phone over 943 00:48:26,080 --> 00:48:28,560 Speaker 1: and you see those colors, it's lighting up some of 944 00:48:28,600 --> 00:48:31,440 Speaker 1: those dopamine rewards, just by looking at it. 945 00:48:31,719 --> 00:48:33,359 Speaker 1: And if you set your phone to grayscale, it 946 00:48:33,400 --> 00:48:37,240 Speaker 1: sort of cuts out about fifty percent of that, like, addictive feeling. 947 00:48:39,640 --> 00:48:42,920 Speaker 1: Our listeners should know. Yeah, on 948 00:48:42,920 --> 00:48:45,880 Speaker 1: an iPhone, you go into settings, and then General, 949 00:48:46,360 --> 00:48:49,120 Speaker 1: and then Accessibility, and then scroll all the way to 950 00:48:49,120 --> 00:48:53,720 Speaker 1: the bottom to Accessibility Shortcut. Well, this is a good example, 951 00:48:53,760 --> 00:48:55,279 Speaker 1: and this is where Apple, you know, when the new 952 00:48:55,320 --> 00:48:57,239 Speaker 1: iPhone comes out in a few months, they can 953 00:48:57,280 --> 00:48:59,560 Speaker 1: make this a lot easier for people. But I really 954 00:48:59,600 --> 00:49:02,440 Speaker 1: recommend just not using social media, or, you know, 955 00:49:02,880 --> 00:49:05,200 Speaker 1: at least uninstalling it from your phone and only using 956 00:49:05,200 --> 00:49:09,200 Speaker 1: it when you're on a desktop. Um, and that 957 00:49:09,239 --> 00:49:12,399 Speaker 1: will never happen. By the way, I did do that 958 00:49:13,160 --> 00:49:16,719 Speaker 1: with Facebook. I should probably do it with Instagram as well. 959 00:49:17,200 --> 00:49:20,480 Speaker 1: Um, Twitter
I couldn't give up, but it made a 960 00:49:20,520 --> 00:49:22,520 Speaker 1: huge difference in my life not to have Facebook on 961 00:49:22,560 --> 00:49:25,000 Speaker 1: my phone. I tried to turn my phone screen to 962 00:49:25,080 --> 00:49:29,239 Speaker 1: black and white per Tristan's recommendation. That lasted about fifteen 963 00:49:29,520 --> 00:49:32,160 Speaker 1: minutes because I missed it. I was like, the world 964 00:49:32,239 --> 00:49:35,040 Speaker 1: just doesn't feel good with a black and white phone. 965 00:49:35,280 --> 00:49:38,840 Speaker 1: So what's my problem, Tristan? You don't have a problem, Katie. No, 966 00:49:38,920 --> 00:49:40,719 Speaker 1: the thing is that I think people need to be 967 00:49:40,760 --> 00:49:43,799 Speaker 1: able to switch. I think people need 968 00:49:43,800 --> 00:49:45,600 Speaker 1: to be able to switch it back and forth between 969 00:49:45,640 --> 00:49:47,319 Speaker 1: color and black and white. The point isn't to stay 970 00:49:47,360 --> 00:49:49,640 Speaker 1: in black and white all the time. It's just that, um, 971 00:49:49,680 --> 00:49:51,440 Speaker 1: you know, keep it in black and white by default, 972 00:49:51,920 --> 00:49:53,600 Speaker 1: because then you actually get used to it. You'll find 973 00:49:53,640 --> 00:49:56,440 Speaker 1: it actually starts to feel overwhelming to see color. It 974 00:49:56,520 --> 00:49:58,719 Speaker 1: feels like, whoa, that's too much. And that's kind 975 00:49:58,719 --> 00:50:00,840 Speaker 1: of where you want to be: to remind yourself 976 00:50:00,840 --> 00:50:03,480 Speaker 1: that this is a tool in my pocket. It's only 977 00:50:03,520 --> 00:50:05,600 Speaker 1: a tool, and I want to make sure I'm putting 978 00:50:05,640 --> 00:50:08,000 Speaker 1: it in its place. That's the role I want it 979 00:50:08,040 --> 00:50:09,719 Speaker 1: to serve, and black and white can be a helpful reminder. 980 00:50:09,800 --> 00:50:12,440 Speaker 1: And what about for kids? Real quickly, Tristan, because a 981 00:50:12,440 --> 00:50:16,200 Speaker 1: lot of our listeners have children or grandchildren. I find 982 00:50:16,280 --> 00:50:19,040 Speaker 1: that it's quite depressing when I see parents at the 983 00:50:19,160 --> 00:50:22,959 Speaker 1: park or playground. They're, you know, they're 984 00:50:23,000 --> 00:50:25,040 Speaker 1: so busy looking at their phones. And I'm not being 985 00:50:25,120 --> 00:50:27,719 Speaker 1: judgy here, because I probably would be doing the same 986 00:50:27,760 --> 00:50:30,399 Speaker 1: thing if my kids were younger. But they are not 987 00:50:30,600 --> 00:50:33,520 Speaker 1: interacting with their kids in the same way. 988 00:50:33,640 --> 00:50:36,400 Speaker 1: Or people having dinner. I don't know, I'm sure you 989 00:50:36,440 --> 00:50:39,480 Speaker 1: all have seen this. Nothing is more depressing than to 990 00:50:39,560 --> 00:50:42,720 Speaker 1: see a couple at a nice restaurant on their phones, 991 00:50:42,960 --> 00:50:46,160 Speaker 1: or even a family. I get so bummed out. And 992 00:50:46,200 --> 00:50:48,799 Speaker 1: again, I'm not being judgy, listeners, but 993 00:50:49,200 --> 00:50:51,000 Speaker 1: it's just, I don't know, there's something about it that 994 00:50:51,040 --> 00:50:55,160 Speaker 1: just makes me incredibly sad. You know, this is the conversation.
995 00:50:55,320 --> 00:50:57,439 Speaker 1: I mean, this is the culture that these 996 00:50:57,440 --> 00:51:01,200 Speaker 1: design choices are creating, and, um, you know, this is 997 00:51:01,239 --> 00:51:02,839 Speaker 1: going to be up to us in the short term 998 00:51:02,880 --> 00:51:05,319 Speaker 1: to realize: is this the society we want to live in? 999 00:51:05,560 --> 00:51:08,279 Speaker 1: Is this who we are? Um, and 1000 00:51:08,320 --> 00:51:10,120 Speaker 1: I think that that's a question that all parents have 1001 00:51:10,239 --> 00:51:13,080 Speaker 1: to start asking themselves. Um, and you can use the 1002 00:51:13,080 --> 00:51:15,000 Speaker 1: tips that we just talked about in the meantime to 1003 00:51:15,200 --> 00:51:17,600 Speaker 1: try and use it less. And finally, finally, there are 1004 00:51:17,680 --> 00:51:20,040 Speaker 1: a couple of apps that you recommend to make this 1005 00:51:20,080 --> 00:51:22,080 Speaker 1: easier for all of us. Can you just briefly talk 1006 00:51:22,120 --> 00:51:25,920 Speaker 1: about some of those? Yes, you can download and install Moment, 1007 00:51:25,960 --> 00:51:27,799 Speaker 1: which is an app for the iPhone that tracks how 1008 00:51:27,880 --> 00:51:30,080 Speaker 1: much time you spend. You know, I mean, that helps 1009 00:51:30,120 --> 00:51:32,280 Speaker 1: to sort of see and get a picture of where 1010 00:51:32,280 --> 00:51:34,359 Speaker 1: your time is going. But I do really think that 1011 00:51:34,760 --> 00:51:37,200 Speaker 1: the much better thing is simply to look at your 1012 00:51:37,200 --> 00:51:39,480 Speaker 1: phone and, you know, delete most of the apps you're 1013 00:51:39,480 --> 00:51:42,680 Speaker 1: not using, turn off notifications, set it to black and white. Um, 1014 00:51:42,680 --> 00:51:45,480 Speaker 1: there's a couple other tips on the website. It 1015 00:51:45,480 --> 00:51:47,480 Speaker 1: can really make a big difference. And just simply being 1016 00:51:47,480 --> 00:51:49,840 Speaker 1: aware of this can change your relationship to it. And, 1017 00:51:50,200 --> 00:51:53,160 Speaker 1: you know, I think realizing that you're missing out 1018 00:51:53,160 --> 00:51:56,879 Speaker 1: on life. Talk about not being present, whether it means 1019 00:51:56,920 --> 00:51:59,000 Speaker 1: staring at your phone when you're in a new city 1020 00:51:59,080 --> 00:52:02,680 Speaker 1: and you're not even aware of your surroundings, or not 1021 00:52:03,160 --> 00:52:06,239 Speaker 1: ever being bored, which is the key to creativity. In 1022 00:52:06,280 --> 00:52:09,560 Speaker 1: the hour I did, Tristan, where you're featured, one 1023 00:52:09,600 --> 00:52:12,640 Speaker 1: of the things I realized and thought about is that the 1024 00:52:12,719 --> 00:52:15,759 Speaker 1: part of our prefrontal cortex that allows us to be 1025 00:52:15,920 --> 00:52:19,120 Speaker 1: creative, that fires up when we have a creative thought, 1026 00:52:19,200 --> 00:52:23,319 Speaker 1: cannot really operate or function when we're constantly distracted by 1027 00:52:23,320 --> 00:52:27,400 Speaker 1: our phone.
So, you know, every moment, whether it's driving, 1028 00:52:27,640 --> 00:52:29,600 Speaker 1: you know, if you're in the passenger seat of a car, 1029 00:52:29,840 --> 00:52:33,920 Speaker 1: or you're just, you know, waiting, nobody ever has this 1030 00:52:34,120 --> 00:52:37,839 Speaker 1: time to just sit and think, and that is absolutely 1031 00:52:37,960 --> 00:52:41,440 Speaker 1: key to being creative and coming up with ideas and 1032 00:52:41,480 --> 00:52:45,560 Speaker 1: having epiphanies of all kinds. And that's one thing I 1033 00:52:45,600 --> 00:52:50,360 Speaker 1: think about too, this constant stimulation, which only increases 1034 00:52:50,400 --> 00:52:54,000 Speaker 1: the cortisol and doesn't allow us to have a moment 1035 00:52:54,239 --> 00:52:59,719 Speaker 1: where we can think and consider and contemplate things. That's 1036 00:53:00,160 --> 00:53:02,719 Speaker 1: right. And that's why this is such an invisible 1037 00:53:02,760 --> 00:53:06,560 Speaker 1: problem beneath all other problems, because every choice we make 1038 00:53:06,560 --> 00:53:09,319 Speaker 1: in our lives is on top of the background of 1039 00:53:09,360 --> 00:53:13,600 Speaker 1: how our mind is feeling, thinking, um, choosing, and how 1040 00:53:13,640 --> 00:53:17,000 Speaker 1: we think, feel, and choose has basically, you know, never been 1041 00:53:17,000 --> 00:53:20,680 Speaker 1: more influenced by how our phones shape our attention, 1042 00:53:20,719 --> 00:53:23,640 Speaker 1: whether you care about creativity or mental health or loneliness. 1043 00:53:24,080 --> 00:53:26,279 Speaker 1: We're gonna be publishing something soon that's kind of a 1044 00:53:26,360 --> 00:53:29,719 Speaker 1: ledger of all of these negative externalities, these cultural harms 1045 00:53:29,760 --> 00:53:31,640 Speaker 1: on society, so that people can really see it all 1046 00:53:31,680 --> 00:53:34,600 Speaker 1: in one place, because the effects are so profound and 1047 00:53:34,600 --> 00:53:37,799 Speaker 1: so invisible. Well, I'm so proud of you raising these 1048 00:53:37,840 --> 00:53:41,880 Speaker 1: issues at the ripe old age of thirty four. Being 1049 00:53:42,440 --> 00:53:45,560 Speaker 1: thirty three, still. But I know The Atlantic called you 1050 00:53:45,640 --> 00:53:48,799 Speaker 1: the closest thing Silicon Valley has to a conscience, and 1051 00:53:49,280 --> 00:53:54,200 Speaker 1: I really appreciate everything you're doing, the consciousness you're raising 1052 00:53:54,320 --> 00:53:57,200 Speaker 1: about these issues. If people want to learn more, if they 1053 00:53:57,200 --> 00:54:00,319 Speaker 1: want to actually support what you're doing, Tristan, how 1054 00:54:00,360 --> 00:54:04,040 Speaker 1: can we do that? Yeah, just go to the website, 1055 00:54:04,400 --> 00:54:06,920 Speaker 1: humane tech dot com, which is for our Center for 1056 00:54:07,040 --> 00:54:10,200 Speaker 1: Humane Technology, and, um, there are ways to get involved, join 1057 00:54:10,239 --> 00:54:12,839 Speaker 1: the community, make a donation. Um, you know, we see 1058 00:54:12,840 --> 00:54:15,120 Speaker 1: this as a team effort, and this is team 1059 00:54:15,200 --> 00:54:18,439 Speaker 1: humanity, really, protecting and fighting for the world we want. 1060 00:54:22,360 --> 00:54:24,520 Speaker 1: I was more than a little freaked out after speaking 1061 00:54:24,560 --> 00:54:28,160 Speaker 1: with Tristan. What about you, Brian? For sure.
But knowledge 1062 00:54:28,200 --> 00:54:30,000 Speaker 1: is power, as they say, and I think it's time 1063 00:54:30,000 --> 00:54:31,799 Speaker 1: for a lot of us to take a hard look 1064 00:54:31,880 --> 00:54:35,319 Speaker 1: at our tech habits, myself included, and just how much 1065 00:54:35,360 --> 00:54:38,799 Speaker 1: time they're sucking up in our lives. That's true, and 1066 00:54:38,840 --> 00:54:41,000 Speaker 1: I think if you're just conscious of it, it does 1067 00:54:41,160 --> 00:54:43,360 Speaker 1: help you. And I think there are times when you 1068 00:54:43,400 --> 00:54:47,040 Speaker 1: can leave it at home. I know parents and people say, well, 1069 00:54:47,040 --> 00:54:49,480 Speaker 1: what if my kid calls, what if there's an emergency? 1070 00:54:49,520 --> 00:54:53,200 Speaker 1: And I feel that way too. But I also feel 1071 00:54:53,200 --> 00:54:55,640 Speaker 1: like you have to take a break from it sometimes 1072 00:54:55,800 --> 00:55:00,640 Speaker 1: and take a walk. Don't listen to our podcast. Wait, wait, 1073 00:55:00,680 --> 00:55:02,759 Speaker 1: did I just say that? But sometimes you just need 1074 00:55:02,800 --> 00:55:05,839 Speaker 1: to leave it at home, get outside, and be untethered. 1075 00:55:06,000 --> 00:55:09,239 Speaker 1: Don't have it permanently, you know, attached to your hand. 1076 00:55:09,719 --> 00:55:11,920 Speaker 1: By the way, I have a friend who has two phones, 1077 00:55:12,160 --> 00:55:15,880 Speaker 1: one for during the week, which has social media and 1078 00:55:15,960 --> 00:55:18,480 Speaker 1: all the bells and whistles of a smartphone, and one 1079 00:55:18,560 --> 00:55:21,640 Speaker 1: for the weekend, which is just texting and calling in 1080 00:55:21,719 --> 00:55:23,600 Speaker 1: case his kids need to reach him. I thought that 1081 00:55:23,640 --> 00:55:26,000 Speaker 1: was an interesting idea. Must be nice to have such 1082 00:55:26,200 --> 00:55:29,600 Speaker 1: rich friends, Brian. Meanwhile, stop checking your phone a hundred 1083 00:55:29,640 --> 00:55:33,359 Speaker 1: and fifty times a day, people. Brian, I downloaded that 1084 00:55:33,400 --> 00:55:36,799 Speaker 1: Moment app, and the worst day, I was on my 1085 00:55:36,840 --> 00:55:42,400 Speaker 1: phone for nearly nine hours. Now, in fairness, I was sick. 1086 00:55:42,840 --> 00:55:45,440 Speaker 1: I had, like, the flu, and I had nothing to 1087 00:55:45,480 --> 00:55:50,440 Speaker 1: do but lie in bed on my phone. Well, I 1088 00:55:50,480 --> 00:55:53,319 Speaker 1: was watching The Crown, if I recall, and on my 1089 00:55:53,360 --> 00:55:56,000 Speaker 1: phone. And you weren't transfixed by The Crown? I can't 1090 00:55:56,000 --> 00:55:58,399 Speaker 1: believe it. I was, I was, and it was crazy because 1091 00:55:58,440 --> 00:56:00,359 Speaker 1: I had to keep rewinding it because I'd miss things 1092 00:56:00,360 --> 00:56:02,320 Speaker 1: because I'd be on my phone. Anyway, I got a 1093 00:56:02,360 --> 00:56:04,799 Speaker 1: lot of problems, people. That does it for this week's show. 1094 00:56:05,120 --> 00:56:08,040 Speaker 1: Thanks as usual to our pod squad over at Stitcher. 1095 00:56:08,080 --> 00:56:11,319 Speaker 1: That's Gianna Palmer, Jared O'Connell, and Nora Richie. And thanks 1096 00:56:11,360 --> 00:56:13,800 Speaker 1: as well to the team over at Katie Couric Media. 1097 00:56:13,880 --> 00:56:16,880 Speaker 1: That would be Alison Bresnik, Emily Beena, and Beth Demas.
1098 00:56:17,320 --> 00:56:20,319 Speaker 1: Mark Phillips wrote our theme music, and Brian and I 1099 00:56:20,360 --> 00:56:23,680 Speaker 1: are the show's executive producers. For better or for worse, I'm 1100 00:56:23,760 --> 00:56:26,719 Speaker 1: at Katie Couric on social media. Instagram is where I 1101 00:56:26,760 --> 00:56:31,840 Speaker 1: shine, people. Brian tweets his little heart out at Goldsmith B. Meanwhile, 1102 00:56:31,880 --> 00:56:34,560 Speaker 1: don't forget to call in with your stories about discrimination 1103 00:56:34,680 --> 00:56:37,120 Speaker 1: and gender bias at work. That number again is nine 1104 00:56:37,120 --> 00:56:39,960 Speaker 1: two nine, two two, four four six three seven, or 1105 00:56:40,040 --> 00:56:44,000 Speaker 1: drop us a line at comments at couric podcast dot 1106 00:56:44,040 --> 00:56:46,600 Speaker 1: com, and by the way, if you haven't already, please 1107 00:56:46,680 --> 00:56:49,200 Speaker 1: leave us a rating over at Apple Podcasts, and be 1108 00:56:49,280 --> 00:56:52,239 Speaker 1: sure to subscribe to the show as well. Thank you 1109 00:56:52,360 --> 00:56:54,680 Speaker 1: so much for listening, and we'll talk to you next week.