Sleepwalkers is a production of iHeartRadio and Unusual Productions.

So, I mean, let's just ground the conversation for a second. There's two point two billion people who use Facebook. It's about the size of Christianity. There's one point nine billion people who use YouTube. It's about the size of Islam.

That's Tristan Harris.

When I walk into that Facebook room, you know, I'm walking into a room that's designed to never make me leave, with a thousand engineers using supercomputers to calculate the perfect seductive thing to put in front of my brain.

And Tristan knows a thing or two about seduction.

I was a design ethicist at Google, and I've spent a decade understanding some of the invisible forces that shape the way that we see and make sense of the world and choose in the world.

In case you're wondering whether they teach this stuff in school: well, they do at Stanford.

I studied at this lab called the Stanford Persuasive Technology Lab that teaches engineering students the entire discipline of persuasion. Everything from, you know, how to persuade dogs with clicker training, so click click, you know you've got the food, click click, reward, all the way to casino design and slot machines, and how do you change the lighting to get people to buy things? And then, you know, supermarket design and choice architecture and putting the candy in the final aisle, because that's the thing that gets you to buy.

Tristan and his classmates used what they learned at the Persuasive Technology Lab to define how we live online.

My partners in that class, in two thousand and six at Stanford, were the founders of Instagram. Half my friends built some of these products, including the like button.

Think about that for a moment. The people who designed the apps we use the most learned how to make them as appealing as possible by borrowing the science that trains dogs and gets people hooked on gambling. And now all of that is baked into a device that's basically become an extension of our body.
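The slot-machine pattern Tristan describes has a precise name in behavioral science: a variable-ratio reinforcement schedule, where rewards arrive unpredictably rather than on every action, and notifications work on exactly that cadence. Here's a toy simulation; the payout probability is invented purely for illustration.

```python
import random

# Toy variable-ratio reward schedule: each check of the phone pays off
# unpredictably, like a pull on a slot machine. The payout probability
# below is made up for illustration.
random.seed(0)  # reproducible run

def check_phone(p_reward=0.25):
    """One phone check; True means a new like, message, or notification."""
    return random.random() < p_reward

checks = 80  # roughly the number of times a day we check our phones
rewards = sum(check_phone() for _ in range(checks))
print(f"{checks} checks, {rewards} rewards, arriving unpredictably")
```

Behavioral research going back to Skinner found that this kind of unpredictable schedule sustains a behavior far longer than a predictable one, which is why clicker training and slot machines sit on the same syllabus.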
We check our phones about eighty times a day, and that's the conservative number. And, you know, their incentive is to calculate what is the perfect, most seductive thing I can show you next: the most seductive red color for that notification, or the most seductive video that, you know, you can't help but want to watch next.

At a time when technology is changing faster than our ability to understand it, and seeping into nearly every corner of our lives, what kind of murky future are we sleepwalking into? What can we do to take back some control? And how is emerging technology changing our lives for the better? This is Sleepwalkers.

So, welcome. I'm Oz, and I'm doing this show because I'm fascinated with how we relate to the technologies that are changing our lives, whether they end up being something like Dr. Frankenstein's monster or AI-crafted seltzer water. In this episode, we look at how our technology gets into our heads, and we take on some distinctly modern monsters, from the trick to successful online dating to deterring potential terrorists with invisible technology. And I'll have some company along the way.

Hi!

That's Kara Price. She's my friend, and she hosted a show called Talk Nerdy to Me for the Huffington Post.

Um, so, I saw this article in the Times, the New York Times, about how all of these Silicon Valley execs are taking away their kids' screen time, you know, telling their kids' nannies that the nanny can't use screens. And there's actually this quote from Mark Zuckerberg's former executive assistant, this woman Athena Chavarria. She says, I am convinced the devil lives in our phones and is wreaking havoc on our children.

Is that like the devil is in the details, or is it like the devil who lives in hell?

No, I think it's the red devil who wears no clothes. That guy's in the phone.

Yeah, because you know what he does. He leads people into temptation.

That's right, that's the devil's show. That's really right. And it is. Have you ever seen a child on an iPad?
Have you ever seen me? Well, have you ever seen me? Yes. Mesmerized. Clickity clack, don't come back. Actually, Sean Parker, who was the first president of Facebook, said, God only knows what it's doing to our children's brains.

God and the devil. Here are all of these technology executives who have built basically what we use every day to do everything from being in touch with our friends, to meeting our loves, to getting from A to B. And suddenly they're saying, well, you guys go ahead, but not for my kids.

Yeah, yeah. I think it's interesting that this feels very similar to the conversation we have surrounding, you know, junk food, sugar, tobacco, alcohol: that these are things that your parents are supposed to protect you from up until a certain point, whether it's when you go to college or whatever. And now our technology use, whether it be how much we're on social media or how much time we're spending gaming on our screens, is something that parents have to regulate. The problem is, one, like, I'm too old for my parents to regulate it, so what am I going to do? And also, like, my parents use it. So it is sort of like sugar, in the sense that there are parents that tell their kids not to eat sugar who have major sugar addictions. We had those conversations in the past, and now we're starting to have them about technology. But how do we tell our kids to regulate their use of technology when we can't regulate our own? And who wants to admit that we can't regulate our own technology?

Here's Kara with a story of what happens when ad services think they know everything about you, but actually get it all wrong.

I just wrote the whole thing in like thirty minutes, just, you know, banging the keyboard, just sobbing the whole time. Dear tech companies.

Gillian is a social media power user. She uses Twitter for work at the Washington Post, and Facebook for her social calendar.
Even her wedding is on YouTube. And in late 2018, she was entering the final months of her pregnancy. She and her husband Bobby were doing the normal What to Expect When You're Expecting stuff, preparing for their newborn son to come into their lives.

I can remember, about two days before everything happened, soaking in the tub, and I was just thinking, God, this has just gone perfectly. Like, I've never been accidentally pregnant despite not always being responsible, and we had no problem getting pregnant when we wanted to, despite my being thirty-eight. So I was just like, wow, one and done. This totally worked. A couple of days later, I was feeling some pain, so I called the doctor's office and I said, well, I haven't felt him move today. And they said, okay, come in. And so I went in, and they put me on the sonogram, and I knew immediately. I was just screaming, no, no, no.

A week or so after Gillian's son died, she was being haunted by targeted advertising that assumed she had given birth to a healthy baby boy.

When I would scroll through Facebook or Instagram, I would get maternity wear ads, and so I was like, okay, I have to teach it that I'm not pregnant anymore.

Gillian did the only thing people can do with ads they don't want to see. She clicked the three dots in the corner of the ad and gave feedback.

So I would say, I don't want to see this ad. And then it would say, why? And I would say, because it's not relevant to me. Which is, like, so hard to acknowledge.

But Gillian learned the hard way that the Facebook algorithm isn't programmed for the outcome of a stillbirth.

Then when I got, when I got that Experian email, I just, I can't even, like, I just snapped. Finish registering your baby for lifelong credit tracking. I just, I was just like, you have got to be kidding me.

Gillian was angry, and she knew she had a platform. So she wrote the letter and posted it on Twitter.
Dear tech companies, I know you knew I was pregnant. You probably saw me googling maternity plaid and baby-safe crib paint, and I bet Amazon.com even told you my due date when I created that Prime registry. And silly me, I even clicked once or twice on the maternity wear ads Facebook served up. What can I say, I'm your ideal engaged user. But didn't you also see me googling baby not moving? And then the announcement posts, with keywords like heartbroken, and problem, and stillborn, and the two hundred teardrop emoticons from my friends?

Gillian's letter struck a chord. It got retweeted twenty-eight thousand times. And shortly after Gillian posted her letter to Twitter, she received a response tweet from Facebook's VP of ads, Rob Goldman. I'm so sorry for your loss, he said, and your painful experience with our products. We have a setting available that can block ads about some topics people may find painful, including parenting. It still needs improvement, but please know that we're working on it and welcome your feedback.

So I turned it off. And within a few hours, I got an ad for adoption agencies. And the next day, I got an ad for, um, father-son matching onesies. I've taken screenshots, too. Every time I get one, I just take a picture of it. Just, like, it's not working. Yeah, the adoption one, GFY. You know.

Rob Goldman's advice may have changed the ads, but it didn't solve the fundamental problem. The algorithms couldn't stop reminding Gillian of her loss.

Other people were like, well, don't be on Facebook at all, you know, don't do any of those things. And first of all, I don't think that's realistic.

Especially because, for Gillian, Facebook was also providing comfort through it all.

It was helpful to have my friends chiming in and saying, we're so sorry, you know, what can we do? And, you know, a bunch of people made an Uber Eats fund for me and my husband so we could just have food delivered.
Well, when we got home. It's like, you know, the twenty-first-century version of bringing over lasagna.

And this was all organized on Facebook?

Yeah, it was on Facebook. And ironically, it helped her connect with people going through the same thing.

A woman who used to go to my church, she had a stillbirth on Christmas Day, a few weeks after I did. And the memorial service for her baby was posted to Facebook. I wouldn't have known about it, and I wouldn't have gone, if I hadn't checked Facebook.

Gillian isn't planning to delete social media. She just wishes it could be better. She's still going to use it to keep in touch with friends and family. But whereas she was once comfortable with her wedding on YouTube, there are now some things she won't be comfortable sharing.

And I have to say, now, having this experience, you know, I knew I was being tracked. But having the tracking revealed to me in such a garish display, if we do have a living child someday, I think it's going to be actually really easy for me to just be like, no, Internet, you don't get to have that.

Even after the experience, Gillian doesn't hate Facebook. At the same time, she wouldn't want her future children tracked, a bit like the Silicon Valley execs we talked about earlier taking their own children off social media. But what if the same targeting technology that harmed Gillian could help others, and keep people who are hurting from damaging themselves and society? And what if it wasn't a broad group, like new mothers, being targeted by ad bots, but a specific group of people having their personal search results changed? I heard about a program at Alphabet, Google's parent company, trying to do just that, and I wanted to know more. We'll get there after the break.

My name is Yasmin Green. I am the director of research and development at Jigsaw. We are sitting in one of our rooms in our New York office. The name of the meeting room is Smoke Signal.

Communication technology?
Yeah, they're all named after different communication technologies.

Jigsaw is a part of Alphabet and Google that's focused on technology that addresses big security challenges. So how did Jigsaw begin?

Two decades ago, when we were designing the platforms and apps, we were not really imagining that repressive governments and criminals and terrorist groups were, like, salivating and innovating to use these platforms just as well as everyone else. Um, and now we realize that we really need to understand their goals and their activities if we want to keep people safe online.

It's Yasmin's job to look at how those repressive regimes, criminals, and terrorist groups operate, find trends, and try to counter them.

Our approach is to try to understand the human journeys, and to understand the role of technology, and see if we can build technology that stops people getting to the point that we would consider them a violent extremist or a terrorist.

So if you want to understand terrorists, where do you start? Well, to better understand extremism, Yasmin's team used some somewhat extreme methods themselves.

One of the first things we did, in two thousand and eleven, was we brought together eighty-four former extremists and survivors of terrorism. We had Islamist people who'd gone to fight in Afghanistan, or al-Shabaab from Somalia, a Nigerian Islamist militia. But we also had a former violent Israeli settler. We had former Christian militia. We got them all together in one place. Can you imagine the security concern for Google in helping us convene everyone in one place? We had snipers on the roof. We had six months of vetting for everyone. We cared about them being public and associated with an effort to bring people out of those extremist groups.

And beyond gathering all the former terrorists in one place, Yasmin's team also invited the victims of terrorism.

We had survivors. So we had people who celebrated nine eleven, and people who had lost family members on nine eleven.
We had a lady who said that she had woken up in hospital with a name band on her wrist that said gender unknown, because her body was, like, so mutilated. The point of bringing everyone together was to say, what is common in the human radicalization path, even when you're looking across ideologies? And does technology have a role to play?

In other words, Yasmin wanted to understand how people become terrorists, and what role the Internet plays. But Jigsaw's role was never just to have summits and collect information. They wanted to take action, and ISIS was the obvious place to start. In March of this year, ISIS actually lost the last of its remaining territory. But when the caliphate was at its height, thousands of foreign fighters were recruited online. They left their homes in places like Germany, England, and America for the battlefields of Syria.

This trip that we took to Iraq, where we spoke to defectors of ISIS, was really instructive for us. We had these face-to-face conversations with these young men who had left home, gone to Syria and Iraq, trained with ISIS, did their religious indoctrination with ISIS, got their postings, some of them as suicide bombers, some of them as technical drivers, some of them as night watchmen. And they had realized that it was all a lie. So they're telling us about it, and we say, if you knew the day that you left everything that you know now, would you still have gone? And they invariably said that they would still have gone.

Honestly, I'd already finished Yasmin's sentence in my head. I thought she was going to say they invariably said that they wouldn't have gone. But in fact, knowing they were facing not heroic martyrdom but bread queues, lack of medical care, splintered leadership, they still would have left the comfort of their home countries, again, places like America, England, Canada, to travel to Syria, an active war zone.
So Jigsaw needed to understand: what was ISIS doing to lure people in that was so very powerful?

We identified the recruitment narratives, and they were largely around people who thought that this was the devout, correct, religious thing to do; that this was a Muslim utopia that was going to lead to a healthier, happier life than remaining in the West; people who were interested in the military conflict. So there were several. And we generated a targeting strategy to reach these people based on their online browsing, specifically, like, their online searches.

Take a moment to absorb that. Based on online browsing history and searches, Jigsaw was able to identify potential extremists and serve them ads to push them towards alternative content. But wouldn't that raise alarm bells for the people doing the searching? Yasmin and her team had to be subtle.

If you were interested in fatwas about jihad, religious edicts about jihad, we would give you those, just not the ones that ISIS was proposing, but alternative ones. Or if you're interested to understand what life is like in the caliphate, let's show you citizen journalism of the long queues for bread, or the state of the hospitals in the caliphate. So you're still getting something that addresses your interest. Um, it would just be an alternative information source to the one that you were looking for. And it was really important to us to make sure that we were targeting, you know, and finding people who were really at risk, as opposed to just people who were interested in ISIS.

That's right: looking at browsing and search history, Jigsaw and Google can look at two different people looking up fatwas online and only serve redirecting ads to one. When you get a level more granular, you can start to set up a targeting strategy that really does differentiate mainstream interest in this group from the people who are, you know, sympathetic, potential joiners. It has turned out to be really effective.
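What Yasmin is describing, which Jigsaw has published as the Redirect Method, is at heart keyword-targeted advertising pointed at counter-narrative content. Here's a deliberately simplified sketch of that matching logic; the phrases and playlist names are invented for illustration, not Jigsaw's actual targeting criteria.

```python
# Toy sketch of redirect-style ad targeting: match a search query against
# high-risk phrases, and if it qualifies, serve an ad linking to alternative
# content that addresses the same underlying interest.
HIGH_RISK_PHRASES = {            # hypothetical examples, not a real list
    "how to join the caliphate",
    "fatwa obligating hijra",
}
MAINSTREAM_PHRASES = {           # news-style curiosity that should NOT be targeted
    "isis news today",
    "what is a fatwa",
}
REDIRECT_PLAYLISTS = {
    "caliphate life": "citizen journalism: bread queues, hospital conditions",
    "religious duty": "alternative fatwas from mainstream clerics",
}

def ad_for_query(query: str):
    q = query.lower()
    if q in MAINSTREAM_PHRASES:
        return None                      # mainstream interest: no intervention
    if q in HIGH_RISK_PHRASES:
        # Pick counter-content matching the narrative behind the query.
        topic = "caliphate life" if "caliphate" in q else "religious duty"
        return REDIRECT_PLAYLISTS[topic]
    return None

print(ad_for_query("how to join the caliphate"))  # serves counter-content
print(ad_for_query("isis news today"))            # left alone
```

The design point Yasmin emphasizes survives even in this toy version: mainstream curiosity is deliberately left untouched, and only the higher-risk queries get the counter-narrative ad.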
Since my conversation with Yasmin, a terrorist attack on two mosques in Christchurch, New Zealand, killed fifty people and focused the world's attention on the threat posed by far-right terror. I emailed with Yasmin, and she told me that she and her team are, quote, attending to the far-right threat with increased urgency.

I think it's nice that they're trying to fix things, but, you know, it's like the Anheuser-Busch family funding a study on alcohol, right?

I guess, except we kind of know alcohol is bad, whereas we can still hope that the Internet is neutral. Although it is clear the Internet has become a key method of radicalization for terrorists of all stripes. And so, for me at least, it's good to know that the technology companies are acknowledging that, and smart people like Yasmin are working on it. Although it does also mean living in a world where our main Internet search providers also edit the results. Do we want Google controlling what we know about the world, or recommending what else we should know?

Right. And then, what if Google turned that immense power to influence directly onto me, or you?

More on that after the break.

Tristan, we've been talking about Jigsaw and online manipulation. When you look at elections, and the way extremist groups are using technology to radicalize and proliferate, should tech companies be trying to offset some of the damage they're doing by intervening for good?

Yeah. Well, this is an incredibly nuanced topic, because essentially the way to see what's going on with technology today is there's a massive asymmetry. It would be easy for you and I and Yasmin to say, clearly ISIS is bad and terrorism is bad, so let's redirect when you're searching for things that look like terrorist videos. But if you suddenly then open the door and say, you're about to watch a video on climate change, and people at Google, for whatever reason, believed that climate change wasn't real, and they said, we're going to start redirecting you away from that.
Again, who would they be to say? And that's actually a critical juncture that we're at right now, because there's kind of a pluralism where we all believe and think different things, and we want to say, that's your truth and not my truth. And so we have this collapse of, well, who is the moral authority on these topics?

And after all, we're all Americans, well, not me, but all citizens of a democracy. We have a voice in our democracy. What about ISIS's use of the Internet?

I think it's important that we not try to just have a philosophy cocktail conversation and debate it. We have to actually recognize that this is having real-world consequences. But the difficult thing to reckon with is that it all comes back to attention. YouTube has a tilt, where on the one side of the spectrum you have the calm science, Carl Sagan slowly explaining, and on the other side of the spectrum you have UFOs, Bigfoot, conspiracy theories, and Crazy Town. If I'm YouTube and I want you to watch more, let's say you start at the science section. If I want you to watch more, which direction between those two am I going to steer you? I'm always going to steer you towards Crazy Town. So at scale, even in languages that the engineers don't speak, the algorithms had figured out that Crazy Town is really, really good at getting people's attention.

And we know the effect isn't just on politics. What about regular people, who get pushed into joining ISIS or taking other extreme positions?

If you start a teen girl on a dieting video, the YouTube algorithm recommends anorexia videos, because it's calculating away and figures out those are the things that are really good at keeping people of that age demographic on YouTube. And if you start someone on a September eleventh news video of the planes crashing into the towers, the recommendations are all going to be the nine eleven conspiracy theory videos.
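Tristan's point is that none of this requires intent: a recommender that greedily maximizes predicted watch time will drift toward the extreme end of whatever you started on. A deliberately crude sketch of that single-objective policy; the titles and predicted watch times are invented for illustration, not YouTube's actual model or numbers.

```python
# Crude sketch of engagement-greedy recommendation. Each candidate video
# carries a predicted watch time; the recommender always picks the maximum,
# so the more extreme video wins whenever extremity predicts longer watching.
candidates = [
    {"title": "Carl Sagan explains the cosmos", "predicted_watch_minutes": 4.0},
    {"title": "Moon landing: what NASA hides",  "predicted_watch_minutes": 9.5},
    {"title": "UFOs: the evidence they buried", "predicted_watch_minutes": 11.2},
]

def recommend_next(cands):
    # The objective is watch time alone; nothing penalizes extremity
    # or rewards accuracy.
    return max(cands, key=lambda v: v["predicted_watch_minutes"])

print(recommend_next(candidates)["title"])
# Starting from the science video, the single-objective policy steers
# toward "Crazy Town" on its own, with no engineer deciding to do so.
```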
And Alex Jones: YouTube actually recommended Alex Jones videos fifteen billion times, two billion of which were viewed. Even if only one out of a thousand of those people believed it, so, one out of a thousand people watching those two billion views, that's like printing, you know, a Church of Scientology cult about, you know, once a month, in terms of the scale. We are jacked into these systems. We are jacked into these environments that are telling us and steering us towards flows of attention and ways of seeing the world that are inevitable. And the question is, what flows of attention do we want?

And in the end, it all comes down to who's steering the ship, and what they're steering you towards. Who watches the watchmen? Yasmin argues that Jigsaw takes ethics heavily into consideration, but not all companies do. And Tristan actually has a radical idea. He believes that we need to wean ourselves, as an economy and as a society, from business models that compete for our attention to sell us things.

I think we really underestimate that this is affecting every layer of society. When you have three or four tech companies choosing what will show up in the minds of two billion people every day, you know, our minds are the source of all of our actions.

But don't we bear some responsibility for our own actions? Isn't that part of the American dream, after all? No one forces you into a casino, or to buy that bar of chocolate as you're paying at the supermarket.

Right. You know, you're the one clicking on those YouTube videos. You're the ones who are clicking on those Russian ads. You're the ones, you know, checking your phones because of those bright colors that are shining up all the time. We're just giving people what they want. You're responsible for your choices. What we have to do is flip the table around. We're not giving people what they want.
We're giving people what they can't help but watch. And once we acknowledge that, we can start to find some reasons to hope. We found, by getting Apple and Google to both launch, um, these digital well-being initiatives, which take the blue light out of your screens and do grayscale late at night, and show you ways of minimizing the time you spend on your phone, and things like that. It's a baby step in the right direction. But that happened against their own business interests, because the engineers themselves started to see it this way. Of course, this addiction economy is fueling so much more, from hacked democracy to radicalization, to mental health issues and the epidemic of loneliness. The list of problems that we need to solve is much bigger than just making our phones grayscale, and that's what has to happen next.

Happily, not everything in life that makes us feel good is bad for us, and some targeted recommendations can make us very happy indeed. So let's talk about love, because one of the ways that you might have felt AI touch your life is through online dating. Maybe the person you woke up next to this morning was matched to you by a computer algorithm. According to a study released in January by Stanford, thirty-nine percent of straight and sixty-five percent of LGBTQ couples now meet online. Tinder processes two billion swipes per day. That's a lot of data. And it feels pretty good to meet someone you like, no matter how it happens.

The LGBTQ statistic makes a lot of sense to me, because we don't have to come in contact with judgment, with hate speech, with homophobia, whatever it may be. There's a lot less of it when it's a peer-to-peer thing on your phone, hopefully. The other thing is, the heterosexual dating pool is a pool. The, in my case, lesbian dating pool is a puddle. And that puddle, at least, can be amplified on an app like Hinge, and Tinder, and whatever.
447 00:25:28,720 --> 00:25:30,600 Speaker 1: I have noticed a lot more gay couples in the 448 00:25:30,640 --> 00:25:33,840 Speaker 1: New York Times wedding section, which I read religiously. The 449 00:25:33,880 --> 00:25:36,560 Speaker 1: other thing that I notice is that a lot of 450 00:25:36,600 --> 00:25:40,400 Speaker 1: the couples, both straight and gay, meet on apps. It's 451 00:25:40,440 --> 00:25:42,679 Speaker 1: interesting to see that it's influenced the culture in that 452 00:25:42,720 --> 00:25:45,000 Speaker 1: way so much so that you know, X amount of 453 00:25:45,000 --> 00:25:48,000 Speaker 1: people per week are meeting on those apps. And actually 454 00:25:48,040 --> 00:25:50,840 Speaker 1: in the States right now, there's been quite a significant 455 00:25:50,880 --> 00:25:54,440 Speaker 1: uptick in interracial marriages. That's pretty cool and it's something 456 00:25:54,480 --> 00:25:56,520 Speaker 1: to really celebrate as well. I mean, I was on 457 00:25:56,560 --> 00:25:59,400 Speaker 1: Instagram a few days ago and I saw somebody who 458 00:25:59,480 --> 00:26:01,320 Speaker 1: uploaded it picture of their wedding cake which had the 459 00:26:01,359 --> 00:26:04,400 Speaker 1: Tinder logo on it. Um. So you know, people feel 460 00:26:04,440 --> 00:26:08,119 Speaker 1: real terms of gratitude and excitement about meeting their partner, obviously, 461 00:26:08,520 --> 00:26:11,399 Speaker 1: And one person had a successful Tinder experience is the 462 00:26:11,440 --> 00:26:14,439 Speaker 1: producer of the show Julian Weller. That's right. So I 463 00:26:14,480 --> 00:26:16,119 Speaker 1: met my girlfriend on mine I've actually been to a 464 00:26:16,160 --> 00:26:18,840 Speaker 1: couple of weddings also that they called Tinder Ella stories 465 00:26:19,480 --> 00:26:22,800 Speaker 1: if the iPhone case fits. Yeah, But you know, I 466 00:26:22,840 --> 00:26:25,119 Speaker 1: think in essence, the use of something like Tinder is 467 00:26:25,119 --> 00:26:27,600 Speaker 1: that it just makes your dating pool bigger. To Cara's point, 468 00:26:27,960 --> 00:26:30,560 Speaker 1: but there are other apps, right, there's other approaches where 469 00:26:30,840 --> 00:26:33,919 Speaker 1: people can enter lots more information in about themselves beyond 470 00:26:34,040 --> 00:26:37,440 Speaker 1: just this sort of casual way to meet. We wanted 471 00:26:37,480 --> 00:26:39,479 Speaker 1: to find out more about the data that drives who 472 00:26:39,560 --> 00:26:42,479 Speaker 1: we fall in love with, so we spoke to Helen Fisher. 473 00:26:43,760 --> 00:26:47,760 Speaker 1: I'm Dr Helen Fisher. I'm a biological anthropologist at the 474 00:26:47,920 --> 00:26:51,280 Speaker 1: Kinsey Institute. I've written six books on love, put people 475 00:26:51,320 --> 00:26:54,320 Speaker 1: in brain scanners, and actually I'm also the chief scientific 476 00:26:54,359 --> 00:26:56,840 Speaker 1: advisor to match dot com. We've been talking a lot 477 00:26:56,840 --> 00:27:00,840 Speaker 1: about computer algorithms influencing behavior in this episode Code, But 478 00:27:01,160 --> 00:27:04,600 Speaker 1: what motivates us in the first place, Well, there are 479 00:27:04,640 --> 00:27:08,000 Speaker 1: pathways and rules in the brain, not unlike a computer. 480 00:27:08,520 --> 00:27:11,040 Speaker 1: I developed a questionnaire some time ago that's now been 481 00:27:11,040 --> 00:27:14,520 Speaker 1: taken by fourteen million people in forty countries. 
Actually, and there's patterns of behavior. I mean, we walk around with algorithms in our head. I mean, the brain is constantly responding to all kinds of things, and it certainly is a series of algorithms.

So if the brain is like a collection of computer algorithms, but much more complicated, what are the inputs? What drives who we fall in love with?

There's four brain systems, and each one of them is linked with a constellation of biological traits. And they are the dopamine, serotonin, testosterone, and estrogen systems. And so I created a questionnaire to see, you know, how you express the traits in each one of these systems.

Once Helen had developed the quiz, she validated it using a brain scanner.

And then I've watched, now, fourteen million people take it.

And Helen got an insight into the brain system that makes us experience love in the first place.

When we put the people in the machine, I had expected that brain regions linked with the emotions and cognitive processes would become active, and they do. But what everybody had in common is activity in a brain region way at the base of the brain, linked with drive, with craving, with focus, with motivation, with energy.

In other words, Helen found that the ability to fall in love sits just as deep in the brain as other survival drives, things we have no control over. It lies right near the factories that orchestrate thirst and hunger. Thirst and hunger keep you alive today. Romantic love drives you to fall in love, form a pair bond, and pass your DNA into tomorrow. That brain system, the one that makes us feel that we need to eat, or need to drink, or need to be with someone, is also the one that drives addiction. And that has implications for the technologies we use every day, beyond just dating apps: not only the substance addictions, but the behavioral addictions, like gambling or food addiction.
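Mechanically, an instrument like the questionnaire Helen describes is simple to picture: a bank of agree/disagree items, each feeding one of the four trait scales. Here's a toy scoring sketch; the items, weights, and answer format are invented placeholders, not the actual Fisher Temperament Inventory.

```python
# Toy scoring of a Fisher-style temperament questionnaire: each item is
# answered 0-3 (strongly disagree to strongly agree) and contributes to
# exactly one of four trait scales. All items below are invented.
ITEM_SCALE = {
    "I do things on the spur of the moment": "dopamine",
    "Consistent routines keep life orderly": "serotonin",
    "I am more analytical than most people": "testosterone",
    "I vividly imagine how others feel":     "estrogen",
}

def score(answers):
    """answers: {item_text: 0..3}; returns a total per brain-system scale."""
    totals = {"dopamine": 0, "serotonin": 0, "testosterone": 0, "estrogen": 0}
    for item, value in answers.items():
        totals[ITEM_SCALE[item]] += value
    return totals

print(score({
    "I do things on the spur of the moment": 3,
    "Consistent routines keep life orderly": 1,
    "I am more analytical than most people": 2,
    "I vividly imagine how others feel":     2,
}))
```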
That brain region became activated not only among people who are madly and happily in love, but also among people who were rejected in love, and even in people who are in love long-term. And it is linked with the addiction centers in the brain. And perhaps, at some point, we're going to come to understand a much broader view of the word addiction.

This brings us right back to what Tristan was saying. Our apps and smart devices are hijacking some of the deepest and most powerful systems in our brain. The truth is that Tinder, apps, social media validation, they all generate the same feelings as romantic love. So of course we're prone to be addicted. We want to look at somebody, to hear somebody, to have somebody respond to us.

If you're an alien who came to Earth and you looked at the way humans touch their phone, always in their hand, caressing it, reaching for it with a panicked look in their eye, responding to it with a smile, you would say the same thing is happening. Well, it's very different, but it may well be using some of the same brain systems. I mean, you know, when you feel fear, you feel fear.

As Helen found, love is fundamentally about the survival of our species, so it's handled by the same part of our brain as hunger, thirst, and addiction. And it's those very brain centers that Twitter and Facebook and Instagram appeal to, using behavioral science to keep us engaged, to keep us sharing. So when twenty-eight thousand Twitter users retweeted Gillian's open letter to the technology companies, they did it because, on a very deep level, it triggered a survival response. And in turn, that feedback loop is addictive.

You know, I did a study with Match, and I asked the singles in America, do you feel that these machines are addictive? And something like eighty percent said yes. And people said that they would like to go back to dating without any of them.

And knowing how tempting it can be to keep swiping, keep searching,
Helen has some advice for those of us looking for love.

It's very well known in this community that the more choices you have, the less likely you are to choose anybody. So one of the things that I say to people is, after you meet nine people, stop, and get to know at least one of those people more. Because all the data show that the more you get to know somebody, the more you like them, and the more that you think that they're like you. You know, for millions of years, we weren't doing this over the Internet, or even the telephone. We were doing this in person. I mean, that's the way people met, and the brain is built to meet in person.

Helen Fisher still believes in love. She just doesn't want you to sleepwalk into endlessly swiping, endless dates, spurred by an addiction to see what's next, what's possible, what's right around the corner.

We know we're at a dangerous crossroads, and that we're susceptible to manipulation, and companies know they can manipulate us, for good and evil, with technology that touches us at our evolutionary roots. But the future of those technologies isn't inevitable, and we still have the power in our hands to decide what to allow into our lives. Because the ways we decide to live with new technologies could have as much impact on our lives as the Constitution or the New Deal. What we decide to do next, and how we do it, will have consequences. And on this series, we'll be interviewing some of the world's greatest thinkers, the people who are changing the future of our food, and the researchers helping disabled people control machines with their minds. Together, we'll see what we can do to pry our eyes open from this dangerous sleepwalk. I'm Oz Woloshyn. See you next time.

Sleepwalkers is a production of iHeartRadio and Unusual Productions. There's so much we don't have time for in our episodes, but that we'd love to share with you.
So, for the latest AI news, live interviews, and behind-the-scenes footage, find us on Instagram at @sleepwalkerspodcast, or at sleepwalkerspodcast.com. Sleepwalkers is hosted by me, Oz Woloshyn, and co-hosted by me, Kara Price. We're produced by Julian Weller, with audio editing by Jacopo Penzo and Taylor Chicoine, and mixing by Tristan McNeil. Our story editor is Matthew Riddle. Recording assistance this episode from Chris Hambrick and Jackson Bierfeld. Sleepwalkers is executive produced by me, Oz Woloshyn, and Mangesh Hattikudur. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.