Sleepwalkers is a production of iHeartRadio and Unusual Productions.

Every day I go to work, I use my car. Every time I talk to somebody who is more than a few feet away, I use my phone. So I'm using certain technologies that extend me beyond my normal physical range. Yes, so maybe I'm a cyborg.

That's Sebastian Thrun speaking. He founded Google X, and he's appeared on Sleepwalkers a few times. As far as we know, he's not a cyborg.

If somebody took all technology away from me, like refrigeration and everything, would I survive? I have no clue. Am I any good at catching deer here in my neighborhood? But yes, I would be pretty miserable.

You said something which I found quite interesting. You said, I would love to directly interface my brain to all the computers in the world, so I could be truly superhuman. I would know everything, every name, every phone number. If there was a button you could press to do that right now, would you have any hesitation pressing it?

I would press it in a microsecond. Why? I am for sure the product of data. I went to college, I read probably four or five books in that time, and I built a scientific career on the shoulders of others. That data empowered me to become who I am. If you took that data away from me and said, Sebastian, I'm going to put you into China in the seventies, you're going to work in the fields, I would be less of a Sebastian than I am today, because I would probably know how to plow a field, but that would be pretty much it.

To Sebastian's regret, there's been a hard limit on how much data he can actually consume. His hardware, his physical body, has held him back.

Unfortunately, the human I/O, the input and output with the ears and eyes and smell and so on, and voice, is still very inefficient. If I could accelerate the reading of all the books into my brain, oh my god, that would be so awesome.
We talked in the last episode about brain-computer interfaces in the medical world, helping disabled patients like Jan Scheuermann move prosthetic limbs with their minds. Today, we look at transhumanism, becoming cyborgs: what happens when we merge with computers not just to restore function, but to upgrade and enhance ourselves? And as we head into this new future, who might get left behind? I'm Oz Woloshyn, and this is Sleepwalkers.

So, Kara, Sebastian was talking about cyborgs and whether we're already cyborgs because we're so reliant on our technology, our cars, and especially our phones, which become kind of like a second brain, albeit a very distracting one.

Yeah, I think what he was saying is that if technology is required to keep us alive, that merger between humans and machines has already happened. We're already there.

That's right. But let's be real, that's not really what people think about when they hear the word cyborgs. They think about more extreme cases, like Sebastian having his brain directly connected to all the information in the world ever.

You know, there are already some people working on that. There's this guy named Elon Musk that you might have heard of, and one of his new companies is called Neuralink. What he's trying to do is create a mesh that can be inserted in the brain and connects the brain to a computer. And I think part of his goal is to make humans competitive enough to take on machines if and when they outsmart us.

Well, this idea of humans and machines merging to create something superior is of great interest throughout Silicon Valley, and one person who kind of wrote the book on this transhumanism is Yuval Noah Harari. He is a historian and a futurist and the author of Homo Deus, a book that explores the future of our humanity, and his writing is a major source of inspiration, not just in Silicon Valley but for this podcast. So I was rather excited when he agreed to have a conversation with us.
Every generation in history thinks that they are on the verge of the apocalypse, and usually they are wrong. But as a historian, I have the sense that we are on the verge of the most important revolution since the very beginning of life.

Yuval believes that right now we're actually looking at the end of history as we know it, the dawning of a new era. Something has changed, and I wanted to know what.

We are really deciphering the underlying rules of the game of life, and now acquiring the ability to change these rules. I mean, previously, the expectations of the apocalypse were directed outwards, towards the gods, towards some external entity that will come and intervene and change everything. Now we don't expect an external entity to do it. We expect ourselves to do it.

And when you look really calmly and objectively at the advances that science has been making, it doesn't sound so far-fetched. As Yuval would have it, we no longer look to the heavens for answers, because we ourselves are becoming gods, and as technology improves, we're getting better and better at finding answers by looking inwards. Literally.

One of the biggest misunderstandings about the whole AI revolution is that many people see it as a revolution coming out of computer science, but actually it comes equally from the life sciences, from biology, from brain science. It's not enough that computers are becoming smarter. It's also essential that we view humans as algorithms. If our feelings are not the product of some kind of extremely complicated algorithm in the brain, then no matter how smart computers become, many things won't happen. For something like self-driving cars to navigate a street full of pedestrians, the car must be able to understand human behavior and human feelings. And if you think that human feelings are the result of some spiritual soul, or something which can never be deciphered, then you can't have self-driving cars.
In order to create so many of the technologies we talk about on Sleepwalkers, from self-driving cars to targeted ads to parole algorithms, we have to assume that human behavior can be modeled, that our habits, routines, even our personalities could be expressed as mathematical formulas. But even if you don't take things that far, Yuval's key point about the transformative potential of combining computer science and biology is shared by people in the business of building the future. You may remember Arati Prabhakar from earlier in the season. Arati ran the Defense Advanced Research Projects Agency, DARPA.

We're living in a time in which the biological sciences are converging with information technology. At DARPA, we are always looking for these areas where we see these seeds of technological surprise, and today I think this intersection of biology with technology is one of the most fertile seedbeds for surprise.

And when these advances are coming from the ability to combine biology with computer science, it may be hard for us to predict what surprises might bloom. Yuval has argued that advances in AI and gene editing will lead to new forms of life.

We are now gaining the ability to break or bend or change the rules of life. We are about to create the first inorganic life forms after four billion years of evolution. We can't really imagine what the new entities would look like, because even our wildest dreams are still subject to natural selection and organic biochemistry.

So according to Yuval, Kara, it's rather difficult to say what might come next, because it's somehow out of our frame of reference. It's almost like saying what aliens look like. And I don't know if you ever watch movies, but in movies, aliens and cyborgs tend to look remarkably consistent.

I actually watched Men in Black last week, and it is alien central casting. You know, a lot of representations of cyborgs tend to have this element of cliche. You know, there's the pitiable robot, the robot that's healed by love.
You can think of Wall-E, C-3PO, the Tin Man from The Wizard of Oz. The Wizard of Oz was my favorite movie.

It's kind of a fantasy film, but we've actually talked about real-life people who may meet the qualification of cyborg already on this show. We spoke about Jan Scheuermann in the last episode, using a robotic arm with her mind. But the research didn't actually end there. In a subsequent experiment, Jan flew a simulated plane with her brain waves. Remember, this research was being funded by DARPA, and it raised big questions for Arati Prabhakar, who was running the agency at the time, about the ethics of transhumanism.

When Jan went from moving a prosthetic limb to controlling a flight simulator, that was the moment that it became visible that these technologies that allow you to restore function by harnessing motor control signals also mean that now we have a way for a human brain to engage with the world in a completely different modality. It was a very eye-opening moment, I think, for all of us.

What did you feel in that moment?

It's creepy, it's powerful. It makes you realize that, as hard as the technology is, it might be the easiest part of figuring out how we harness these capabilities for good in the future.

As we've said, this technology remains firmly in research labs, at least for now. Andy Schwartz is the neurobiologist at the University of Pittsburgh who led the project with Jan, and he's very clear that his mission was to restore function, not enhance it.

The idea was, can we help paralyzed people coming back from war zones regain function? So I think that was very good motivation for the kind of research we are doing.

And despite the huge potential of Andy's research, he's quick to point out that the brain-computer interface he built is much less efficient than our existing connection to the world. At present, our bodies are the best we've got.
If you wanted to control the computer with brain activity, it would be ten times slower than if you were typing with your own fingers. Things move so fast that we've gotten accustomed to the idea that if you see it once, it's just going to become pervasive. That's actually not true of a medically invasive procedure that involves putting implants on the surface of the brain. We're going to need all of that time to figure out where we want to go with these technologies.

So it's a perfect moment to ask: where do we want to go? For many in Silicon Valley, the principle of a more direct interface between humans and computers is enticing. But here's Sebastian again.

At this point, I don't think customers would love having surgery to get an implant in their brain just to be a little smarter.

But there are other ways to approach the problem. You may remember Google Glass. Sebastian led that project when he ran Google X, and in fact, when I first met him in twenty twelve, he was wearing a pair.

We built Google Glass because I wanted a camera right next to your eye, and I wanted a speaker right next to your ear, so the computer could perceive the same sensation, the same stuff you see every day. I think these technologies become closer and closer to us.

When I first met Sebastian, he was wearing these weird Google glasses, Kara. He kind of looked like a cyborg, and he told me that these glasses might solve the problem of people constantly having to look down at their phones, because the information could be at eye level.

But that really hasn't solved the problem of tech neck. Tech neck: women and men are now getting this, like, very, you know, unsavory under-neck fat from looking down at their phones.
Yeah, and then there's turtle posture, which is people hunching forward because they're sitting at their computers so much. So we don't know if these are going to have an impact on us, evolutionarily speaking, but I imagine they might. Or thumbs, if you think about it.

Well, our bodies are already being changed by our interaction with our phones.

Absolutely. Actually, really quickly, there's this fake video that was going around called Lookout, where this guy invented a product where your phone basically had the same camera it does, except it would face out, so you never had to look away from your phone. You could just look at your life through your phone, which is essentially what everybody is already doing.

Well, and that was kind of the insight behind Google Glass: what if you could overlay kind of augmented reality over what you saw? There's something very promising and cool about it, but it wasn't really viable, because you had to wear something on your face that made you look like a cyborg. For consumers, the benefits of a computer at that proximity just weren't enough to make up for the inconvenience of wearing something on their face.

There is one group of people who are willing to put up with inconvenient technology, you know, people like Jan, who had her skull opened so that she could move an arm again.

Right, and as Andy said, this wasn't about letting her fly planes with her mind. It was about restoring function. When we come back after the break, we look at a much less invasive technology that hopes to be as miraculous as Jan Scheuermann's robotic arm.

Okay, let me... do you want to test it with the tambourine or whatever? Okay.

That's Noè Socha. He's an award-winning Italian blues guitarist who lives in New York. When Noè came to play for Julian and me in the studio earlier this year, he asked that I meet him at the bus stop.

I went to Berklee College of Music in Boston to study jazz guitar performance.

And Noè's the real deal.
While he was at Berklee, he was given the Jimi Hendrix Award for being the school's leading guitarist, as well as the Billboard magazine endowed scholarship. And now he plays a consistent roster of shows around New York. But he's working to reach a digital fan base as well. Every day Noè posts a video to help build his audience. This is how I found him. What you wouldn't know just listening to this podcast is that Noè also happens to be blind.

I was born three months early, so they put me in the incubator, but there was too much oxygen.

In his infancy, Noè suffered from retinopathy of prematurity, known as ROP. He needed surgery to reattach his retinas, and now he has very limited vision.

Totally blind, like blind blind?

Technically, I guess it's called light and shades or something. I mean, I can see colors, I can read really large print only out of one eye, but my vision field is small, so I still use the cane and I still read braille.

This is why Noè asked me to pick him up at the bus stop. Getting to and from new destinations is a huge pain in the ass. But here's the thing. There are some companies that are trying to fill the gap and make smart glasses for people with visual impairments. Their hope is that, with the right technology, blindness could become a thing of the past. When we were researching for this podcast, Julian came across a new technology company.

Yeah, after a few months I had a pretty AI-focused search history, right? So my Instagram ads started to reflect that, which is convenient, you know, AI helping us make a podcast about AI.
Yeah, super considerate, right?

So an ad for eSight presented itself, and eSight bill themselves as creators of the world's most advanced sight-enhancing glasses for the visually impaired. Medical tech can be pretty expensive, but eSight recently lowered their prices from nearly ten thousand dollars for a pair of these glasses down to around six thousand.

What do they call that? A price slash? All glasses must go.

So it's still pretty steep, but eSight has payment plans, and health insurance can help people out as well.

Yeah, but even so, I mean, I'd rather spend money on guitars, you know, like, that's what I do in my life. Like, why would I spend money on something that is still so experimental?

Still, Noè was curious to test out eSight's technology, so I reached out, explained Noè's condition, and we got an appointment.

Yeah, don't do anything crazy when you put them on, don't start jumping around like a crazy person. Yes. If it doesn't work, just make it up. I'm, I'm kidding.

The eSight glasses basically look like Cyclops from X-Men, except without the blinding laser beam that shoots out from them. The glasses capture high-quality video with a camera above your nose, and then project the video to a high-definition screen in front of each eye. The footage is enhanced by software designed especially for people with vision impairment. You can also zoom, change focus, and do things like boost contrast or go to grayscale. An eSight rep named Nigel helped Noè get fitted with a pair, but it can be a little hard to get used to.

I mean, I am trying to see. Is it, like, right here where I should be looking?

Yes, it should be right in front of your pupils. Um, yeah, it's a little, it's a little high. Let me see if I can just adjust that. Yeah, there you go. That's better?

Better? Yeah. It seems like I get more details out of things, but I don't really know what I'm looking at.
Nigel also asked Noè to identify me.

Let's, let's see if you can recognize characters.

I see, I see it.

Do you know what I look like? Noè, like, do you have a sense of what I look like?

I mean, from what I saw when I was close to you, you have, like, light hair, like light brown or something like that.

That's right.

And, uh, yeah, I mean, it's cool. I just don't know, you know, we have to see how useful it would actually be. Like, I mean, I don't think I would use it to recognize people, because I don't know what anybody's face looks like anyway, you know.

Because he has been blind for his entire life, Noè doesn't have a point of reference for the details in people's faces. Imagine that you have no idea what any of the people in your life really look like. You learn to identify them in other ways, like their voices. So Noè is ambivalent about the prospect of seeing people clearly, but he would like it to be easier to get around.

Recognizing people's faces would not be one of my priorities, but reading where the bus stop is, or when it's coming, would be a priority. I mean, it's cool, it's interesting. I would just have to see how practical it would actually be, because, I mean, here's the thing: when you're outside, it's not that you have time to stop and figure stuff out, especially when you're walking on the sidewalks of New York. Like, I'd rather just use BlindSquare.

Noè first told me about BlindSquare when I picked him up from the bus. It's the most popular iOS navigation app for blind people. It describes the environment you're in, the intersection you're crossing, and announces points of interest, all using GPS. Noè's point in mentioning BlindSquare to Nigel is that it is a tool that he can work with, within his set of limitations.
Noè spent his whole life navigating the world one way, his way, and eSight would actually change that, maybe more than he would want.

I've never really seen a subway map anyway, so I would have to get the hang of how it works first, because your brain needs to recognize it and then know what it's looking at. That's hard for human beings to comprehend, I think.

Yeah, it's like imagining a new color, right?

Yeah. So, Julian, eSight worked, but it wasn't perfect for Noè.

And I think it's important to note that eSight is actually a really cool technology, whether or not it worked for him. Nigel mentioned that many elderly people who suffer from macular degeneration, losing their sight due to old age, have been able to see their grandchildren for the first time, which is really cool. I mean, it does restore sight for a lot of people who have seen before.

Noè said the sacrifices he makes to get around to his gigs as a blind person aren't so bad, but there is one part of his career that's less easy to navigate. Like many musicians, Noè uses his social media channels as his primary way of connecting to his current fans and, hopefully, reaching future ones. The problem is, it's pretty hard to reach your fans on Instagram if you're blind.

Instagram needs to listen to me right now. Basically, the issue is this: editing videos and posting the part that you want is not accessible on Instagram. You know, thankfully the Apple app is accessible, so I edit on the iPhone app, because there are, like, sliders, and it tells you, like, current position, zero seconds of, like, two minutes and thirty seconds, and basically you slide with your finger up and down. If it wasn't for that, I wouldn't be able to post on Instagram. I would only be able to post the first minute, which is not always what I want to post. On the songs, sometimes I post some other part, you know.
But Instagram hasn't implemented that in their app, because, you know, blind people aren't supposed to be on Instagram.

I've never thought about this. Digital technology is almost exclusively visual. I don't think about how easy it is for me to use an iPhone every day. I just use it.

And when you have these technologies, like iPhones, being designed by people in Silicon Valley who fit pretty squarely within the norm, it's not really doing Noè any favors. And when we focus on the next big thing in tech, we can leapfrog past uses for these existing technologies. So we say, okay, well, when you're starting the next big unicorn, we want to do these miraculous, gigantic ideas. But you don't think about subway maps. You don't think about things like BlindSquare unless you actually need that app.

And I think, we're not anti-technology or anti-unicorn. Those things are important, you know. Noè actually said something really funny, which is that, like, he would love a self-driving car. Of course he would. So we all benefit from these moonshot innovations, but it's really important, I think, that everyone thinks about who is forgotten. When we innovate too quickly, we don't all benefit in the same ways.

We began this episode debating what defines a cyborg and asking how close we are to being transhuman. These are important philosophical questions, and they may even become practical questions in our lifetime. But as we wrestle with them, it's important to remember that a huge number of people still don't have access to technologies that most of us take for granted to expand our horizons and what we can do. There are billions of people in the world who don't have smartphones, and millions of people in the US, like Noè, who can't take full advantage of them. Bryony Cole is tackling this problem head-on. After working at Microsoft, she founded The Future of Sex, and as part of her work, she produces sex tech hackathons around the world.
I think it's so important because no one's talking about it. We all got here by having sex. Sexual identity is so core to who we are, and the concentration of people innovating in this field is quite small.

So, Kara, I was a bit nervous to talk about sex in this episode, because I don't want people to just think we're being sensational for the sake of it. But it is an area where we can see the tangible consequences of what happens when one group designs for the rest, like with Noè.

I don't think it's sensational at all. It's like with Gillian in episode one, when people programming ads for Facebook do not understand different outcomes for pregnancy. They don't know that an outcome for pregnancy could be stillbirth, and that's how Gillian's ads were making her life worse, because of the people programming them, men, by and large.

By and large. Now, it takes these edge cases, like Gillian being haunted by targeted ads, for us to notice, but more often than not, we don't have a counterfactual. We can't make a comparison to a version of technology that wasn't built by Silicon Valley. And that is exactly why Bryony Cole organizes events to design new sex tech.

While hackathons sound really geeky, and typically you're going to attract just people involved in technology, we're really careful in structuring the hackathon, inviting other people in from other classes, from other ethnicities, from other genders, to allow us to build different sorts of products. What was a surprise, that we didn't design for, was the people with disabilities who showed up.

And with these new groups of people came new kinds of products.

One team came up with a voice-activated vibrator for people in wheelchairs. Another team in Singapore, a deaf man and a blind man, built a sex and intimacy education platform, about bringing a woman back to their dorm room and really not knowing how to read intimacy cues, because they'd never been taught. These people just showed up, like, thank you, we feel invisible.
Technology very quickly becomes part of our homeostasis. We take for granted the fact that we have a supercomputer as our constant companion, and that we can summon the entire corpus of human knowledge with the flick of a finger. But as we constantly focus on the next frontier, what's new, it's easy to forget that many people don't get to share in the fruits of what we've already built.

Other populations that also resonate with the hackathons are people in remote areas, or rural populations that have trouble finding a mate, and of course women, which is predominantly who turns up to these hackathons. It's people that typically don't have access to this providing input into how we build it.

Bryony's hackathons allow people who feel invisible to create products directly for themselves, people who may even be invisible on the campuses of Silicon Valley. For Bryony, access begins at the design phase, and including new voices can create new products and technological surprises that otherwise might not exist. So perhaps her hackathons could serve as a model for tech innovation more broadly.

I think where I'd most like to see it go is sort of the trickle-down effect across the world, in terms of how that technology reaches other populations, because it is so concentrated in these, like, Silicon Valley types. It's the access that I'm more excited about. We actually have incredible technology available to us, and yet the UIs right now are so clunky, you know, and we just need to figure out how we're going to use it and who we're going to invite in.

Bryony paints the picture of a world where the benefits of god-like technology spread to all people. But as we've talked about before on Sleepwalkers, there's a very real possibility that technology will be co-opted by the rich and powerful. So we have to ask: who gets to become a cyborg? And what if the new gods don't want to share their divinity?
Here's Yuval again.

One disturbing thought is that there is no us, there is no we. Different groups of humans have a very different future, and they should prepare themselves in a different way. It could lead to the creation of a new caste system, with immense differences in wealth and power, much greater than we ever saw before in history. It could even lead to speciation, Homo sapiens splitting into different species with different capabilities. So the descendants of Elon Musk will be a different species than the descendants of people who are now living in some favela in São Paulo. So it's a big question, always: what is the future of humanity? What should we do?

What is our future? What is the future of our humanity? What are we sleepwalking towards? It's tempting to look for the dramatic ways technology is changing us. Might we be able to physically merge with computers? Will computer vision help the blind see, or the marginalized experience intimacy? They're big questions, but there's one that's even more urgent. How is the AI revolution already changing the way we think about and perceive the world? What are the algorithms we interact with every day doing to us, and how are they changing our society? More on those questions with Yuval Noah Harari when we come back.

Until I was twenty-one, I didn't know I was gay. And I look back at the time when I was, I don't know, fifteen or seventeen, and I just can't understand how I missed it. I mean, it should have been so obvious. But the fact is, I didn't know an extremely important thing about myself, which an AI could have discovered within, you know, like two minutes.

For Yuval Noah Harari, AI could have made a big difference to his early life, and that got him thinking.

When we talk about AI, we tend to greatly exaggerate its potential abilities, but at the same time we also tend to exaggerate the abilities of humans.
People say that AI is not going to take over our lives, because it's very imperfect and it won't be able to know us perfectly. But what people forget is that humans often have a very poor understanding of themselves, of their desires, of their emotions, of their mental states. For AI to take over your life, it doesn't need to know you perfectly. It just needs to know you better than you know yourself. And that's not very difficult, because we often don't know the most important things about ourselves.

So let's say you could turn back the clock to being fifteen. Would you have wanted to live in a world where there were sufficient sensors to monitor your eyes, your eye movement, your breathing, you know, while you're going about your daily life, and then to interpret that and say to you, Yuval, more likely than not, you're gay?

That's a very good question, which will become a very practical question in a few years. The way that I grew up and developed, it would have been a very bad idea. I wouldn't like to receive this kind of insight from a machine. I'm not sure how I would have dealt with it when I was fifteen, you know, in Israel, in the nineteen-eighties, and maybe partly it was, you know, a defense mechanism. In the future, it depends where you live. Brunei has instituted a death penalty for gay people, at least for people engaged in homosexual sex. So if I'm a teenager in Brunei, I don't want to be told by the computer that I'm gay, because the computer will then be able to tell that to the police and to the authorities as well. Looking to the future, say ten, twenty years, the danger is, if I still don't know that I'm gay, but the government and Coca-Cola and Amazon and Google, they already know it, I'm at a huge disadvantage. So it could be something as frightening as the secret police coming and taking me to a concentration camp.
But it could also be something like Coca-Cola knowing that I'm gay. They want to sell me a new drink, and they choose the advertisement with the shirtless guy and not the advertisement with the girl in the bikini, and the next morning I go and I buy this soft drink and I don't even know why, and they have this huge advantage over me and can manipulate me in all kinds of ways.

What Yuval suggests is that once we become reducible to data, we become predictable to algorithms, and once we're predictable, we can be manipulated. We talked in the last episode about how AI is helping us decode life's fundamental mysteries: brain waves, health outcomes based on genetics, time of death. But the next frontier could be our very behavior. Here's Arati again.

We have tools to evaluate vast volumes of data that we have previously collected, or that we've always collected. We have new sources of data. Think about everything from Fitbits and those kinds of measurements that you can make on individuals, to the volume of data that people are spewing out into the online environment every day.

The implications of spewing this data onto the internet are where we began the series. And knowing that your data is being used to build an accurate model of you might give you pause about putting it out there. But it's the combination of that data with advances in AI that's allowing us to start to see into the future.

Data plus that deep knowledge now allows you to form hypotheses and to design experiments that allow us to start a journey of building better models of these complex systems. This is at the core of the revolution that we're seeing. But I think the next wave, after starting to understand biology, is about a transformation in the social sciences.

Of course, in some ways, that transformation in the social sciences is already here. It's what Yuval was talking about in terms of computers that understand us better than we understand ourselves.
And we've 578 00:33:27,960 --> 00:33:31,360 Speaker 1: already seen the real world consequences altering the course of 579 00:33:31,480 --> 00:33:36,520 Speaker 1: history with Cambridge Analytica, Brexit, and the twenty sixteen presidential election. 580 00:33:37,440 --> 00:33:40,480 Speaker 1: But Yuval believes that was just the beginning, and this 581 00:33:40,680 --> 00:33:44,120 Speaker 1: revolution in biology and computer science will shake the very 582 00:33:44,200 --> 00:33:50,320 Speaker 1: foundations of how we live. The ideology of humanism basically 583 00:33:50,400 --> 00:33:55,120 Speaker 1: says that the human experience is the ultimate source of 584 00:33:55,440 --> 00:33:59,959 Speaker 1: authority and meaning in the universe. If you look at politics, 585 00:34:00,600 --> 00:34:04,520 Speaker 1: then whereas originally political authority came from the gods, now 586 00:34:04,640 --> 00:34:08,080 Speaker 1: political authority comes from humanity. The idea is that the 587 00:34:08,200 --> 00:34:12,280 Speaker 1: voter is always right. You look at economics, there the motto, 588 00:34:12,360 --> 00:34:15,320 Speaker 1: the slogan, is the customer is always right. If the 589 00:34:15,400 --> 00:34:20,400 Speaker 1: customers want something, then, however irrational and illogical it is, 590 00:34:21,200 --> 00:34:24,920 Speaker 1: this is what the entire system is geared to provide. 591 00:34:25,520 --> 00:34:28,520 Speaker 1: And you have the same idea in ethics. Why is 592 00:34:28,560 --> 00:34:32,239 Speaker 1: it wrong to murder? Not because some God said so, 593 00:34:32,760 --> 00:34:36,759 Speaker 1: but because it hurts other people. So we already view 594 00:34:36,760 --> 00:34:40,560 Speaker 1: ourselves in this sense as kind of divine entities that 595 00:34:40,800 --> 00:34:44,560 Speaker 1: provide meaning and authority to the world. The big question 596 00:34:44,760 --> 00:34:50,520 Speaker 1: is what happens once some algorithm can decipher and manipulate 597 00:34:50,960 --> 00:34:54,200 Speaker 1: human feelings and experiences. Then they can no longer be 598 00:34:54,600 --> 00:34:57,680 Speaker 1: the source of authority if it's so easy to hack and 599 00:34:57,760 --> 00:35:00,960 Speaker 1: manipulate them. And this is part of the crisis which 600 00:35:01,000 --> 00:35:05,480 Speaker 1: we are already beginning to see today. This crisis is 601 00:35:05,560 --> 00:35:08,920 Speaker 1: the intersection, the culmination, of many of the ideas that 602 00:35:08,960 --> 00:35:12,200 Speaker 1: we've spoken about on this first season of Sleepwalkers, and 603 00:35:12,280 --> 00:35:15,239 Speaker 1: it's why it's so urgently important not to ignore the 604 00:35:15,400 --> 00:35:18,839 Speaker 1: changes going on around us and to wake up. If 605 00:35:18,920 --> 00:35:21,840 Speaker 1: we refuse to see it, if we just hold on 606 00:35:22,080 --> 00:35:28,000 Speaker 1: to this liberal belief that humans are free agents, we 607 00:35:28,160 --> 00:35:31,840 Speaker 1: have free will, nobody will ever be able to understand me, 608 00:35:32,120 --> 00:35:35,520 Speaker 1: nobody will ever be able to manipulate me. If you 609 00:35:35,680 --> 00:35:38,880 Speaker 1: really believe that, then you are not open to the 610 00:35:39,040 --> 00:35:42,960 Speaker 1: danger and you won't be able to reinvent the system 611 00:35:43,000 --> 00:35:46,279 Speaker 1: in a better way. Not everyone agrees that we've reached 612 00:35:46,320 --> 00:35:50,080 Speaker 1: the point Yuval describes.
According to Arati, there's a 613 00:35:50,120 --> 00:35:53,440 Speaker 1: world of mystery that we've barely begun to penetrate. I 614 00:35:53,560 --> 00:35:59,320 Speaker 1: think we are so far from having a full understanding 615 00:36:00,080 --> 00:36:02,680 Speaker 1: of any of these systems. The fact that we are 616 00:36:02,760 --> 00:36:08,000 Speaker 1: making this rapid, accelerated progress, I think, sometimes leads to 617 00:36:08,800 --> 00:36:11,360 Speaker 1: a hyperbolic sense that we're going to know everything and 618 00:36:11,400 --> 00:36:14,000 Speaker 1: it'll all get reduced to a bunch of algorithms. And 619 00:36:14,239 --> 00:36:16,120 Speaker 1: I don't think we're anywhere near that. And I'm not 620 00:36:16,280 --> 00:36:18,840 Speaker 1: even sure that's the endpoint, not the endpoint in 621 00:36:18,920 --> 00:36:20,719 Speaker 1: the sense that that will never happen, or there may be 622 00:36:20,719 --> 00:36:24,800 Speaker 1: an endpoint beyond that. My own view is that what 623 00:36:24,960 --> 00:36:27,799 Speaker 1: it means to be human is so much richer than 624 00:36:27,880 --> 00:36:31,440 Speaker 1: these mechanistic components that we're talking about. You know, if 625 00:36:31,480 --> 00:36:33,040 Speaker 1: you said to me, when do you think we will 626 00:36:33,080 --> 00:36:36,160 Speaker 1: fully map and be able to predict behavioral systems, my 627 00:36:36,280 --> 00:36:40,440 Speaker 1: answer might be never. The incredible depth of these systems, 628 00:36:40,880 --> 00:36:43,600 Speaker 1: how messy and organic they are, means we've got a 629 00:36:43,680 --> 00:36:46,920 Speaker 1: long way to go before we've understood everything. The uniqueness 630 00:36:46,920 --> 00:36:50,520 Speaker 1: of our humanity lives to fight another day. But as 631 00:36:50,600 --> 00:36:54,000 Speaker 1: Yuval said, for AI to profoundly influence us, it 632 00:36:54,080 --> 00:36:57,040 Speaker 1: doesn't need a perfect understanding. It just needs to know 633 00:36:57,239 --> 00:37:00,600 Speaker 1: a little bit more than we do, and that's become reality. 634 00:37:01,239 --> 00:37:05,239 Speaker 1: So what should we do on the individual level? It's 635 00:37:05,320 --> 00:37:08,080 Speaker 1: more urgent than ever to get to know yourself better, 636 00:37:08,880 --> 00:37:13,319 Speaker 1: because you have competition. Once there is somebody out there, 637 00:37:13,360 --> 00:37:16,000 Speaker 1: a system or an algorithm out there, that knows 638 00:37:16,040 --> 00:37:18,640 Speaker 1: you better than you know yourself, the game is up. 639 00:37:19,320 --> 00:37:22,960 Speaker 1: You can do something about it, not just by withholding data, 640 00:37:23,560 --> 00:37:27,600 Speaker 1: but above all by improving your own understanding of yourself. 641 00:37:28,280 --> 00:37:31,560 Speaker 1: The better you understand yourself, the more difficult it is 642 00:37:31,640 --> 00:37:36,600 Speaker 1: to manipulate you. Know thyself. Yuval references the 643 00:37:36,680 --> 00:37:40,799 Speaker 1: ancient Greek maxim, and last episode Siddhartha called our 644 00:37:40,840 --> 00:37:43,759 Speaker 1: attention to the Hippocratic oath. In the midst of the 645 00:37:43,800 --> 00:37:47,400 Speaker 1: incredible upheavals of modernity, we're being urged to remember the 646 00:37:47,480 --> 00:37:50,640 Speaker 1: wisdom of the ancients. But how do we get to 647 00:37:50,719 --> 00:37:54,440 Speaker 1: know ourselves better?
I meditate, some people go to therapy. 648 00:37:54,719 --> 00:37:57,880 Speaker 1: Whatever works for you, do it, and do it quickly. 649 00:37:58,640 --> 00:38:01,399 Speaker 1: Get to know yourself better, because this is your best 650 00:38:01,480 --> 00:38:06,160 Speaker 1: protection against being hacked. If you're an engineer, then one 651 00:38:06,239 --> 00:38:08,719 Speaker 1: of the best things you can do for humanity is 652 00:38:08,920 --> 00:38:15,759 Speaker 1: build AI sidekicks that serve individuals and not corporations or governments. 653 00:38:16,080 --> 00:38:20,200 Speaker 1: AI systems that, yes, they monitor you and they analyze you 654 00:38:20,360 --> 00:38:23,840 Speaker 1: and they hack you, but they serve you. They reveal 655 00:38:24,040 --> 00:38:27,279 Speaker 1: what they find to you, and they work like 656 00:38:27,520 --> 00:38:30,800 Speaker 1: an antivirus. Just like your computer has an antivirus, so 657 00:38:30,960 --> 00:38:34,719 Speaker 1: your brain also needs an antivirus. And 658 00:38:34,920 --> 00:38:36,600 Speaker 1: this is something that AI can do for 659 00:38:36,800 --> 00:38:41,160 Speaker 1: us, to protect us against other malevolent AIs. Of course, 660 00:38:41,280 --> 00:38:43,240 Speaker 1: one of the problems we've talked about on the series 661 00:38:43,760 --> 00:38:46,759 Speaker 1: is that the vast majority of talented engineers work for 662 00:38:46,840 --> 00:38:50,000 Speaker 1: the very corporations Yuval says we need to be protected from. 663 00:38:50,760 --> 00:38:53,719 Speaker 1: But there is something compelling about using technology as a 664 00:38:53,800 --> 00:38:58,279 Speaker 1: tool to protect ourselves from other technology. And Bryony Cole 665 00:38:58,440 --> 00:39:00,719 Speaker 1: also sees avenues for technology to help us get 666 00:39:00,719 --> 00:39:04,400 Speaker 1: to know ourselves better. We can explore in virtual worlds 667 00:39:04,719 --> 00:39:09,839 Speaker 1: without shame. It's pushing us to reveal ourselves, even things 668 00:39:09,920 --> 00:39:13,399 Speaker 1: we didn't even know about ourselves. If we think about, wow, 669 00:39:13,640 --> 00:39:16,759 Speaker 1: this dark and wacky world of sexuality that we haven't 670 00:39:16,800 --> 00:39:20,480 Speaker 1: even explored ourselves because we've been too afraid of what 671 00:39:20,680 --> 00:39:24,000 Speaker 1: we might discover, you put technology in there, where you're 672 00:39:24,040 --> 00:39:27,640 Speaker 1: suddenly able to create any world that you want, these 673 00:39:27,719 --> 00:39:31,759 Speaker 1: totally fantastical edges of our minds that we can go 674 00:39:31,960 --> 00:39:36,400 Speaker 1: to thanks to technology. And according to Bryony, this is 675 00:39:36,480 --> 00:39:40,000 Speaker 1: not just about seeking thrills. Our sexuality is actually driven 676 00:39:40,040 --> 00:39:43,120 Speaker 1: by something much more profound, the core of our humanity. 677 00:39:43,200 --> 00:39:46,120 Speaker 1: We want to connect, we want to belong, we want 678 00:39:46,120 --> 00:39:49,279 Speaker 1: to feel like we're part of something. That's sort of 679 00:39:49,360 --> 00:39:51,920 Speaker 1: like the core part of it, right down to our sexuality. 680 00:39:52,480 --> 00:39:55,280 Speaker 1: What is the future of sex? The answer has nothing 681 00:39:55,320 --> 00:39:57,880 Speaker 1: to do with technology, and it always has to do 682 00:39:58,000 --> 00:40:02,160 Speaker 1: with being human.
We can take great strides personally to 683 00:40:02,239 --> 00:40:04,680 Speaker 1: get to know ourselves better, but we also have to 684 00:40:04,760 --> 00:40:07,680 Speaker 1: recognize the limits of what we can achieve as individuals. 685 00:40:08,480 --> 00:40:11,520 Speaker 1: To create a technological future that is fair and positive, 686 00:40:11,920 --> 00:40:15,640 Speaker 1: we need governance and policy. Here's Yuval again. We need 687 00:40:15,719 --> 00:40:19,600 Speaker 1: to regulate things like the ownership of data and the 688 00:40:19,719 --> 00:40:23,760 Speaker 1: immense power, the divine powers of creation, of being able 689 00:40:23,800 --> 00:40:26,080 Speaker 1: to engineer and create life. This should be a 690 00:40:26,200 --> 00:40:31,080 Speaker 1: major political issue: who owns these kinds of abilities? 691 00:40:31,400 --> 00:40:34,600 Speaker 1: This is not something you can do by yourself. So 692 00:40:35,239 --> 00:40:39,360 Speaker 1: here my best recommendation is: join an organization. Fifty 693 00:40:39,440 --> 00:40:43,560 Speaker 1: people in an organization can do far, far more than 694 00:40:43,719 --> 00:40:50,360 Speaker 1: five hundred individual activists. So whatever cause seems to you 695 00:40:50,480 --> 00:40:55,080 Speaker 1: the most important, join a relevant organization, and do it 696 00:40:55,400 --> 00:41:01,839 Speaker 1: this week. The AI revolution isn't far off in the future, Kara. 697 00:41:02,280 --> 00:41:04,440 Speaker 1: As Yuval says, it's here with us, and we have 698 00:41:04,600 --> 00:41:07,759 Speaker 1: this personal and political responsibility to make sure that the 699 00:41:07,840 --> 00:41:10,600 Speaker 1: future is a future we want to live in. Yeah, 700 00:41:10,640 --> 00:41:12,760 Speaker 1: I think that's actually a good place to leave Season 701 00:41:12,840 --> 00:41:15,759 Speaker 1: one of Sleepwalkers. Don't, you're breaking my heart. Yeah. It's 702 00:41:15,760 --> 00:41:21,919 Speaker 1: been incredible to hear you say that, Kara. But also 703 00:41:21,960 --> 00:41:23,839 Speaker 1: it's been incredible to report on, you know, to meet 704 00:41:23,880 --> 00:41:27,439 Speaker 1: people like no Way and Gillian Brockell or Glenn Rodriguez, 705 00:41:27,800 --> 00:41:31,480 Speaker 1: and to understand how this new technology is affecting people's lives. 706 00:41:31,800 --> 00:41:34,360 Speaker 1: It's been a real privilege to hear those stories and 707 00:41:34,480 --> 00:41:36,400 Speaker 1: also to get access to some of the people who 708 00:41:36,440 --> 00:41:39,520 Speaker 1: are building the technologies we live by, people like Yasmin 709 00:41:39,600 --> 00:41:43,760 Speaker 1: Green at Google's Jigsaw, Nathaniel Gleicher, Facebook's head of cybersecurity, 710 00:41:44,239 --> 00:41:47,680 Speaker 1: and Ben Singleton, director of analytics at the NYPD. We 711 00:41:47,800 --> 00:41:50,120 Speaker 1: got to hear firsthand what these leaders in the world 712 00:41:50,200 --> 00:41:53,360 Speaker 1: of technology are thinking about and how hard they wrestle 713 00:41:53,480 --> 00:41:55,600 Speaker 1: with the ethics of their creations.
And there's so many 714 00:41:55,680 --> 00:41:58,200 Speaker 1: more areas that AI is transforming and that we're going 715 00:41:58,239 --> 00:42:00,800 Speaker 1: to look at in season two, things like money 716 00:42:01,160 --> 00:42:03,919 Speaker 1: and climate change and some of the problems that I've 717 00:42:03,960 --> 00:42:06,960 Speaker 1: noticed even since we started doing this podcast, like what 718 00:42:07,080 --> 00:42:09,480 Speaker 1: happens if your boss is an algorithm. Well, that's why 719 00:42:09,520 --> 00:42:12,440 Speaker 1: we're doing a season two of Sleepwalkers, so please join us. 720 00:42:12,680 --> 00:42:14,960 Speaker 1: And in the meantime, we'll be keeping our Instagram, Twitter, 721 00:42:15,080 --> 00:42:19,160 Speaker 1: and website updated. That's at Sleepwalkers Podcast on Instagram and 722 00:42:19,360 --> 00:42:23,040 Speaker 1: at Sleepwalkers Pod on Twitter. And if you have any stories, 723 00:42:23,120 --> 00:42:27,440 Speaker 1: suggestions, or criticisms, send us an email at Sleepwalkers Press 724 00:42:27,560 --> 00:42:30,400 Speaker 1: at iHeartMedia dot com. That's Sleepwalkers P R 725 00:42:30,480 --> 00:42:33,319 Speaker 1: E S S at iHeartMedia dot com. Well, that's 726 00:42:33,320 --> 00:42:36,319 Speaker 1: all from us. I'm Oz Woloshyn, and we'll see you next time. 727 00:42:48,800 --> 00:42:52,520 Speaker 1: Sleepwalkers is a production of iHeartRadio and Unusual Productions. 728 00:42:53,040 --> 00:42:55,920 Speaker 1: For the latest AI news, live interviews, and behind the 729 00:42:56,000 --> 00:43:00,279 Speaker 1: scenes footage, find us on Instagram at Sleepwalkers Podcast and at 730 00:43:00,320 --> 00:43:04,880 Speaker 1: sleepwalkerspodcast dot com. Special thanks this episode to the Forward, 731 00:43:05,200 --> 00:43:08,560 Speaker 1: the digital news and culture website, and Jonathan and Cody 732 00:43:08,640 --> 00:43:10,920 Speaker 1: from the SEO agency, who make it all work here in 733 00:43:10,960 --> 00:43:13,319 Speaker 1: New York City. And thanks to Noé Socha, that's 734 00:43:13,520 --> 00:43:16,000 Speaker 1: S O C H A, for his involvement in this episode. 735 00:43:16,280 --> 00:43:17,920 Speaker 1: If you'd like to hear more of Noé's music, you 736 00:43:17,960 --> 00:43:20,840 Speaker 1: can find him on Facebook, YouTube, and Instagram at Simple 737 00:43:20,840 --> 00:43:23,839 Speaker 1: Blues Boy. Thanks also to Ellen Ullman, author of Life 738 00:43:23,880 --> 00:43:26,839 Speaker 1: in Code, and Gary Marcus of Robust dot AI, who 739 00:43:26,880 --> 00:43:29,920 Speaker 1: gave generous interviews which helped shape our thinking for this series. 740 00:43:32,120 --> 00:43:35,480 Speaker 1: Sleepwalkers is hosted by me, Oz Woloshyn, and co hosted by 741 00:43:35,520 --> 00:43:38,440 Speaker 1: me, Kara Price, and produced by Julian Weller, with help 742 00:43:38,440 --> 00:43:42,080 Speaker 1: from Jacopo Penzo and Taylor Shakoyne. Mixing by Tristan McNeil 743 00:43:42,200 --> 00:43:45,440 Speaker 1: and Julian Weller. Our story editor is Matthew Riddle. Recording 744 00:43:45,480 --> 00:43:48,880 Speaker 1: assistance this episode from Joe and Luna and Sabrina Boden. 745 00:43:49,320 --> 00:43:53,520 Speaker 1: Sleepwalkers is executive produced by me, Oz Woloshyn, and Mangesh Hattikudur.
746 00:43:54,120 --> 00:43:56,120 Speaker 1: For more podcasts from iHeartRadio, visit the 747 00:43:56,239 --> 00:43:59,120 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 748 00:43:59,160 --> 00:44:01,959 Speaker 1: your favorite shows.