1 00:00:04,760 --> 00:00:08,480 Speaker 1: Sleepwalkers is a production of I Heart Radio and Unusual Productions. 2 00:00:11,280 --> 00:00:14,040 Speaker 1: We can choose to have a poker face, but the 3 00:00:14,080 --> 00:00:17,560 Speaker 1: point is that our bodies are still reacting, and what's 4 00:00:17,680 --> 00:00:22,040 Speaker 1: changed is the ability to see those signals. That's Poppy Crum, 5 00:00:22,160 --> 00:00:26,040 Speaker 1: chief scientist at Dolby Labs and a professor at Stanford University. 6 00:00:26,520 --> 00:00:30,000 Speaker 1: Her work is at the forefront of neuroscience and data science, 7 00:00:30,120 --> 00:00:33,960 Speaker 1: and it's bad news for the poker face. Free Solo 8 00:00:34,040 --> 00:00:37,320 Speaker 1: just won an Academy Award, and the filmmakers were 9 00:00:37,760 --> 00:00:41,520 Speaker 1: at my company doing a screening, and we had 10 00:00:41,640 --> 00:00:45,200 Speaker 1: captured carbon dioxide of the audience, with their approval, of course, 11 00:00:45,560 --> 00:00:47,840 Speaker 1: and I wasn't actually at the screening. I just saw 12 00:00:47,880 --> 00:00:50,800 Speaker 1: the CO2 capture and I knew exactly where the climbs were, 13 00:00:50,840 --> 00:00:53,920 Speaker 1: where he abandoned his climbs. As the audience watched Alex 14 00:00:53,960 --> 00:00:57,640 Speaker 1: Honnold attempt to climb El Capitan, their bodies responded to 15 00:00:57,680 --> 00:01:01,200 Speaker 1: the suspense, their breathing changed, and thanks to the carbon 16 00:01:01,240 --> 00:01:04,319 Speaker 1: dioxide sensors in her theater, Poppy had a map of 17 00:01:04,360 --> 00:01:09,120 Speaker 1: the audience's emotional experience. It's this power of the audience 18 00:01:09,200 --> 00:01:12,959 Speaker 1: on that journey and experiencing it with the filmmakers, and 19 00:01:13,560 --> 00:01:17,960 Speaker 1: it's pretty exciting to see that engagement in the theater 20 00:01:18,080 --> 00:01:21,679 Speaker 1: and have that history. But our breath isn't our only 21 00:01:21,760 --> 00:01:25,360 Speaker 1: tell. There's an increasing number of ways machines are becoming 22 00:01:25,400 --> 00:01:28,840 Speaker 1: able to read us, even how hard we're thinking. With 23 00:01:28,959 --> 00:01:32,480 Speaker 1: thermal cameras, you can track, you can look at dynamics 24 00:01:32,560 --> 00:01:36,680 Speaker 1: of blood flow to know stress levels and engagement. Just 25 00:01:36,800 --> 00:01:40,319 Speaker 1: in the infrared signatures, you can understand cognitive load. You 26 00:01:40,360 --> 00:01:43,720 Speaker 1: can then look at micro expressions with facial recognition to 27 00:01:44,200 --> 00:01:47,120 Speaker 1: get past not just if I'm feigning emotion, but really 28 00:01:47,120 --> 00:01:49,920 Speaker 1: the authenticity of what I'm experiencing. That gives us a 29 00:01:49,960 --> 00:01:52,200 Speaker 1: lot of insight about how hard my brain is working 30 00:01:52,720 --> 00:01:55,720 Speaker 1: and how engaged I am. We haven't changed as humans. 31 00:01:55,800 --> 00:01:59,320 Speaker 1: What's changed is the ubiquity of sensors and the capacity of sensors. 32 00:01:59,320 --> 00:02:03,000 Speaker 1: The cost, just fifteen years ago, the cost of 33 00:02:03,040 --> 00:02:07,600 Speaker 1: a typical device would be about, maybe...
Now you're looking 34 00:02:07,720 --> 00:02:10,280 Speaker 1: at those devices not even having to be close up, 35 00:02:10,400 --> 00:02:14,280 Speaker 1: for pennies or dollars, integrated into every pair of smart glasses 36 00:02:14,400 --> 00:02:19,680 Speaker 1: going forward. We're on the cusp of two explosions: the 37 00:02:19,760 --> 00:02:22,919 Speaker 1: power of machine learning to find patterns and make predictions, 38 00:02:23,280 --> 00:02:28,360 Speaker 1: and simultaneously the miniaturization and affordability of cameras and other sensors. 39 00:02:29,160 --> 00:02:33,200 Speaker 1: Last episode we talked about facial recognition and surveillance by governments. 40 00:02:33,560 --> 00:02:37,160 Speaker 1: But when machines can track how we're feeling, our most private 41 00:02:37,240 --> 00:02:41,400 Speaker 1: selves become readable, and while that may sound frightening, it 42 00:02:41,480 --> 00:02:45,120 Speaker 1: also holds enormous promise for many parts of life, from 43 00:02:45,200 --> 00:02:49,200 Speaker 1: beginning to end. This episode, we look at what's changing 44 00:02:49,400 --> 00:03:06,320 Speaker 1: and what's possible. I'm Oz Woloshyn. This is Sleepwalkers. 45 00:03:06,480 --> 00:03:11,440 Speaker 1: So there are quite a lot of situations 46 00:03:11,520 --> 00:03:14,320 Speaker 1: where personally, I don't actually want to be read. I'm 47 00:03:14,320 --> 00:03:17,200 Speaker 1: not sure about you, like, I wanna hold 'em 48 00:03:17,280 --> 00:03:20,200 Speaker 1: like they do in Texas, please. Like when I'm playing Texas 49 00:03:20,200 --> 00:03:22,600 Speaker 1: Hold'em, I don't want people to know what I'm thinking, right, 50 00:03:22,680 --> 00:03:24,880 Speaker 1: and not just at the poker table. In fact, our 51 00:03:24,880 --> 00:03:27,200 Speaker 1: society kind of relies on the idea that we can 52 00:03:27,560 --> 00:03:31,040 Speaker 1: look one way but feel another. Obviously, in plays like Hamlet, 53 00:03:31,200 --> 00:03:36,240 Speaker 1: interiority is dramatized, but more broadly, societies where 54 00:03:36,240 --> 00:03:38,480 Speaker 1: people have no privacy tend to be a bit scary. 55 00:03:38,600 --> 00:03:41,080 Speaker 1: It is scary because the last thing we have on 56 00:03:41,200 --> 00:03:43,560 Speaker 1: earth is our privacy. You know, it's like people have 57 00:03:43,600 --> 00:03:45,920 Speaker 1: this impulse to share everything on Instagram and give away 58 00:03:45,920 --> 00:03:49,040 Speaker 1: their name to a company that wants to sell them jewelry, 59 00:03:49,400 --> 00:03:53,040 Speaker 1: and it's just like, truly, our deepest secrets are 60 00:03:53,040 --> 00:03:56,440 Speaker 1: the last thing we have, well, the last thing we had. 61 00:03:56,760 --> 00:03:58,720 Speaker 1: Right. Poppy did a full TED talk on this, and 62 00:03:58,760 --> 00:04:00,760 Speaker 1: the thing that I took away from it is, will 63 00:04:00,800 --> 00:04:02,800 Speaker 1: we live in a near future where a slasher film 64 00:04:02,840 --> 00:04:05,880 Speaker 1: will be edited with people's biometric data in mind? Poppy 65 00:04:05,880 --> 00:04:08,160 Speaker 1: certainly sees that on the horizon, and she has a 66 00:04:08,280 --> 00:04:12,440 Speaker 1: term for technology that starts to understand us.
Empathetic technology 67 00:04:12,480 --> 00:04:15,120 Speaker 1: is the idea that, you know, it's not technology 68 00:04:15,120 --> 00:04:18,159 Speaker 1: that empathizes with me or technology that is trying to 69 00:04:18,200 --> 00:04:21,359 Speaker 1: emulate human empathy. It's technology that makes use of my 70 00:04:21,440 --> 00:04:24,880 Speaker 1: internal experience to be able to integrate that as part 71 00:04:24,880 --> 00:04:28,680 Speaker 1: of its interface. Today, it's impressive that Poppy can understand 72 00:04:28,720 --> 00:04:32,359 Speaker 1: an audience's emotional journey watching Free Solo by tracking the 73 00:04:32,440 --> 00:04:35,200 Speaker 1: levels of CO2 in their breath. But tomorrow it 74 00:04:35,240 --> 00:04:38,360 Speaker 1: could lead to new kinds of art and entertainment that 75 00:04:38,440 --> 00:04:42,520 Speaker 1: respond to us. A really great hip hop producer I was 76 00:04:42,600 --> 00:04:46,080 Speaker 1: talking to wants to create music that is personalized, almost 77 00:04:46,080 --> 00:04:48,520 Speaker 1: like a tailored suit for individuals. So you start to 78 00:04:48,520 --> 00:04:52,800 Speaker 1: think about a very dynamic integration of the human experience. 79 00:04:53,120 --> 00:04:56,279 Speaker 1: That experience becomes something that our technology can be aware 80 00:04:56,279 --> 00:04:59,480 Speaker 1: of and optimized for. I want my technology to make 81 00:04:59,520 --> 00:05:01,719 Speaker 1: the right decisions so that the experience I have 82 00:05:01,839 --> 00:05:06,480 Speaker 1: with it is seamless. Seamless. Such a seductive word they 83 00:05:06,560 --> 00:05:10,160 Speaker 1: named a food delivery service after it, but a dangerous 84 00:05:10,200 --> 00:05:13,720 Speaker 1: word too, because for technology to read and respond to 85 00:05:13,800 --> 00:05:16,880 Speaker 1: us in real time, it needs to make decisions about 86 00:05:16,960 --> 00:05:20,080 Speaker 1: us on its own. You may remember last episode we 87 00:05:20,120 --> 00:05:22,880 Speaker 1: spoke with Lisa Talia Moretti about some of the risks 88 00:05:22,880 --> 00:05:26,400 Speaker 1: of facial recognition technology, but that's not her only area 89 00:05:26,400 --> 00:05:30,279 Speaker 1: of research. Something that I was looking into really recently 90 00:05:30,560 --> 00:05:36,120 Speaker 1: is how our relationship with technology is completely shifting. So we're 91 00:05:36,160 --> 00:05:39,520 Speaker 1: moving from a relationship with technology where we are asking 92 00:05:39,520 --> 00:05:41,960 Speaker 1: it to do something, you know, it's a pure sort 93 00:05:42,000 --> 00:05:46,160 Speaker 1: of input output. And what we're now moving towards is 94 00:05:46,200 --> 00:05:50,680 Speaker 1: a relationship with technology where we are trusting technology to 95 00:05:50,760 --> 00:05:55,080 Speaker 1: make decisions on our behalf. Lisa teaches at Goldsmiths in 96 00:05:55,160 --> 00:05:58,760 Speaker 1: London and Cardiff University. She told us about how her 97 00:05:58,800 --> 00:06:02,640 Speaker 1: students have encountered technology making its own decisions about their 98 00:06:02,680 --> 00:06:05,760 Speaker 1: future prospects. One of the things that the students are 99 00:06:05,800 --> 00:06:10,120 Speaker 1: starting to do is to game the algorithms that are 100 00:06:10,160 --> 00:06:15,680 Speaker 1: being used to mine through candidates' CVs.
And so what 101 00:06:15,720 --> 00:06:18,720 Speaker 1: they figured out is if they put, right, in white 102 00:06:18,760 --> 00:06:24,560 Speaker 1: text anywhere on their CV, um, Cambridge, Harvard, Oxford, they're 103 00:06:24,560 --> 00:06:27,279 Speaker 1: more likely to get through to the interview process. The 104 00:06:27,360 --> 00:06:32,080 Speaker 1: students know that recruiting algorithms prioritize applications from certain schools, 105 00:06:32,360 --> 00:06:34,880 Speaker 1: so they pepper their applications with words they know the 106 00:06:34,960 --> 00:06:38,119 Speaker 1: algorithm will like, but written in white, so the human 107 00:06:38,160 --> 00:06:41,800 Speaker 1: recruiters are none the wiser. They're marketing themselves straight to 108 00:06:41,839 --> 00:06:46,360 Speaker 1: the AI. They're gaming the algorithmic system, which I think 109 00:06:46,400 --> 00:06:51,479 Speaker 1: is pretty genius. There's also certain things where students or 110 00:06:51,600 --> 00:06:55,159 Speaker 1: candidates are having to conduct their first interview at 111 00:06:55,200 --> 00:06:58,359 Speaker 1: some companies purely online, and there's no person on the 112 00:06:58,360 --> 00:07:02,480 Speaker 1: other side. You're essentially talking into your webcam, and there's 113 00:07:02,520 --> 00:07:08,600 Speaker 1: algorithmic technology that is recording your voice, that's listening for intonation, 114 00:07:08,880 --> 00:07:11,440 Speaker 1: that's listening for the types of words that you're saying, 115 00:07:11,520 --> 00:07:14,640 Speaker 1: like if you use smart words, or your language isn't 116 00:07:14,720 --> 00:07:17,360 Speaker 1: perhaps at a level that they would think is appropriate 117 00:07:17,360 --> 00:07:22,280 Speaker 1: for business. But do we want computers to deny opportunities 118 00:07:22,280 --> 00:07:25,240 Speaker 1: to job applicants who may be qualified but not fully 119 00:07:25,280 --> 00:07:29,720 Speaker 1: polished, without human review? And the algorithms weren't only analyzing 120 00:07:29,720 --> 00:07:33,560 Speaker 1: the students' words. They're also looking at your facial features, 121 00:07:33,600 --> 00:07:36,600 Speaker 1: and so they can say if you're nervous or shy. 122 00:07:36,760 --> 00:07:39,440 Speaker 1: And some of my students have said that if they 123 00:07:39,760 --> 00:07:43,040 Speaker 1: very quickly use like hand gestures, they confuse the camera, 124 00:07:43,080 --> 00:07:45,520 Speaker 1: and the camera can't see if they were nervous or shy 125 00:07:45,720 --> 00:07:49,320 Speaker 1: for those particular moments. There's something quite hopeful about these 126 00:07:49,360 --> 00:07:52,760 Speaker 1: students subverting the algorithms designed to read them. It's not 127 00:07:52,800 --> 00:07:54,680 Speaker 1: quite the Summer of '68, Kara, but the 128 00:07:54,720 --> 00:07:57,320 Speaker 1: youth have still got something. Well, it's true that you 129 00:07:57,400 --> 00:07:59,840 Speaker 1: and I both look very good in black leather and 130 00:08:00,160 --> 00:08:05,800 Speaker 1: are considered cyberpunks. That's how we met, at that cyberpunk rally. 131 00:08:05,880 --> 00:08:10,560 Speaker 1: I was reading about this Kickstarter campaign called Reflectacles, like spectacles, 132 00:08:10,560 --> 00:08:15,000 Speaker 1: but reflective, that's right. And that's because they reflect visible 133 00:08:15,040 --> 00:08:17,960 Speaker 1: and infrared light
when you look back into a camera 134 00:08:18,040 --> 00:08:20,640 Speaker 1: that's watching you, which is like the ultimate, that's like 135 00:08:20,680 --> 00:08:23,480 Speaker 1: a techy middle finger, like I'm gonna look right back 136 00:08:23,480 --> 00:08:25,960 Speaker 1: in this camera. It's just gonna, like, bounce light back 137 00:08:26,000 --> 00:08:29,520 Speaker 1: at it completely. So you know, there are methods, obviously, 138 00:08:29,720 --> 00:08:33,720 Speaker 1: for resistance. But what I'm worried about is that algorithms 139 00:08:33,800 --> 00:08:37,160 Speaker 1: are very smart and they will wise up and be 140 00:08:37,240 --> 00:08:39,240 Speaker 1: harder to trick. Right, and showing up for your job 141 00:08:39,240 --> 00:08:42,720 Speaker 1: interview in Reflectacles probably carries its own burden as well. 142 00:08:42,840 --> 00:08:45,040 Speaker 1: It is how I got this job, though. But we 143 00:08:45,040 --> 00:08:48,640 Speaker 1: should also ask ourselves why these companies are using AI 144 00:08:48,679 --> 00:08:52,280 Speaker 1: to filter candidates and conduct interviews. And of course it's 145 00:08:52,320 --> 00:08:55,839 Speaker 1: really about saving money and saving resources, which brings up 146 00:08:55,840 --> 00:08:58,240 Speaker 1: the big question of the series: who benefits from this 147 00:08:58,320 --> 00:09:03,240 Speaker 1: new technology? Amazon and Facebook and Google. When we come back, 148 00:09:03,440 --> 00:09:05,360 Speaker 1: we look at the economics of giving up our data 149 00:09:05,559 --> 00:09:15,319 Speaker 1: and what we get in return. Sensors and AI to 150 00:09:15,400 --> 00:09:18,040 Speaker 1: analyze our response to movies or decide if we're a 151 00:09:18,080 --> 00:09:20,600 Speaker 1: good fit for a job may sound like the stuff 152 00:09:20,679 --> 00:09:25,319 Speaker 1: of dystopian science fiction, and that's because it is. 153 00:09:25,400 --> 00:09:29,400 Speaker 1: I'm placing you under 154 00:09:29,480 --> 00:09:31,199 Speaker 1: arrest for the future murder of Sarah Marks. 155 00:09:31,200 --> 00:09:35,640 Speaker 1: Yeah, oh, gosh. Well, it's interesting you 156 00:09:35,679 --> 00:09:42,120 Speaker 1: mentioned Minority Report because, um, to this day, so many 157 00:09:42,200 --> 00:09:45,400 Speaker 1: years later, decades later, I'll be in some meeting in 158 00:09:45,440 --> 00:09:48,520 Speaker 1: Silicon Valley and we'll be looking at some gadget and 159 00:09:49,600 --> 00:09:51,520 Speaker 1: so I say, Wow, this gadget is great. It's like 160 00:09:51,559 --> 00:09:54,720 Speaker 1: from Minority Report. It's so cool. And I'm like, that 161 00:09:54,840 --> 00:09:58,120 Speaker 1: was supposed to be cautionary. That was a description of 162 00:09:58,160 --> 00:10:00,480 Speaker 1: the bad world. That was what we want to avoid. Oh, 163 00:10:00,480 --> 00:10:04,360 Speaker 1: for God's sake. That's Jaron Lanier. He's a research scientist 164 00:10:04,440 --> 00:10:06,920 Speaker 1: at Microsoft, and in the eighties he coined the term 165 00:10:07,160 --> 00:10:11,160 Speaker 1: virtual reality after helping invent the field. Jaron's thought a 166 00:10:11,160 --> 00:10:13,839 Speaker 1: lot about what our relationships with technology mean for us, 167 00:10:14,120 --> 00:10:17,560 Speaker 1: so when Steven Spielberg was making Minority Report, he called 168 00:10:17,600 --> 00:10:20,760 Speaker 1: on Jaron to act as a technology consultant.
Mostly, what 169 00:10:20,840 --> 00:10:24,520 Speaker 1: I've taken from Minority Report is that just trying to 170 00:10:24,559 --> 00:10:30,800 Speaker 1: do cautionary portrayals of technology actually backfires, because there's some 171 00:10:30,840 --> 00:10:33,480 Speaker 1: way that it's a little bit like when you show 172 00:10:33,520 --> 00:10:36,240 Speaker 1: the lives of billionaires, people don't get angry about, like, 173 00:10:36,240 --> 00:10:38,400 Speaker 1: why do those people monopolize, or why they own whole 174 00:10:38,400 --> 00:10:41,520 Speaker 1: islands or something. Instead they say, oh, I identify with 175 00:10:41,520 --> 00:10:43,600 Speaker 1: that person. Maybe I could own a whole island someday. 176 00:10:43,920 --> 00:10:47,959 Speaker 1: And despite our fascination with dystopian fiction, we also have 177 00:10:48,000 --> 00:10:51,680 Speaker 1: a tendency to fantasize about ourselves as the beneficiaries, not 178 00:10:51,800 --> 00:10:54,960 Speaker 1: the victims, of the systems we create, and we tend 179 00:10:55,000 --> 00:10:58,800 Speaker 1: to ascribe to those systems their own will, even though we've 180 00:10:58,800 --> 00:11:02,360 Speaker 1: made them. Early in the history of capitalism, Adam Smith 181 00:11:02,800 --> 00:11:06,199 Speaker 1: suggested that capitalism or markets were an invisible hand, as 182 00:11:06,200 --> 00:11:08,720 Speaker 1: sort of a life form. And in the same way 183 00:11:08,760 --> 00:11:12,200 Speaker 1: that you can interpret a market as being this living 184 00:11:12,320 --> 00:11:15,280 Speaker 1: thing just because it's a little beyond our understanding, it's 185 00:11:15,280 --> 00:11:19,119 Speaker 1: a little too complicated to fully predict and fully understand, 186 00:11:19,240 --> 00:11:21,920 Speaker 1: and that's actually its power. In the same way, 187 00:11:21,920 --> 00:11:24,880 Speaker 1: big computational systems can be a little out of control, 188 00:11:24,920 --> 00:11:26,760 Speaker 1: not entirely, but even if they're only a little bit, 189 00:11:26,800 --> 00:11:30,040 Speaker 1: you can interpret that as being the new invisible hand, 190 00:11:30,040 --> 00:11:34,120 Speaker 1: which we call artificial intelligence. Invoking an external force like 191 00:11:34,200 --> 00:11:38,479 Speaker 1: the invisible hand, or an algorithm that automatically reads resumes 192 00:11:38,640 --> 00:11:43,560 Speaker 1: or makes parole recommendations, obscures real human decisions. We have 193 00:11:43,640 --> 00:11:46,760 Speaker 1: to remember that our creations reflect us. If you use 194 00:11:46,840 --> 00:11:50,040 Speaker 1: that to abdicate your responsibility, if you use it just 195 00:11:50,080 --> 00:11:52,400 Speaker 1: to cower in fear, then you're not being a good 196 00:11:52,400 --> 00:11:56,040 Speaker 1: computer scientist. That is not the responsible way to do things. 197 00:11:56,080 --> 00:11:59,199 Speaker 1: Just as if an economist says, well, the invisible hand 198 00:11:59,240 --> 00:12:03,040 Speaker 1: says all these people should starve, that's not a responsible economist. 199 00:12:03,080 --> 00:12:07,040 Speaker 1: The responsible economist fixes the problem. In a sense, I 200 00:12:07,040 --> 00:12:11,520 Speaker 1: think it's very hard to be effective if you believe 201 00:12:11,640 --> 00:12:14,719 Speaker 1: in some kind of magical agency in your own inventions. 202 00:12:14,760 --> 00:12:16,720 Speaker 1: I think you make yourself into an idiot.
And 203 00:12:16,800 --> 00:12:20,360 Speaker 1: so I'm really concerned that not only economists but computer 204 00:12:20,440 --> 00:12:24,200 Speaker 1: scientists make that error all the time. It's almost like 205 00:12:24,240 --> 00:12:27,480 Speaker 1: a new form of mythology. I've been calling it alchemy lately. 206 00:12:27,520 --> 00:12:30,920 Speaker 1: But yeah, sure, it's certainly easier to say, oh, we 207 00:12:30,960 --> 00:12:34,160 Speaker 1: should respect this amazing autonomous living thing that has arisen 208 00:12:34,160 --> 00:12:36,719 Speaker 1: in our own inventions. It's much easier to say that 209 00:12:36,720 --> 00:12:39,920 Speaker 1: when it's benefiting you and you're getting very rich. Jaron 210 00:12:40,000 --> 00:12:42,800 Speaker 1: puts his finger on a central irony in our relationship 211 00:12:42,840 --> 00:12:46,760 Speaker 1: with technology. When our creations benefit us, we're quick to 212 00:12:46,800 --> 00:12:51,440 Speaker 1: forget who pays the price. People who translate between natural languages, 213 00:12:51,480 --> 00:12:54,920 Speaker 1: such as between English and Spanish, have seen their career 214 00:12:54,960 --> 00:12:59,599 Speaker 1: prospects on the whole decrease tenfold since the arrival of 215 00:12:59,640 --> 00:13:02,679 Speaker 1: automatic translation, which is offered for free by companies 216 00:13:02,679 --> 00:13:06,960 Speaker 1: like Google and Microsoft. Now, the thing is, you might say, well, 217 00:13:07,080 --> 00:13:09,560 Speaker 1: this is very sad, but it always happens. 218 00:13:09,679 --> 00:13:14,160 Speaker 1: People's jobs become obsolete when new technologies come along. The 219 00:13:14,200 --> 00:13:17,560 Speaker 1: buggy whip goes away and the motor car comes along. All right, 220 00:13:17,960 --> 00:13:20,600 Speaker 1: but the problem is that every single day, those of 221 00:13:20,679 --> 00:13:24,480 Speaker 1: us who help run these free services have to scrape 222 00:13:24,559 --> 00:13:27,439 Speaker 1: or steal tens of millions of example phrases from all 223 00:13:27,440 --> 00:13:29,960 Speaker 1: over the world from people who don't know it's being 224 00:13:29,960 --> 00:13:32,600 Speaker 1: done to them. And the reason why is every single day 225 00:13:32,640 --> 00:13:35,439 Speaker 1: there's new pop culture and slang and public events and 226 00:13:35,920 --> 00:13:38,240 Speaker 1: memes and on and on, and so you need to 227 00:13:38,320 --> 00:13:41,720 Speaker 1: constantly get new phrase examples to feed into the translation engines. 228 00:13:42,160 --> 00:13:44,160 Speaker 1: So it's a weird thing. We're telling the people, you 229 00:13:44,200 --> 00:13:46,360 Speaker 1: don't get a job anymore because you're not needed. Oh, 230 00:13:46,400 --> 00:13:48,400 Speaker 1: by the way, you're needed. We need to steal from you. 231 00:13:48,440 --> 00:13:50,120 Speaker 1: Oh, but by the way, we won't even tell you. 232 00:13:50,360 --> 00:13:52,360 Speaker 1: And it's all based on this lie that we don't 233 00:13:52,360 --> 00:13:55,120 Speaker 1: need people. Um, and that lie is based on this 234 00:13:55,200 --> 00:13:57,600 Speaker 1: need to pretend the AI is this free-standing thing, 235 00:13:57,760 --> 00:14:00,680 Speaker 1: whereas we could instead think of it as just a 236 00:14:00,720 --> 00:14:03,640 Speaker 1: way to channel value between people in a new 237 00:14:03,679 --> 00:14:07,520 Speaker 1: and better way.
The free availability of real time translation 238 00:14:07,960 --> 00:14:12,320 Speaker 1: opens up a world of possibilities for travelers, for language learners, 239 00:14:12,360 --> 00:14:17,240 Speaker 1: for long distance lovers. But these technologies have invisible costs, 240 00:14:17,240 --> 00:14:19,600 Speaker 1: like the translators losing their jobs to a tool that 241 00:14:19,720 --> 00:14:22,880 Speaker 1: was trained on their work. And this kind of unpaid 242 00:14:22,960 --> 00:14:27,320 Speaker 1: labor is actually something all of us participate in every day 243 00:14:27,360 --> 00:14:31,280 Speaker 1: without even realizing it. Here's Lisa again. The way that these 244 00:14:31,440 --> 00:14:35,440 Speaker 1: voice-activated assistants are being trained is through huge amounts 245 00:14:35,520 --> 00:14:39,520 Speaker 1: of data. A bit of an unknown secret for many 246 00:14:39,560 --> 00:14:42,760 Speaker 1: people who have an Alexa is that every single time 247 00:14:42,920 --> 00:14:46,480 Speaker 1: you are talking to that device, it's being recorded and 248 00:14:46,560 --> 00:14:49,560 Speaker 1: being stored and going back to the cloud to train 249 00:14:49,880 --> 00:14:53,800 Speaker 1: all of the other Echoes around the world. So the 250 00:14:54,000 --> 00:14:57,960 Speaker 1: users of Echo devices are providing free labor on behalf 251 00:14:57,960 --> 00:15:02,480 Speaker 1: of these massive organizations in order to train the system. 252 00:15:02,520 --> 00:15:05,200 Speaker 1: It's not the first thing we think of, that after purchasing 253 00:15:05,240 --> 00:15:08,520 Speaker 1: an Alexa and using it to buy stuff online, every 254 00:15:08,520 --> 00:15:11,520 Speaker 1: time we interact with it, we're also helping Amazon improve 255 00:15:11,720 --> 00:15:14,840 Speaker 1: and make more money. But framing our data in terms 256 00:15:14,840 --> 00:15:17,680 Speaker 1: of labor helps us think about technology in new ways, 257 00:15:18,360 --> 00:15:22,360 Speaker 1: and our Alexa use reminds Jaron of another science fiction movie. 258 00:15:22,800 --> 00:15:24,520 Speaker 1: The one that's really gotten to me is that 259 00:15:24,800 --> 00:15:29,160 Speaker 1: probably the most famous cautionary tale about computers was in 260 00:15:29,280 --> 00:15:32,200 Speaker 1: 2001, Stanley Kubrick and Arthur C. Clarke's movie. 261 00:15:34,040 --> 00:15:36,600 Speaker 1: And there's this computer called HAL that's this round thing 262 00:15:36,680 --> 00:15:38,280 Speaker 1: that sits on the wall and just looks at you 263 00:15:38,400 --> 00:15:41,640 Speaker 1: and talks to you. I'm sorry, I'm afraid I can't 264 00:15:41,680 --> 00:15:44,040 Speaker 1: do that. And it ends up going berserk and killing 265 00:15:44,120 --> 00:15:46,080 Speaker 1: people, and they have to deprogram it. 266 00:15:46,280 --> 00:15:48,480 Speaker 1: And the hot new gadget for the last few years 267 00:15:48,520 --> 00:15:50,240 Speaker 1: has been this round thing that sits there and looks 268 00:15:50,280 --> 00:15:54,200 Speaker 1: at you and you talk to. I'm afraid I can't 269 00:15:53,520 --> 00:15:57,120 Speaker 1: do that. These smart speakers and so on, and it's like, 270 00:15:57,280 --> 00:15:59,600 Speaker 1: no matter how many cautions we put forward, people just 271 00:15:59,680 --> 00:16:02,880 Speaker 1: fall right into it. It's astonishing to me. Of course, 272 00:16:02,920 --> 00:16:06,200 Speaker 1: Alexa is powered by artificial intelligence.
It takes machine 273 00:16:06,280 --> 00:16:09,320 Speaker 1: learning to understand what you say to it. But maybe 274 00:16:09,320 --> 00:16:12,000 Speaker 1: the bigger breakthrough has been our decision to let listening 275 00:16:12,000 --> 00:16:15,200 Speaker 1: devices into our homes. Yeah, I think that's true. I 276 00:16:15,240 --> 00:16:17,680 Speaker 1: didn't grow up talking to something in my house. It's 277 00:16:17,680 --> 00:16:21,160 Speaker 1: interesting when you read those articles like deep inside North Korea. 278 00:16:21,720 --> 00:16:24,760 Speaker 1: You know, the thing that journalists always write is, there's a 279 00:16:24,800 --> 00:16:28,080 Speaker 1: speaker in every house which projects the chairman's voice into 280 00:16:28,080 --> 00:16:31,400 Speaker 1: the homes. That's always the shocking detail. And of course 281 00:16:31,440 --> 00:16:33,760 Speaker 1: now we all have Alexas in our houses, you know. 282 00:16:33,800 --> 00:16:36,520 Speaker 1: I think it's interesting that Amazon was being delivered to 283 00:16:36,600 --> 00:16:38,120 Speaker 1: us, the boxes on our doorstep. That was the 284 00:16:38,120 --> 00:16:41,520 Speaker 1: farthest they were going to get, right? And now with 285 00:16:41,720 --> 00:16:44,600 Speaker 1: Echo and Dot, these are devices that are inside of 286 00:16:44,600 --> 00:16:47,320 Speaker 1: our homes, that are on our countertops. We're at this 287 00:16:47,400 --> 00:16:50,920 Speaker 1: place where Alexa is now a part of the family. 288 00:16:51,280 --> 00:16:53,880 Speaker 1: And now we have this first generation of children growing 289 00:16:53,920 --> 00:16:57,400 Speaker 1: up with Alexas and other smart devices at home, interacting 290 00:16:57,440 --> 00:16:59,800 Speaker 1: with them, seeing their parents talk to them all the time, 291 00:17:00,360 --> 00:17:03,800 Speaker 1: and they're already used to this responsive technology. The giving 292 00:17:03,840 --> 00:17:07,399 Speaker 1: away our data piece is disconcerting, but there's another piece 293 00:17:07,440 --> 00:17:10,639 Speaker 1: of our shifting relationship with technology, which is why I 294 00:17:10,720 --> 00:17:13,320 Speaker 1: dragged Julian into New Jersey to one of the smartest 295 00:17:13,359 --> 00:17:16,399 Speaker 1: homes I know. And no, I'm not talking about IQ. 296 00:17:17,520 --> 00:17:19,800 Speaker 1: Let me ask you a question. What does your little 297 00:17:19,840 --> 00:17:24,000 Speaker 1: brother look like? A little guy? So what would you say, 298 00:17:24,040 --> 00:17:32,119 Speaker 1: what does Alexa look like? But where does she live? In 299 00:17:33,560 --> 00:17:37,080 Speaker 1: space, really? So every time you talk to Alexa, you're 300 00:17:37,080 --> 00:17:42,800 Speaker 1: talking to space. Despite having an Alexa who lives in space, 301 00:17:43,280 --> 00:17:45,240 Speaker 1: my friend and her husband live in the suburbs with 302 00:17:45,280 --> 00:17:47,959 Speaker 1: their sons, who are almost two and five. When they 303 00:17:48,000 --> 00:17:50,439 Speaker 1: recently moved to New Jersey to fit their expanding family, 304 00:17:50,560 --> 00:17:53,440 Speaker 1: they did not scrimp on smart home devices. Yeah, we've 305 00:17:53,440 --> 00:17:56,200 Speaker 1: got two little kids. We both work. It doesn't bother 306 00:17:56,280 --> 00:17:58,680 Speaker 1: me that they know our habits. It makes it easier for us, 307 00:17:58,720 --> 00:18:01,760 Speaker 1: like, to get stuff done.
Like, I'm cool with Amazon 308 00:18:01,880 --> 00:18:03,840 Speaker 1: just sending me diapers because it knows when I need 309 00:18:03,840 --> 00:18:07,200 Speaker 1: diapers. For parents like my friends, devices like the Amazon Alexa, 310 00:18:07,440 --> 00:18:11,440 Speaker 1: Google Home and their myriad counterparts are genuinely helpful, which 311 00:18:11,480 --> 00:18:14,520 Speaker 1: is probably why over a hundred and eighteen million American 312 00:18:14,560 --> 00:18:17,760 Speaker 1: households have a smart speaker. That's half of the US. 313 00:18:18,440 --> 00:18:20,520 Speaker 1: And when you've got something so involved in your home 314 00:18:20,560 --> 00:18:23,280 Speaker 1: life that it helps with diapers and groceries, it's bound 315 00:18:23,280 --> 00:18:25,600 Speaker 1: to affect some other areas as well. Your dad said 316 00:18:25,600 --> 00:18:30,959 Speaker 1: there are three women in the house: Google, 317 00:18:31,440 --> 00:18:37,320 Speaker 1: who... I don't know. Who's the third woman in the house? 318 00:18:43,520 --> 00:18:48,320 Speaker 1: And even bedtime is different. Hey, Google, tell me a story. Sure, 319 00:18:48,520 --> 00:18:52,080 Speaker 1: here's one from Nickelodeon. It was a sunny day and 320 00:18:52,200 --> 00:18:55,560 Speaker 1: Mr. Porter was visiting the farmer. We'll have Google reading 321 00:18:55,600 --> 00:18:57,199 Speaker 1: a story when he's in bed and we don't want 322 00:18:57,200 --> 00:19:00,919 Speaker 1: to read any more books. Does your Google go 323 00:19:01,040 --> 00:19:05,679 Speaker 1: off when you close the door? Yeah. To 324 00:19:05,760 --> 00:19:09,760 Speaker 1: be clear, Google Home has not replaced real life bedtime stories, 325 00:19:10,400 --> 00:19:12,480 Speaker 1: but it has enabled my friend's kid to get more 326 00:19:12,480 --> 00:19:15,600 Speaker 1: out of bedtime. He can keep asking for stories long 327 00:19:15,640 --> 00:19:18,639 Speaker 1: after his parents need to stop reading. But focusing on 328 00:19:18,680 --> 00:19:22,080 Speaker 1: the privacy component of these devices doesn't capture the full picture. 329 00:19:22,600 --> 00:19:24,879 Speaker 1: Not only can the rhythms of family life change in 330 00:19:24,920 --> 00:19:28,520 Speaker 1: response to a digital assistant, but so can kids' expectations. 331 00:19:28,960 --> 00:19:31,119 Speaker 1: And that's true for all of us, even those of 332 00:19:31,200 --> 00:19:35,119 Speaker 1: us already past early childhood development. We should think about 333 00:19:35,119 --> 00:19:39,240 Speaker 1: how we're affected long term by our expectation of seamless delivery. 334 00:19:40,680 --> 00:19:43,640 Speaker 1: You've got a good rapport with those kids, Kara. Neither 335 00:19:43,680 --> 00:19:46,000 Speaker 1: of us have our own kids, but our editor Mangesh 336 00:19:46,040 --> 00:19:48,600 Speaker 1: does.
I was curious for his take on Alexa 337 00:19:48,600 --> 00:19:51,320 Speaker 1: joining the family. So I'd gone out to this 338 00:19:51,320 --> 00:19:54,360 Speaker 1: wedding a couple of years ago in Seattle, and it 339 00:19:54,520 --> 00:19:58,040 Speaker 1: was in this fancy hotel and there was an Alexa 340 00:19:58,080 --> 00:20:01,160 Speaker 1: there in the room, and Azzie and I went out 341 00:20:01,280 --> 00:20:03,159 Speaker 1: for dinner or something, and we left the kids with 342 00:20:03,200 --> 00:20:07,200 Speaker 1: a sitter, and the kids were mesmerized because they'd never 343 00:20:07,320 --> 00:20:10,679 Speaker 1: encountered one of these devices before, so they were watching 344 00:20:10,680 --> 00:20:13,440 Speaker 1: the babysitter interact with it, calling up music and whatever else. 345 00:20:13,480 --> 00:20:16,280 Speaker 1: But then she also ordered food, so they were just 346 00:20:16,320 --> 00:20:18,760 Speaker 1: floored by this. Then the next week, we were back 347 00:20:18,760 --> 00:20:21,000 Speaker 1: at home and I was watching my four year old 348 00:20:21,040 --> 00:20:23,680 Speaker 1: just like stomping around the house, and she started barking, 349 00:20:24,359 --> 00:20:31,480 Speaker 1: Alexa, pizza, and it was just so confusing. She immediately 350 00:20:31,520 --> 00:20:34,119 Speaker 1: knew that there was this thing you could just bark 351 00:20:34,160 --> 00:20:37,520 Speaker 1: at and get food, and she wanted results. 352 00:20:37,600 --> 00:20:41,199 Speaker 1: But I have a conflicted feeling about all of this, 353 00:20:41,400 --> 00:20:44,960 Speaker 1: because I had grown up in the States, very middle class, 354 00:20:45,160 --> 00:20:47,080 Speaker 1: and every couple of years we'd go to India and 355 00:20:47,160 --> 00:20:49,960 Speaker 1: visit my relatives who were a little wealthier. And we 356 00:20:50,000 --> 00:20:51,960 Speaker 1: went to a party once with one of my cousins 357 00:20:52,640 --> 00:20:55,680 Speaker 1: and I saw this kid who was super wealthy, and 358 00:20:55,760 --> 00:20:58,680 Speaker 1: he was yelling at his chauffeur. He was yelling at 359 00:20:58,720 --> 00:21:02,320 Speaker 1: his mom, he was barking at the maid that they had, 360 00:21:02,440 --> 00:21:05,840 Speaker 1: and it was just so gross. And when you see 361 00:21:06,000 --> 00:21:09,080 Speaker 1: this sort of entitlement in front of you and expressed 362 00:21:09,080 --> 00:21:11,639 Speaker 1: in this way, you don't want your kids growing up 363 00:21:11,680 --> 00:21:14,280 Speaker 1: with that, right, and you want everyone to be treated 364 00:21:14,280 --> 00:21:16,320 Speaker 1: as humans. And so for my daughter to be 365 00:21:16,320 --> 00:21:20,440 Speaker 1: stomping around barking what she wanted, you know, I don't 366 00:21:20,440 --> 00:21:24,040 Speaker 1: want that to be her way of speaking. This kind 367 00:21:24,040 --> 00:21:27,040 Speaker 1: of reminds me of the movie Invasion of the Body Snatchers, 368 00:21:27,680 --> 00:21:29,760 Speaker 1: except the Mangesh version, which is that, you know, he 369 00:21:29,800 --> 00:21:32,280 Speaker 1: went out to dinner with his wife. He comes back, 370 00:21:32,520 --> 00:21:35,720 Speaker 1: his daughter is like, Alexa, pizza, and he's like, what 371 00:21:35,880 --> 00:21:39,320 Speaker 1: has Alexa done with my child?
You know, she was 372 00:21:39,359 --> 00:21:42,000 Speaker 1: like the sweet little girl, and now she's like a 373 00:21:42,119 --> 00:21:45,280 Speaker 1: child that's aware of an Alexa based on one encounter. Right. 374 00:21:45,880 --> 00:21:47,919 Speaker 1: I feel like if an alien came down from space 375 00:21:48,240 --> 00:21:50,160 Speaker 1: and came into my apartment and saw me cooking dinner 376 00:21:50,160 --> 00:21:53,760 Speaker 1: and then saw me go, Alexa, turn on Paul Simon radio, 377 00:21:54,119 --> 00:21:56,159 Speaker 1: the alien would be like, is she yelling at a 378 00:21:56,200 --> 00:21:58,920 Speaker 1: woman who's going to go turn on the music? Um, 379 00:21:58,960 --> 00:22:02,040 Speaker 1: I think right now we are all, you know, Mangesh's 380 00:22:02,119 --> 00:22:06,639 Speaker 1: daughter included, participating in this life altering moment of self 381 00:22:06,680 --> 00:22:10,240 Speaker 1: delusion where we're sort of collectively accepting smart devices in 382 00:22:10,280 --> 00:22:13,320 Speaker 1: our homes. We're not just accepting them, we're treating them 383 00:22:13,359 --> 00:22:15,880 Speaker 1: as human, right? And I think this is the slippery-384 00:22:15,960 --> 00:22:19,320 Speaker 1: ish slope of using voice assistants. More and more we're 385 00:22:19,359 --> 00:22:22,920 Speaker 1: speaking to voice activated devices as though they're family members, 386 00:22:23,520 --> 00:22:25,920 Speaker 1: and there have got to be some long term implications, 387 00:22:25,960 --> 00:22:29,600 Speaker 1: both emotional and psychological, of blurring the line. I agree, 388 00:22:29,920 --> 00:22:31,800 Speaker 1: and we're going to hear more from Jaron about that 389 00:22:31,840 --> 00:22:35,040 Speaker 1: after the break. But we'll also speak to Poppy Crum again, 390 00:22:35,160 --> 00:22:38,280 Speaker 1: who believes that we've barely scratched the surface of how 391 00:22:38,320 --> 00:22:48,520 Speaker 1: these devices can change how we live for the better. Children 392 00:22:48,520 --> 00:22:51,879 Speaker 1: interacting with Alexa and Siri and other digital assistants is 393 00:22:51,920 --> 00:22:56,240 Speaker 1: particularly striking because of the developmental implications. But interacting with 394 00:22:56,359 --> 00:23:00,840 Speaker 1: machines who understand and respond to us raises questions 395 00:23:00,880 --> 00:23:05,199 Speaker 1: about who and what gets to be treated as a person. Here's Jaron. 396 00:23:05,960 --> 00:23:10,320 Speaker 1: I think the problem isn't the math or the computer 397 00:23:10,359 --> 00:23:13,600 Speaker 1: science algorithms. I think the problem is our framework for 398 00:23:13,680 --> 00:23:17,040 Speaker 1: thinking about them tends to be machine-centric instead of 399 00:23:17,119 --> 00:23:19,320 Speaker 1: human-centric, and it tends to create danger on the 400 00:23:19,359 --> 00:23:21,439 Speaker 1: whole and to create a lot of confusion. And 401 00:23:21,480 --> 00:23:23,439 Speaker 1: a lot of it is because of this ideology of 402 00:23:23,480 --> 00:23:26,560 Speaker 1: thinking of the machine as being alive. And when we 403 00:23:26,600 --> 00:23:30,400 Speaker 1: remember that our AI inventions aren't alive and simply reflect 404 00:23:30,480 --> 00:23:33,040 Speaker 1: the inputs we give them, we can do a better 405 00:23:33,080 --> 00:23:35,800 Speaker 1: job at harnessing their power for good.
You may remember 406 00:23:35,880 --> 00:23:39,280 Speaker 1: Kai-Fu Lee from the last episode. Before running Google China, 407 00:23:39,720 --> 00:23:43,679 Speaker 1: Kai-Fu worked at Apple, where he helped develop Siri. AI 408 00:23:43,800 --> 00:23:46,879 Speaker 1: is programmed by people. It is up to us to 409 00:23:47,040 --> 00:23:50,720 Speaker 1: remove the factors that we don't think are appropriate to 410 00:23:50,840 --> 00:23:53,680 Speaker 1: be considered in a decision from an AI. So if 411 00:23:53,720 --> 00:23:58,959 Speaker 1: we want to eliminate sexual orientation from a loan decision engine, 412 00:23:59,000 --> 00:24:01,399 Speaker 1: we can do that, or if we want to eliminate it 413 00:24:01,440 --> 00:24:04,800 Speaker 1: from a job application, we can do that. It's actually 414 00:24:04,800 --> 00:24:08,600 Speaker 1: better than people. You can't force people to completely ignore 415 00:24:09,280 --> 00:24:11,800 Speaker 1: these sorts of things in their decisions. They can try, 416 00:24:12,200 --> 00:24:15,520 Speaker 1: but our brains are not separable in that way. Engines 417 00:24:15,520 --> 00:24:18,919 Speaker 1: actually are. Being able to program a way out of 418 00:24:18,920 --> 00:24:22,200 Speaker 1: our messier human biases is a big deal, but it's 419 00:24:22,240 --> 00:24:25,320 Speaker 1: only made possible when we acknowledge that we control what 420 00:24:25,400 --> 00:24:28,680 Speaker 1: an algorithm learns from. So if we get it right, 421 00:24:29,240 --> 00:24:32,679 Speaker 1: just how good could our relationship with technology get? Here's 422 00:24:32,880 --> 00:24:35,720 Speaker 1: Poppy Crum again. I had a relative who, you know, 423 00:24:36,040 --> 00:24:39,199 Speaker 1: was at the end of life, and I was with 424 00:24:39,280 --> 00:24:42,119 Speaker 1: him for the last few weeks in the hospital. He 425 00:24:42,160 --> 00:24:45,360 Speaker 1: hadn't been speaking for a couple of days. I had 426 00:24:45,440 --> 00:24:47,800 Speaker 1: taken an Amazon Alexa. I was using it simply to 427 00:24:47,800 --> 00:24:55,240 Speaker 1: play music. I was playing classical music. And all of 428 00:24:55,280 --> 00:24:59,240 Speaker 1: a sudden, he says, Alexa, play Al Green, and I was... 429 00:25:00,960 --> 00:25:03,800 Speaker 1: And you know, and Poppy had always interacted with her 430 00:25:03,880 --> 00:25:07,720 Speaker 1: uncle through classical music. It was their thing. And here 431 00:25:07,760 --> 00:25:10,720 Speaker 1: he was, at the end of his life, requesting R 432 00:25:10,800 --> 00:25:13,760 Speaker 1: and B, an interest she didn't even know he had. 433 00:25:14,000 --> 00:25:15,960 Speaker 1: He wanted to hear Al Green and Sly and the Family Stone, 434 00:25:15,960 --> 00:25:19,640 Speaker 1: and I was like, nowhere near that. But the empowerment 435 00:25:19,680 --> 00:25:24,040 Speaker 1: the device allowed at a very vulnerable and sensitive and 436 00:25:24,280 --> 00:25:28,960 Speaker 1: important time, he smiled at the end of life. And it's 437 00:25:29,000 --> 00:25:33,760 Speaker 1: the access to memories, the access to that internal richness, 438 00:25:33,880 --> 00:25:37,320 Speaker 1: the things that might bring someone the most comfort are 439 00:25:37,640 --> 00:25:42,840 Speaker 1: things we all don't know.
Amazon's Alexa really opened up great 440 00:25:42,840 --> 00:25:47,359 Speaker 1: opportunities for what our relationship with our technology can be. Suddenly, 441 00:25:47,640 --> 00:25:50,520 Speaker 1: Alexa isn't just something that dims your lights or tells 442 00:25:50,520 --> 00:25:52,320 Speaker 1: you if it's raining as you rush out of the door, 443 00:25:52,880 --> 00:25:55,520 Speaker 1: but actually a device that can change profoundly how you 444 00:25:55,560 --> 00:25:59,640 Speaker 1: live and die. And Poppy noticed other ways an Alexa 445 00:25:59,640 --> 00:26:02,960 Speaker 1: could have helped her uncle beyond playing Al Green. I 446 00:26:03,000 --> 00:26:07,040 Speaker 1: sat in a hospital room where I saw errors be made. 447 00:26:07,160 --> 00:26:11,639 Speaker 1: I saw information be captured incorrectly, written on the board 448 00:26:11,680 --> 00:26:15,159 Speaker 1: one way, shared to a different nurse another, shared to 449 00:26:15,200 --> 00:26:18,080 Speaker 1: a different doctor another. And I said, all of these 450 00:26:18,119 --> 00:26:24,080 Speaker 1: different things happened that, with the right coordination, that 451 00:26:24,200 --> 00:26:26,720 Speaker 1: same device that just allowed my uncle to hear Al 452 00:26:26,760 --> 00:26:29,760 Speaker 1: Green on cue could have also been a huge 453 00:26:29,840 --> 00:26:32,520 Speaker 1: part of improving not just his mental wellness but his 454 00:26:32,600 --> 00:26:36,160 Speaker 1: physical wellness. Because we're humans, we make errors, we make mistakes, 455 00:26:36,560 --> 00:26:38,960 Speaker 1: we're not good at integrating information all the time, and 456 00:26:39,000 --> 00:26:42,439 Speaker 1: our fallibility comes in places that technology can solve. I 457 00:26:42,440 --> 00:26:44,720 Speaker 1: don't want to discount hospital staff. They work very hard, 458 00:26:45,240 --> 00:26:48,520 Speaker 1: uh, and everything, but people are, sometimes you haven't 459 00:26:48,560 --> 00:26:50,800 Speaker 1: had enough sleep, or they don't know something else, you know. 460 00:26:50,880 --> 00:26:53,159 Speaker 1: People try to help at different points in time and 461 00:26:53,280 --> 00:26:57,439 Speaker 1: end up sometimes introducing error and mistakes. Technology that's actually 462 00:26:57,480 --> 00:27:01,840 Speaker 1: capturing or registering information for a user, there's obvious ways 463 00:27:01,840 --> 00:27:04,800 Speaker 1: that it can help improve the interaction. To Poppy, the 464 00:27:04,840 --> 00:27:08,080 Speaker 1: true power of Alexa is not to respond to specific 465 00:27:08,119 --> 00:27:12,359 Speaker 1: requests like a super assistant. It's to monitor us constantly, 466 00:27:12,680 --> 00:27:16,399 Speaker 1: detecting patterns that we can't, more like a parent. And 467 00:27:16,440 --> 00:27:19,119 Speaker 1: as of now, as far as we know, that's not 468 00:27:19,160 --> 00:27:21,879 Speaker 1: what it does. What Alexa does right now is not 469 00:27:22,000 --> 00:27:25,399 Speaker 1: what would actually benefit us most from a healthcare perspective, 470 00:27:26,080 --> 00:27:29,560 Speaker 1: right? Alexa is listening for a wake word. It's listening 471 00:27:29,560 --> 00:27:33,640 Speaker 1: for a particular cue. It's not holding that longitudinal data 472 00:27:33,720 --> 00:27:37,040 Speaker 1: to learn our behaviors and such right now.
Not 473 00:27:37,160 --> 00:27:41,040 Speaker 1: because of a technological barrier? No, because of, I think, social 474 00:27:41,160 --> 00:27:45,040 Speaker 1: and privacy barriers, and those barriers tend to erode in 475 00:27:45,119 --> 00:27:48,440 Speaker 1: response to new technology. Twenty years ago, we would never 476 00:27:48,480 --> 00:27:51,159 Speaker 1: have believed we would summon strangers from the Internet and 477 00:27:51,200 --> 00:27:53,840 Speaker 1: climb in their cars either, you know. So we evolve 478 00:27:53,880 --> 00:28:00,399 Speaker 1: when the capacity, when the convenience is introduced, and the capability. Clearly, 479 00:28:00,480 --> 00:28:04,680 Speaker 1: technological innovation is only part of the equation. Becoming comfortable 480 00:28:04,680 --> 00:28:07,800 Speaker 1: with new uses for those technologies opens up new worlds 481 00:28:07,800 --> 00:28:11,119 Speaker 1: of possibilities in everything from how we listen to music 482 00:28:11,440 --> 00:28:13,960 Speaker 1: to how we take care of each other. So let's 483 00:28:14,000 --> 00:28:16,679 Speaker 1: say we accept this new bargain and open ourselves up 484 00:28:16,680 --> 00:28:20,560 Speaker 1: to constant monitoring. What might the future look like? Companies 485 00:28:20,600 --> 00:28:23,040 Speaker 1: are looking at these things as ways of knowing not 486 00:28:23,200 --> 00:28:26,800 Speaker 1: just if someone is taking their medicine, for an aging population 487 00:28:27,080 --> 00:28:30,200 Speaker 1: or someone who's healing, but to know if actually they're depressed, 488 00:28:30,440 --> 00:28:34,320 Speaker 1: are they under mental stress as well? And that becomes 489 00:28:34,320 --> 00:28:38,480 Speaker 1: a great opportunity for autonomous living for elders, where the 490 00:28:38,600 --> 00:28:41,960 Speaker 1: caretaker knows a lot more about how well the individual 491 00:28:42,720 --> 00:28:45,840 Speaker 1: is healing and is doing at a particular point in time. 492 00:28:46,280 --> 00:28:48,880 Speaker 1: There's this real irony in all of these situations, 493 00:28:49,000 --> 00:28:54,920 Speaker 1: where through more tracking of my information comes freedom, and 494 00:28:55,160 --> 00:28:58,920 Speaker 1: you gain autonomy through the amalgamated data. There is an 495 00:28:58,960 --> 00:29:02,120 Speaker 1: age-old belief that the more privacy we have, the more freedom 496 00:29:02,160 --> 00:29:06,000 Speaker 1: we have. But Poppy says it's time to rethink that relationship. 497 00:29:06,600 --> 00:29:09,280 Speaker 1: If you look at an elder who might otherwise be 498 00:29:09,320 --> 00:29:11,840 Speaker 1: in a care home but instead gains ten years of 499 00:29:11,840 --> 00:29:16,440 Speaker 1: autonomous living because you now have more ubiquitous understanding of 500 00:29:16,480 --> 00:29:20,719 Speaker 1: our mental and physical wellness, you have people having a 501 00:29:20,800 --> 00:29:24,160 Speaker 1: lot more freedom with simply having a richer understanding of 502 00:29:24,200 --> 00:29:28,560 Speaker 1: their internal experience and their personal data. Crucially, if we 503 00:29:28,640 --> 00:29:32,120 Speaker 1: do feel comfortable trading our privacy for more agency, we 504 00:29:32,160 --> 00:29:34,440 Speaker 1: need to be very sure we can trust the people 505 00:29:34,480 --> 00:29:37,640 Speaker 1: who get to see our data, because they're not just 506 00:29:37,680 --> 00:29:41,280 Speaker 1: seeing us naked, they're seeing under the hood,
the physiological 507 00:29:41,360 --> 00:29:45,480 Speaker 1: tells that betray our private emotions. For me, everything is 508 00:29:45,480 --> 00:29:47,720 Speaker 1: about transparency. No one should be tracked when they don't 509 00:29:47,760 --> 00:29:51,200 Speaker 1: know they're being tracked. How our technology interacts with us, 510 00:29:51,840 --> 00:29:54,920 Speaker 1: whether we share information or not, our technology can't know. 511 00:29:55,200 --> 00:29:57,880 Speaker 1: I think that's the real issue. We have to recognize 512 00:29:57,920 --> 00:30:01,080 Speaker 1: that this sort of cognitive sovereignty or agency that we 513 00:30:01,120 --> 00:30:04,480 Speaker 1: believe in is a thing of the past. It means 514 00:30:04,480 --> 00:30:07,560 Speaker 1: we have to redefine what that future looks like. It's 515 00:30:07,560 --> 00:30:10,760 Speaker 1: a future we can all participate in building, and it's 516 00:30:10,760 --> 00:30:13,240 Speaker 1: one we're better off building as citizens with a collective 517 00:30:13,320 --> 00:30:16,840 Speaker 1: voice and long term objectives, rather than as lone consumers 518 00:30:17,240 --> 00:30:20,040 Speaker 1: in search of the best deal, whatever the cost to 519 00:30:20,160 --> 00:30:23,440 Speaker 1: us and society. As Poppy says, there's a difference between 520 00:30:23,480 --> 00:30:26,360 Speaker 1: an Alexa waking up to respond to commands like play 521 00:30:26,360 --> 00:30:29,800 Speaker 1: Al Green, and an Alexa that is always on, building 522 00:30:29,800 --> 00:30:32,280 Speaker 1: a model of our behavior that knows us better than 523 00:30:32,400 --> 00:30:35,360 Speaker 1: we know ourselves, but the crux of that difference is 524 00:30:35,360 --> 00:30:39,560 Speaker 1: more cultural and political than technological. Are we willing to 525 00:30:39,600 --> 00:30:42,320 Speaker 1: give up our poker faces and allow ourselves to be 526 00:30:42,440 --> 00:30:46,040 Speaker 1: read by sensors and algorithms in return for longer, safer, 527 00:30:46,280 --> 00:30:50,840 Speaker 1: happier lives, like Poppy's uncle? Or, knowing the history of 528 00:30:50,880 --> 00:30:54,360 Speaker 1: governments who have monitored and categorized citizens, should we be 529 00:30:54,440 --> 00:30:58,320 Speaker 1: doing everything we can to hit pause? Will our technology 530 00:30:58,360 --> 00:31:05,080 Speaker 1: become a safety net or a spider's web? Next episode, 531 00:31:05,080 --> 00:31:08,200 Speaker 1: we travel to Facebook's headquarters and investigate some of the 532 00:31:08,280 --> 00:31:11,320 Speaker 1: more dangerous corners of the Internet, and, knowing that AI 533 00:31:11,440 --> 00:31:14,760 Speaker 1: can both learn about us and imitate us, we take 534 00:31:14,800 --> 00:31:17,400 Speaker 1: a hard look at deep fakes and examine a world 535 00:31:17,440 --> 00:31:21,720 Speaker 1: where it's increasingly difficult to tell truth from fiction. I'm 536 00:31:21,760 --> 00:31:37,640 Speaker 1: Oz Woloshyn. See you next time. Sleepwalkers is a production 537 00:31:37,680 --> 00:31:41,080 Speaker 1: of I Heart Radio and Unusual Productions. For the latest 538 00:31:41,120 --> 00:31:44,640 Speaker 1: AI news, live interviews, and behind the scenes footage, find 539 00:31:44,680 --> 00:31:48,840 Speaker 1: us on Instagram at Sleepwalkers Podcast, or at Sleepwalkers Podcast 540 00:31:48,880 --> 00:31:51,560 Speaker 1: dot com. Special thanks to Bryony Cole.
We had a 541 00:31:51,560 --> 00:31:54,960 Speaker 1: conversation with Bryony that made this episode possible, and Bryony 542 00:31:55,120 --> 00:31:58,000 Speaker 1: is the host of a fascinating podcast called Future of 543 00:31:58,040 --> 00:32:01,040 Speaker 1: Sex that's all about using technology to make our lives better. 544 00:32:02,960 --> 00:32:06,400 Speaker 1: Sleepwalkers is hosted by me, Oz Woloshyn, and co-hosted by 545 00:32:06,400 --> 00:32:09,280 Speaker 1: me, Karah Preiss, and produced by Julian Weller with help 546 00:32:09,320 --> 00:32:12,920 Speaker 1: from Jacopo Penzo and Taylor Chicoine, mixing by Tristan McNeil 547 00:32:13,040 --> 00:32:16,600 Speaker 1: and Julian Weller. Our story editor is Matthew Riddle, with recording 548 00:32:16,600 --> 00:32:21,480 Speaker 1: assistance this episode from Tofarel and Phil Bodger. Sleepwalkers is 549 00:32:21,520 --> 00:32:24,920 Speaker 1: executive produced by me, Oz Woloshyn, and Mangesh Hattikudur. 550 00:32:25,800 --> 00:32:27,840 Speaker 1: For more podcasts from I Heart Radio, visit the I 551 00:32:27,920 --> 00:32:30,840 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 552 00:32:30,840 --> 00:32:38,000 Speaker 1: your favorite shows. Alexa, pizza!