1 00:00:00,080 --> 00:00:02,760 Speaker 1: Hi everyone, Oz Woloshyn, co-host of Sleepwalkers, 2 00:00:02,759 --> 00:00:03,000 Speaker 1: here. 3 00:00:03,200 --> 00:00:05,160 Speaker 1: I know it's been a few years since our original 4 00:00:05,240 --> 00:00:08,319 Speaker 1: series run, but I'm here with an exciting announcement. We 5 00:00:08,360 --> 00:00:11,520 Speaker 1: are relaunching this feed, but instead of me and Karah 6 00:00:11,560 --> 00:00:14,320 Speaker 1: Preiss as hosts, since Karah and I are now partners in 7 00:00:14,360 --> 00:00:16,520 Speaker 1: hosting Tech Stuff, which is on twice a week, check 8 00:00:16,560 --> 00:00:19,799 Speaker 1: it out, we're bringing on journalist Dexter Thomas to shepherd 9 00:00:19,800 --> 00:00:24,440 Speaker 1: the next iteration of this show. Dexter, welcome. 10 00:00:24,520 --> 00:00:27,320 Speaker 2: Yo, Oz. I'm very happy to be here with you and hopefully 11 00:00:27,320 --> 00:00:29,680 Speaker 2: build on the original show, which I think in a 12 00:00:29,720 --> 00:00:32,360 Speaker 2: lot of ways predicted where we're at right now. 13 00:00:32,880 --> 00:00:35,800 Speaker 1: Yeah, we launched it in twenty nineteen, and you know, 14 00:00:35,880 --> 00:00:38,640 Speaker 1: AI was already kind of bubbling, but we were starting 15 00:00:38,680 --> 00:00:42,000 Speaker 1: to see these things that Karah and I called crash sites, 16 00:00:42,560 --> 00:00:47,040 Speaker 1: where like a new technology and a real person would 17 00:00:47,120 --> 00:00:50,559 Speaker 1: collide in a way that could be very traumatic or 18 00:00:50,600 --> 00:00:53,559 Speaker 1: that could be kind of wonderful. I think everyone in 19 00:00:53,560 --> 00:00:55,800 Speaker 1: twenty nineteen was kind of still wondering, though, like, is 20 00:00:55,840 --> 00:00:58,280 Speaker 1: AI really going to become a thing? Like, are these trends 21 00:00:58,360 --> 00:01:00,920 Speaker 1: kind of going to be like the metaverse? 22 00:01:01,440 --> 00:01:02,880 Speaker 1: Or is this going to really take off? And I 23 00:01:02,920 --> 00:01:05,200 Speaker 1: think since then it has taken off, and I think 24 00:01:05,200 --> 00:01:07,880 Speaker 1: these crash sites are going to become more and 25 00:01:07,880 --> 00:01:09,000 Speaker 1: more important to explore. 26 00:01:09,280 --> 00:01:12,360 Speaker 2: Yeah, you know, one of the reasons to relaunch the 27 00:01:12,440 --> 00:01:14,760 Speaker 2: show is because a lot of what you were covering 28 00:01:14,760 --> 00:01:17,320 Speaker 2: in the first iteration of the show, at that point 29 00:01:17,400 --> 00:01:20,800 Speaker 2: it still felt like the future. But now, I think, 30 00:01:20,800 --> 00:01:23,840 Speaker 2: in twenty twenty five, it's either here right now and 31 00:01:23,880 --> 00:01:27,200 Speaker 2: it's kind of boring, or it's about to be. I mean, 32 00:01:27,240 --> 00:01:28,400 Speaker 2: you all were discussing. 33 00:01:28,520 --> 00:01:32,320 Speaker 1: Us? Ever great? Nah, kidding. No, no. 34 00:01:32,040 --> 00:01:34,080 Speaker 2: No, I mean, you know, in the sense, in the 35 00:01:34,120 --> 00:01:37,480 Speaker 2: sense that, oh, AI, yeah, I know about that, you know. 36 00:01:37,640 --> 00:01:39,959 Speaker 2: I mean, I think when you all first started, 37 00:01:40,640 --> 00:01:44,320 Speaker 2: LLMs were something that basically just a few computer scientists 38 00:01:44,360 --> 00:01:46,320 Speaker 2: had access to.
I mean, you had to go 39 00:01:46,360 --> 00:01:49,840 Speaker 2: and visit somebody and, you know, talk about Lyrebird. 40 00:01:49,840 --> 00:01:53,160 Speaker 2: I mean, Lyrebird now is almost completely unremarkable because 41 00:01:53,160 --> 00:01:56,880 Speaker 2: it's what we use to edit podcasts. It's just part 42 00:01:56,920 --> 00:02:01,080 Speaker 2: of software that we use every day. And I think 43 00:02:01,120 --> 00:02:03,360 Speaker 2: at this point so much of what you all 44 00:02:03,400 --> 00:02:06,800 Speaker 2: were talking about as this is coming, or this might 45 00:02:06,880 --> 00:02:10,840 Speaker 2: come someday, is completely normal, you know, and we almost 46 00:02:10,880 --> 00:02:15,280 Speaker 2: don't think about it as technology anymore. And so 47 00:02:15,680 --> 00:02:17,760 Speaker 2: in some ways, you know, the way that you talk 48 00:02:17,800 --> 00:02:19,920 Speaker 2: about the show, and the way that I think originally 49 00:02:19,960 --> 00:02:22,200 Speaker 2: you two had been introducing the show, is, okay, this 50 00:02:22,280 --> 00:02:25,560 Speaker 2: is a show about the way that technology affects people's lives. 51 00:02:26,240 --> 00:02:29,600 Speaker 2: It's almost hard to introduce a show as that now, 52 00:02:30,120 --> 00:02:34,040 Speaker 2: because the response, when I even try 53 00:02:34,040 --> 00:02:37,000 Speaker 2: to tell people, is sometimes like, yeah, okay, 54 00:02:37,000 --> 00:02:39,040 Speaker 2: what are you going to talk about? Because I think 55 00:02:39,120 --> 00:02:42,120 Speaker 2: technology has just become kind of the background hum of 56 00:02:42,160 --> 00:02:45,600 Speaker 2: our everyday life. But I would argue that that's precisely 57 00:02:45,720 --> 00:02:49,880 Speaker 2: why we need a show about how technology affects our lives, 58 00:02:49,919 --> 00:02:53,280 Speaker 2: because it's still there. It's not just part of the background. 59 00:02:53,400 --> 00:02:56,640 Speaker 2: It's something that we have choices about and that 60 00:02:56,680 --> 00:02:57,919 Speaker 2: we really should be thinking about. 61 00:02:58,360 --> 00:03:00,639 Speaker 1: I think that's really well put. What kinds of things 62 00:03:00,720 --> 00:03:02,280 Speaker 1: are you interested in exploring on the show? 63 00:03:02,960 --> 00:03:06,000 Speaker 2: Yeah, so I know that one of your first episodes 64 00:03:06,040 --> 00:03:09,960 Speaker 2: actually was about facial recognition. You go visit the NYPD, 65 00:03:10,360 --> 00:03:16,000 Speaker 2: and, you know, the NYPD is obviously a massive police department, very technologically adept, 66 00:03:16,520 --> 00:03:20,440 Speaker 2: and they promise you that, hey, listen, we're using facial 67 00:03:20,480 --> 00:03:25,600 Speaker 2: recognition very carefully, we're very serious about it. And back 68 00:03:25,639 --> 00:03:27,840 Speaker 2: then it seemed like something, okay, well, only the most 69 00:03:27,840 --> 00:03:30,680 Speaker 2: advanced police departments have this, and, well, they say they're 70 00:03:30,720 --> 00:03:33,560 Speaker 2: being careful with it. Now this is something that rural 71 00:03:33,560 --> 00:03:37,520 Speaker 2: police departments all across America have access to, and it's 72 00:03:37,680 --> 00:03:40,760 Speaker 2: putting innocent people in jail, and so I think that's 73 00:03:40,800 --> 00:03:43,200 Speaker 2: really something we should be thinking about.
And we talk 74 00:03:43,280 --> 00:03:46,480 Speaker 2: to a reporter who's covered it, and we hear actually 75 00:03:46,520 --> 00:03:49,160 Speaker 2: from some people that this has affected, who've been arrested 76 00:03:49,800 --> 00:03:52,240 Speaker 2: because the computer thought they did something and they weren't 77 00:03:52,280 --> 00:03:55,040 Speaker 2: even in the area. There's another episode we're working on 78 00:03:55,360 --> 00:03:58,720 Speaker 2: which shows you how people are making money doing things 79 00:03:58,800 --> 00:04:03,440 Speaker 2: like generating fake AI videos of natural disasters or baby animals. 80 00:04:03,880 --> 00:04:05,920 Speaker 2: And you know, another thing, and maybe this is just 81 00:04:05,920 --> 00:04:08,560 Speaker 2: a personal thing for me: I feel like the DIY 82 00:04:09,080 --> 00:04:11,640 Speaker 2: and the hacker ethos has kind of left tech, and 83 00:04:12,120 --> 00:04:14,680 Speaker 2: hopefully we can at least bring that back a little 84 00:04:14,720 --> 00:04:16,560 Speaker 2: bit, or provide a place where that can come back. 85 00:04:17,040 --> 00:04:19,599 Speaker 2: And I think there's some more practical stuff too, like 86 00:04:20,240 --> 00:04:22,360 Speaker 2: how can you mitigate the risk that you're gonna get 87 00:04:22,400 --> 00:04:25,719 Speaker 2: hacked? Again, I think these are things that have just 88 00:04:25,920 --> 00:04:28,920 Speaker 2: kind of passed us by, in terms of we've gotten 89 00:04:28,920 --> 00:04:31,279 Speaker 2: really used to, oh well, just let Apple handle this 90 00:04:31,360 --> 00:04:34,640 Speaker 2: for us, just let Google handle this for us. No, let's 91 00:04:34,680 --> 00:04:38,520 Speaker 2: do this ourselves. That's what makes this technology stuff fun, 92 00:04:38,520 --> 00:04:39,120 Speaker 2: at least for me. 93 00:04:39,600 --> 00:04:42,279 Speaker 1: I think you're exactly right. So I'm really happy 94 00:04:42,320 --> 00:04:43,440 Speaker 1: that you're taking the show on. 95 00:04:44,240 --> 00:04:47,920 Speaker 2: There's a few places where hopefully we're picking up the 96 00:04:47,920 --> 00:04:50,920 Speaker 2: baton a little bit on something that you predicted a 97 00:04:50,920 --> 00:04:53,880 Speaker 2: little bit, and we're able to keep that going and say, okay, 98 00:04:53,960 --> 00:04:55,920 Speaker 2: here's where we are right now. This is no longer 99 00:04:56,040 --> 00:04:58,560 Speaker 2: science fiction. This is your everyday life. What are we 100 00:04:58,600 --> 00:04:59,400 Speaker 2: going to do about this? 101 00:05:00,080 --> 00:05:03,080 Speaker 1: We called the original show Sleepwalkers.
There was actually a 102 00:05:03,080 --> 00:05:05,680 Speaker 1: book about the origins of the First World War with 103 00:05:05,800 --> 00:05:09,080 Speaker 1: that title, and part of the thesis was that, you know, 104 00:05:09,120 --> 00:05:12,279 Speaker 1: you had this moment where the industrial revolution had happened, 105 00:05:12,320 --> 00:05:15,520 Speaker 1: and so you had, like, trains and telegraphs and gas 106 00:05:15,600 --> 00:05:19,040 Speaker 1: and automatic weapons, and all of a sudden there was 107 00:05:19,120 --> 00:05:21,719 Speaker 1: a conflict that got bigger and bigger and bigger, and 108 00:05:21,800 --> 00:05:24,279 Speaker 1: all of these technologies that kind of existed as one 109 00:05:24,320 --> 00:05:27,919 Speaker 1: offs coalesced on the battlefield and were used, you know, 110 00:05:28,000 --> 00:05:30,640 Speaker 1: to create carnage and kill millions and millions of people. 111 00:05:31,600 --> 00:05:33,120 Speaker 1: So the name was a little bit of a warning, 112 00:05:33,200 --> 00:05:35,360 Speaker 1: as if to say, like, these new technologies are here, even 113 00:05:35,360 --> 00:05:38,120 Speaker 1: if they're still hiding in plain sight. You, as part 114 00:05:38,160 --> 00:05:40,120 Speaker 1: of the rebrand and the relaunch, I think, 115 00:05:40,160 --> 00:05:41,039 Speaker 1: have chosen a new name. 116 00:05:41,640 --> 00:05:44,400 Speaker 2: Yeah, so we're calling it Kill Switch. And I'm going 117 00:05:44,440 --> 00:05:46,680 Speaker 2: to mix metaphors here a little bit, so bear with me. 118 00:05:47,160 --> 00:05:49,000 Speaker 2: But, you know, a kill switch, I think we all 119 00:05:49,000 --> 00:05:51,520 Speaker 2: know what that is. Basically, it's a safety mechanism: 120 00:05:52,000 --> 00:05:55,400 Speaker 2: if a machine starts going nuts, you can disable it. 121 00:05:55,760 --> 00:05:58,320 Speaker 2: And I think some of the inspiration behind the name 122 00:05:58,600 --> 00:06:02,000 Speaker 2: is that, theoretically, these are all machines, these are computers. 123 00:06:02,279 --> 00:06:05,800 Speaker 2: We can turn these off, right? But it's really hard 124 00:06:05,800 --> 00:06:09,159 Speaker 2: to imagine turning it off now. And to borrow from 125 00:06:09,200 --> 00:06:11,800 Speaker 2: what your iteration of the show was doing, I 126 00:06:11,800 --> 00:06:15,200 Speaker 2: think we've actually sleepwalked up to a point where 127 00:06:15,200 --> 00:06:17,400 Speaker 2: we don't even know where the switch is anymore. If 128 00:06:17,440 --> 00:06:19,039 Speaker 2: we wanted to turn it off, we don't know where 129 00:06:19,080 --> 00:06:21,200 Speaker 2: it is. And I think the show that you were 130 00:06:21,240 --> 00:06:25,279 Speaker 2: hosting kind of predicted this: we're in this 131 00:06:25,360 --> 00:06:30,039 Speaker 2: really interesting paradox where the more user-friendly technology gets, 132 00:06:30,640 --> 00:06:34,159 Speaker 2: the less tech-literate we are, and I want to 133 00:06:34,160 --> 00:06:37,719 Speaker 2: continue to interrogate that and say, no, don't just offer 134 00:06:37,800 --> 00:06:41,200 Speaker 2: me these fun new products. Don't just cram AI into everything. 135 00:06:41,680 --> 00:06:43,520 Speaker 2: I want to know how this stuff works. And I 136 00:06:43,520 --> 00:06:46,560 Speaker 2: think we should continue to ask that question. And you know, 137 00:06:46,880 --> 00:06:48,599 Speaker 2: maybe we need to start looking for the kill switch.
138 00:06:48,400 --> 00:06:52,040 Speaker 1: Very well put, Dex. So I already can't wait 139 00:06:52,040 --> 00:06:54,200 Speaker 1: to hear the show in the feed. Come on Tech Stuff 140 00:06:54,240 --> 00:06:55,960 Speaker 1: one of these days as well to tell us about 141 00:06:56,000 --> 00:06:58,200 Speaker 1: the show once it's in its stride. But in the meantime, 142 00:06:58,200 --> 00:06:58,920 Speaker 1: when's it launch? 143 00:06:59,600 --> 00:07:02,040 Speaker 2: First off, I would love to, and please, the door 144 00:07:02,160 --> 00:07:04,400 Speaker 2: is always open. You made the door. Please come on 145 00:07:04,480 --> 00:07:08,320 Speaker 2: in anytime you want. But yeah, first episode drops April 146 00:07:08,360 --> 00:07:11,040 Speaker 2: twenty-third, and anywhere you get your podcasts, you can 147 00:07:11,080 --> 00:07:11,600 Speaker 2: listen to it.