WEBVTT - Sleepwalkers Is Back 

0:00:00.080 --> 0:00:02.760
<v Speaker 1>Hi everyone, it's Oz Woloshyn, co-host of Sleepwalkers.

0:00:02.759 --> 0:00:03.000
<v Speaker 2>Here.

0:00:03.200 --> 0:00:05.160
<v Speaker 1>I know it's been a few years since our original

0:00:05.240 --> 0:00:08.319
<v Speaker 1>series run, but I'm here with an exciting announcement. We

0:00:08.360 --> 0:00:11.520
<v Speaker 1>are relaunching this feed. But instead of me and Karah

0:00:11.560 --> 0:00:14.320
<v Speaker 1>Preiss as hosts, since Karah and I are now partners in

0:00:14.360 --> 0:00:16.520
<v Speaker 1>hosting TechStuff, which is on twice a week, check

0:00:16.560 --> 0:00:19.799
<v Speaker 1>it out, we're bringing on journalist Dexter Thomas to shepherd

0:00:19.800 --> 0:00:24.440
<v Speaker 1>the next iteration of this show. Dexter, welcome. Yo, Oz.

0:00:24.520 --> 0:00:27.320
<v Speaker 2>I'm very happy to be here with you and hopefully

0:00:27.320 --> 0:00:29.680
<v Speaker 2>build on the original show, which I think in a

0:00:29.720 --> 0:00:32.360
<v Speaker 2>lot of ways predicted where we're at right now.

0:00:32.880 --> 0:00:35.800
<v Speaker 1>Yeah, we launched it in twenty nineteen, and you know,

0:00:35.880 --> 0:00:38.640
<v Speaker 1>AI was already kind of bubbling, but we were starting

0:00:38.680 --> 0:00:42.000
<v Speaker 1>to see these things that Karah and I called crash sites,

0:00:42.560 --> 0:00:47.040
<v Speaker 1>where like a new technology and a real person would

0:00:47.120 --> 0:00:50.559
<v Speaker 1>collide in a way that could be very traumatic or

0:00:50.600 --> 0:00:53.559
<v Speaker 1>that could be kind of wonderful. I think everyone in

0:00:53.560 --> 0:00:55.800
<v Speaker 1>twenty nineteen was kind of still wondering though, like, is

0:00:55.840 --> 0:00:58.280
<v Speaker 1>AI really going to become a thing? Like, are these trends

0:00:58.360 --> 0:01:00.920
<v Speaker 1>kind of going to be a fad, like the metaverse?

0:01:01.440 --> 0:01:02.880
<v Speaker 1>Or is this going to really take off, and I

0:01:02.920 --> 0:01:05.200
<v Speaker 1>think since then it has taken off, and I think

0:01:05.200 --> 0:01:07.880
<v Speaker 1>these crash sites are going to become more and

0:01:07.880 --> 0:01:09.000
<v Speaker 1>more important to explore.

0:01:09.280 --> 0:01:12.360
<v Speaker 2>Yeah, you know, one of the reasons to relaunch the

0:01:12.440 --> 0:01:14.760
<v Speaker 2>show is because a lot of what you were covering

0:01:14.760 --> 0:01:17.320
<v Speaker 2>in the first iteration of the show, at that point

0:01:17.400 --> 0:01:20.800
<v Speaker 2>it still felt like the future. But now I think

0:01:20.800 --> 0:01:23.840
<v Speaker 2>in twenty twenty five, it's either here right now and

0:01:23.880 --> 0:01:27.200
<v Speaker 2>it's kind of boring, or it's about to be. I mean,

0:01:27.240 --> 0:01:28.400
<v Speaker 2>you all were discussing.

0:01:28.520 --> 0:01:32.320
<v Speaker 1>Us? Never boring, no kidding. No, no.

0:01:32.040 --> 0:01:34.080
<v Speaker 2>No, I mean, you know, in the

0:01:34.120 --> 0:01:37.480
<v Speaker 2>sense that, oh, AI, yeah, I know about that, you know.

0:01:37.640 --> 0:01:39.959
<v Speaker 2>I mean, I think when you all first started,

0:01:40.640 --> 0:01:44.320
<v Speaker 2>LLMs were something that basically just a few computer scientists

0:01:44.360 --> 0:01:46.320
<v Speaker 2>had access to. I mean, you had to go

0:01:46.360 --> 0:01:49.840
<v Speaker 2>and visit somebody and, you know, talk about Lyrebird.

0:01:49.840 --> 0:01:53.160
<v Speaker 2>I mean, Lyrebird now is almost completely unremarkable because

0:01:53.160 --> 0:01:56.880
<v Speaker 2>it's what we use to edit podcasts. It's just part

0:01:56.920 --> 0:02:01.080
<v Speaker 2>of software that we use every day. And I think

0:02:01.120 --> 0:02:03.360
<v Speaker 2>it's at this point so much of what you all

0:02:03.400 --> 0:02:06.800
<v Speaker 2>were talking about as this is coming or this might

0:02:06.880 --> 0:02:10.840
<v Speaker 2>come someday is completely normal, you know, and we almost

0:02:10.880 --> 0:02:15.280
<v Speaker 2>don't think about it as technology anymore. And so

0:02:15.680 --> 0:02:17.760
<v Speaker 2>in some ways, you know, the way that you talk

0:02:17.800 --> 0:02:19.920
<v Speaker 2>about the show, and the way that I think originally

0:02:19.960 --> 0:02:22.200
<v Speaker 2>you two had been introducing the show is, okay, this

0:02:22.280 --> 0:02:25.560
<v Speaker 2>is a show about the way that technology affects people's lives.

0:02:26.240 --> 0:02:29.600
<v Speaker 2>It's almost hard to introduce a show as that now

0:02:30.120 --> 0:02:34.040
<v Speaker 2>because the response, when I even try

0:02:34.040 --> 0:02:37.000
<v Speaker 2>to tell people sometimes, is like, yeah, okay,

0:02:37.000 --> 0:02:39.040
<v Speaker 2>what are you going to talk about? Because I think

0:02:39.120 --> 0:02:42.120
<v Speaker 2>technology has just become kind of the background hum of

0:02:42.160 --> 0:02:45.600
<v Speaker 2>our everyday life. But I would argue that that's precisely

0:02:45.720 --> 0:02:49.880
<v Speaker 2>why we need a show about how technology affects our lives,

0:02:49.919 --> 0:02:53.280
<v Speaker 2>because it's still there. It's not just part of the background.

0:02:53.400 --> 0:02:56.640
<v Speaker 2>It's something that we have choices about and that

0:02:56.680 --> 0:02:57.919
<v Speaker 2>we really should be thinking about.

0:02:58.360 --> 0:03:00.639
<v Speaker 1>I think that's really well put. What kinds of things

0:03:00.720 --> 0:03:02.280
<v Speaker 1>are you interested in exploring on the show?

0:03:02.960 --> 0:03:06.000
<v Speaker 2>Yeah, so I know that one of your first episodes

0:03:06.040 --> 0:03:09.960
<v Speaker 2>actually was about facial recognition. You go visit the NYPD,

0:03:10.360 --> 0:03:16.000
<v Speaker 2>and you know, NYPD obviously massive police department, very technologically adept,

0:03:16.520 --> 0:03:20.440
<v Speaker 2>and they promise you that, hey, listen, we're using facial

0:03:20.480 --> 0:03:25.600
<v Speaker 2>recognition very carefully. We're very serious about it. And back

0:03:25.639 --> 0:03:27.840
<v Speaker 2>then it seemed like, okay, well, only the most

0:03:27.840 --> 0:03:30.680
<v Speaker 2>advanced police departments have this, and well they say they're

0:03:30.720 --> 0:03:33.560
<v Speaker 2>being careful with it. Now this is something that rural

0:03:33.560 --> 0:03:37.520
<v Speaker 2>police departments all across America have access to, and it's

0:03:37.680 --> 0:03:40.760
<v Speaker 2>putting innocent people in jail, and so I think that's

0:03:40.800 --> 0:03:43.200
<v Speaker 2>really something we should be thinking about. And we talk

0:03:43.280 --> 0:03:46.480
<v Speaker 2>to a reporter who's covered it, and we hear actually

0:03:46.520 --> 0:03:49.160
<v Speaker 2>from some people that this has affected, who've been arrested

0:03:49.800 --> 0:03:52.240
<v Speaker 2>because the computer thought they did something and they weren't

0:03:52.280 --> 0:03:55.040
<v Speaker 2>even in the area. There's another episode we're working on

0:03:55.360 --> 0:03:58.720
<v Speaker 2>which shows you how people are making money doing things

0:03:58.800 --> 0:04:03.440
<v Speaker 2>like generating fake AI videos of natural disasters or baby animals,

0:04:03.880 --> 0:04:05.920
<v Speaker 2>and you know, another thing, and maybe this is just

0:04:05.920 --> 0:04:08.560
<v Speaker 2>a personal thing for me. I feel like the DIY

0:04:09.080 --> 0:04:11.640
<v Speaker 2>and the hacker ethos has kind of left tech and

0:04:12.120 --> 0:04:14.680
<v Speaker 2>hopefully we can at least bring that back a little

0:04:14.720 --> 0:04:16.560
<v Speaker 2>bit or provide a place where that can come back.

0:04:17.040 --> 0:04:19.599
<v Speaker 2>And I think there's some more practical stuff too, like

0:04:20.240 --> 0:04:22.360
<v Speaker 2>how can you mitigate the risk that you're gonna get

0:04:22.400 --> 0:04:25.719
<v Speaker 2>hacked. Again, I think these are things that just have

0:04:25.920 --> 0:04:28.920
<v Speaker 2>kind of passed us by in terms of we've gotten

0:04:28.920 --> 0:04:31.279
<v Speaker 2>really used to, Oh well, just let Apple handle this

0:04:31.360 --> 0:04:34.640
<v Speaker 2>for us, just let Google handle this. Versus, no, let's

0:04:34.680 --> 0:04:38.520
<v Speaker 2>do this ourselves. That's what makes this technology stuff fun,

0:04:38.520 --> 0:04:39.120
<v Speaker 2>at least for me.

0:04:39.600 --> 0:04:42.279
<v Speaker 1>I think you're exactly right. So I'm really happy

0:04:42.320 --> 0:04:43.440
<v Speaker 1>that you're taking the show on.

0:04:44.240 --> 0:04:47.920
<v Speaker 2>There's a few places where hopefully we're picking up the

0:04:47.920 --> 0:04:50.920
<v Speaker 2>baton a little bit on something that you predicted,

0:04:50.920 --> 0:04:53.880
<v Speaker 2>and we're able to keep that going and say, okay,

0:04:53.960 --> 0:04:55.920
<v Speaker 2>here's where we are right now. This is no longer

0:04:56.040 --> 0:04:58.560
<v Speaker 2>science fiction. This is your everyday life. What are we

0:04:58.600 --> 0:04:59.400
<v Speaker 2>going to do about this?

0:05:00.080 --> 0:05:03.080
<v Speaker 1>We called the original show Sleepwalkers. There was actually a

0:05:03.080 --> 0:05:05.680
<v Speaker 1>book about the origins of the First World War with

0:05:05.800 --> 0:05:09.080
<v Speaker 1>that title, and part of the thesis was that, you know,

0:05:09.120 --> 0:05:12.279
<v Speaker 1>you had this moment where the industrial revolution had happened,

0:05:12.320 --> 0:05:15.520
<v Speaker 1>and so you had like trains and telegraphs and gas

0:05:15.600 --> 0:05:19.040
<v Speaker 1>and automatic weapons, and all of a sudden there was

0:05:19.120 --> 0:05:21.719
<v Speaker 1>a conflict that got bigger and bigger and bigger, and

0:05:21.800 --> 0:05:24.279
<v Speaker 1>all of these technologies that kind of existed as one

0:05:24.320 --> 0:05:27.919
<v Speaker 1>offs coalesced on the battlefield and were used, you know,

0:05:28.000 --> 0:05:30.640
<v Speaker 1>to create carnage and kill millions and millions of people.

0:05:31.600 --> 0:05:33.120
<v Speaker 1>So the name was a little bit of a warning,

0:05:33.200 --> 0:05:35.360
<v Speaker 1>as if to say, like, these new technologies are here, even

0:05:35.360 --> 0:05:38.120
<v Speaker 1>if they're still hiding in plain sight. You, as part

0:05:38.160 --> 0:05:40.120
<v Speaker 1>of the rebrand and the relaunch, I think,

0:05:40.160 --> 0:05:41.039
<v Speaker 1>have chosen a new name.

0:05:41.640 --> 0:05:44.400
<v Speaker 2>Yeah, so we're calling it Kill Switch. And I'm going

0:05:44.440 --> 0:05:46.680
<v Speaker 2>to mix metaphors here a little bit, so bear with me.

0:05:47.160 --> 0:05:49.000
<v Speaker 2>But you know a kill switch. I think we all

0:05:49.000 --> 0:05:51.520
<v Speaker 2>know what that is. Basically, it's a safety mechanism for

0:05:52.000 --> 0:05:55.400
<v Speaker 2>if a machine starts going nuts, you can disable it.

0:05:55.760 --> 0:05:58.320
<v Speaker 2>And I think some of the inspiration behind the name

0:05:58.600 --> 0:06:02.000
<v Speaker 2>is that, theoretically, these are all machines, these are computers.

0:06:02.279 --> 0:06:05.800
<v Speaker 2>We can turn these off, right, but it's really hard

0:06:05.800 --> 0:06:09.159
<v Speaker 2>to imagine turning it off now. And to borrow from

0:06:09.200 --> 0:06:11.800
<v Speaker 2>what your iteration of the show was doing, is I

0:06:11.800 --> 0:06:15.200
<v Speaker 2>think we've actually sleepwalked up to a point where

0:06:15.200 --> 0:06:17.400
<v Speaker 2>we don't even know where the switch is anymore. If

0:06:17.440 --> 0:06:19.039
<v Speaker 2>we wanted to turn it off, we don't know where

0:06:19.080 --> 0:06:21.200
<v Speaker 2>it is. And I think the show that you were

0:06:21.240 --> 0:06:25.279
<v Speaker 2>hosting kind of predicted this: we're in this

0:06:25.360 --> 0:06:30.039
<v Speaker 2>really interesting paradox where the more user-friendly technology gets,

0:06:30.640 --> 0:06:34.159
<v Speaker 2>the less tech-literate we are. And I want to

0:06:34.160 --> 0:06:37.719
<v Speaker 2>continue to interrogate that and say, no, don't just offer

0:06:37.800 --> 0:06:41.200
<v Speaker 2>me these fun new products. Don't just cram AI into everything.

0:06:41.680 --> 0:06:43.520
<v Speaker 2>I want to know how this stuff works. And I

0:06:43.520 --> 0:06:46.560
<v Speaker 2>think we should continue to ask that question. And you know,

0:06:46.880 --> 0:06:48.599
<v Speaker 2>maybe we need to start looking for the kill switch.

0:06:48.400 --> 0:06:52.040
<v Speaker 1>Very well put, Dex. So I already can't wait

0:06:52.040 --> 0:06:54.200
<v Speaker 1>to hear the show, and please come on TechStuff

0:06:54.240 --> 0:06:55.960
<v Speaker 1>one of these days as well to tell us about

0:06:56.000 --> 0:06:58.200
<v Speaker 1>the show when it's in its stride. But in the meantime,

0:06:58.200 --> 0:06:58.920
<v Speaker 1>when's it launch?

0:06:59.600 --> 0:07:02.040
<v Speaker 2>First off, I would love to and please, the door

0:07:02.160 --> 0:07:04.400
<v Speaker 2>is always open. You made the door. Please come on

0:07:04.480 --> 0:07:08.320
<v Speaker 2>in anytime you want. But yeah, first episode drops April

0:07:08.360 --> 0:07:11.040
<v Speaker 2>twenty third, and anywhere you get your podcasts you can

0:07:11.080 --> 0:07:11.600
<v Speaker 2>listen to it.