Speaker 1: They had that, like, continuous Seinfeld generator for a while, which I... oh, really? Yeah, I forget what happened. I think it started... I think it just went and started doing racist things, though.
Speaker 2: Started talking like Jerry Seinfeld.
Speaker 1: Yeah, actually, yeah, exactly, it started talking like Seinfeld and Michael Richards, right, yeah.
Speaker 3: The idea of... he's doing great things.
Speaker 1: Oh lord, what is it with these student protesters anyway?
Speaker 3: What's the deal? They all got the same tent? Who's giving them the tents? That is his material. Hello the Internet, and welcome to season three forty-four, episode...
Speaker 2: Two of der Daily Zeitgeist.
Speaker 3: A production of iHeartRadio. And this is a podcast where we take a deep dive into America's shared consciousness, America's deep brain, if you will. A little tip of the cap, because we're big AI fans. A little, little spoiler: we're big AI fans now, folks. I have seen the light, come around. I think when it's all you see on social media, you're like...
Speaker 2: That this is all right?
Speaker 3: Oh yeah, maybe this is cool. Yeah. It's Tuesday, June twenty-fifth, twenty twenty-four.
Speaker 2: Oh yeah, big day. It's National Catfish Day, which is odd because it's also my partner's birthday. Happy birthday! Are you... have you been catfishing me this whole time? It's natural, though. Haven't met her in person yet? Still, right, but one of these days. No. Well, and every time I want a video chat, she says her phone's broken, so we just kind of stick to the phone call stuff.
Speaker 3: But that's also very normal. At this stage in a marriage, very normal.
Speaker 2: Also, this is so weird, and this is just, like, some weird religious stuff. It's National Leon Day. Did you know this is National Leon Day?
Speaker 3: Do you know what that even is?
Speaker 2: It's because you're six months... you're six months away from Christmas.
Speaker 3: Noel backwards. It's the most... it's the most opposite time of the year. What does this even mean? Jesus gets a half birthday, and instead of ding dong, they dong ding, quote Doctor Seuss. All right, anyways, happy National Leon Day. We're gonna need to workshop that. Maybe we can run it through an AI comedy writer, because those are good and in no way just humans posing as AI comedy writers. Anyways, my name is Jack O'Brien, aka Big Ass Plumpers to Thin Ass Fools, 'Geist keep it coming like grim as Spooch, we kill it on the podcast, breaking all the news, say, I'll be goddamned if there ain't more raccoons. That is courtesy of Halcyon Salad. Like that name, Halcyon Salad. I see what you've done there, and I enjoy it. I'm thrilled to be joined as always by my co-host, mister Miles Gray.
Speaker 2: It's Miles Baker.
Speaker 3: I got them bell bottom boots when the weather suits.
Speaker 2: Yeah, I got that swag because I'm a dad. Shout out razagg on the Discord, because, yeah, like I said, I have zip-off cargo pants that go from pants to shorts, because I'm a dad and that's, like, just mandatory swag.
Speaker 3: You unzip the bottom of the shorts, yep, which is what they're designed to do when it gets warm, at the knee, and then you flip them. You invert the legs and put the cuff at your knee. So you've got little bell bottoms, like two-part bell bottoms.
Speaker 2: Yeah, the three-part bell bottoms. I guess I saw it on Pinterest. It could have been AI, but it looked like people, where we get our fashion from. Yeah, yeah, it might have been completely off. But hey, Miles... most people say, hey, good luck on you. Yes, people say, hey, Miles. We're thrilled to be joined, yes, by the hosts of the Mystery AI Hype Theater 3000 podcast. Once again, it's Doctor Alex Hanna and Professor Emily M.
Speaker 3: Bender. What's up, guys, welcome to the show.
Speaker 1: Hello. Hey, so I'm so glad to hear you there. What was the thing about the raccoon?
Speaker 3: A raccoon ripped a crow in half in my backyard.
Speaker 1: Gosh, like, it's just WWE style, just put it over its, its knee? That's strange.
Speaker 3: I didn't see it happen. I just found the crow. I've grown friendly with a murder of crows in my backyard, as one does, and found the crow's body right next to, like, a garbage bag, a garbage can that had been, like, flipped over, like, by what could have only been a raccoon. And we do have a raccoon family living nearby, so...
Speaker 4: If you have, if you have an interest in raccoons, have you ever seen the video where someone's feeding a raccoon cotton candy?
Speaker 2: No?
Speaker 3: Oh, and don't they, like, take it to the water? I've seen, like...
Speaker 2: Wash it, it dissolves, and, like... no. Yeah, they always wash their food.
Speaker 3: Yeah, very respectful. I just have a much healthier respect for raccoons. We should be treating raccoons like handguns. They're very impressive and dangerous, and we should just give them the proper respect. They have thumbs, exactly, and they hunt birds.
Speaker 2: That's all I knew. I didn't realize the food washing thing. My mom famously has opened her home to possums and other neighborhood wildlife where she lives, and I remember in the kitchen, where the cat food is, there was, like, a bunch of kibble in the water bowl, and I was like, what is going on? And Mom was like, I think that's what the raccoon does. And I was like, oh, and it was said so casually that I was like, in your home? She's like, yeah, yeah, yeah. But then it leaves, and I was like, this is very okay.
Speaker 1: Well, does it just, does it use, like, the cat, like, door? Does it? Yeah?
Speaker 2: Yeah, it comes through, washes the kibble, has a few bites, and then takes off into the night. It's like a pit stop for...
Speaker 1: One of the critters. Yeah, I'm leading, so I'm leading a raccoon-based Dungeons and Dragons campaign starting next week. That's going to be, like, yes, it's going to be, like, a heist that takes place in the warehouse where, where I play roller derby. Like, I'm very excited. I've got, you got it really architected. I don't want to give any secrets in case any of my, my player characters listen to this podcast.
Speaker 3: But, right, okay, that sounds amazing. And what a coincidence, raccoons having a bit of a moment really on this podcast.
Speaker 1: Yeah.
Speaker 3: So, in addition to being hosts of the wonderful Mystery AI Hype Theater 3000 podcast, which, podcast host is the highest honor one can attain in American life, but you both have some pretty impressive credits. Emily, you are a linguist and professor at the University of Washington, where you are director of the Computational Linguistics Laboratory.
Speaker 5: Yep, that's right, okay.
Speaker 3: Alex, you are director of research at the Distributed AI Research Institute. Both widely published, both received a number of academic awards, both have PhDs. We had you on the podcast a few months back, told everyone the truth about AI, that a lot of the stuff that we're scared of, and a lot of the stuff we think it can do, is not true. It's bullshit. And I sat back and was like, well, we'll see what AI does after this one. And it's just kept happening, you guys. What the, what the heck can we do? If anything, it's gotten worse since we told everybody the truth. What's happening?
Speaker 4: Truly, everybody seems to want to believe. It's absurd, so wild. Yeah. And part of what we do with the podcast, actually, is, like, try to be a focal point for a community of the people who are like, no, that's not right, why does everybody around me seem to believe that it's, like, you know, actually doing all these things? Yeah, so it's, yeah, it's, it's, you know, that's what we say on our podcast: like, every time we think we've reached peak AI hype, the summit of bullshit mountain, we discover there's...
Speaker 2: Worse to come. Like, yeah, right. Like, oh yeah, this is just a base camp until you get to the real peak.
Speaker 1: Like, right. Well, it's just that, it just keeps on coming, and there's more and more things that these CEOs just, you know, really just say, incredible nonsense. I don't know if you saw this. I think it was last week, the chief technology officer of OpenAI, Mira Murati, yeah, who famously was, I think it was an interview on Sixty Minutes, and when they were talking about one of their tools, Sora, you know, they had asked if...
Speaker 3: Their favorite filmmaker. Yeah, exactly, my favorite. Yeah, thank you, yeah, exactly.
Speaker 1: Sora, really edging out, you know, David Lynch these days. And so, you know, and they asked her, do you train this stuff on YouTube? And she, she kind of grimaced, yeah, painfully. And I remember a great Twitter comment that was like, well, if you're gonna just lie about stuff, you at least have to have a good poker face about it. Yes. And so last week she was, I think she was doing another interview, and she was like, well, some, some, some creative jobs are going to go away, like, some artists should be, you know, some...
Speaker 2: Creative jobs maybe shouldn't have existed in the first place. Right, like these jobs were an affront to God or something, some of them just shouldn't have even been there.
Speaker 3: But she does have a French accent, so it's really hard to be like, this is ridiculous.
Speaker 1: I think she's Italian, and...
Speaker 3: That's what's amazing about her having a French accent. I don't... I'm not, I'm not a cultured person. I don't know the difference between... they're all French to me, man, right? Hey, she has a Canadian accent, I think. I'm not sure.
Speaker 1: I know, it's only, it's only plagiarism if it comes from the French region of...
Speaker 3: Italy. That's right, that's right. Yeah, we're going to get into that story and just, yeah, all of the madness that has continued to happen. The bullshit has continued to rain even harder, it seems like, yeah, which, yes, does make the mountain go higher, unfortunately, the bullshit mountain. But before we get to that, Emily, Alex, we do like to ask our guests: what is something from your search histories that's revealing about who you are? Alex, you want to kick us off?
Speaker 1: Oh gosh, okay. I don't... the thing is, I don't think so... I use DuckDuckGo, and so it doesn't actually keep a search history, and if I actually look at my Google history, it's actually going to be really shameful. It's going to be me, like, searching my own name to see, like, if people are, like, shit talking about me online.
Speaker 3: Now, that's just how we tell if someone's honest, is if they actually give that answer. We're like, okay, so...
Speaker 1: Yeah, you actually search yourself. Yeah, actually, but I think the last thing I actually searched was, like, queer barbers in the Bay Area, because I haven't had a haircut in, like, a year, and I think I need to trim up or get, you know, air out the, the sides of my head for Pride Month. So that's, yeah, that's the last thing I searched.
Speaker 3: What are you going, you're going full shaved on the sides?
Speaker 1: Or maybe trim it a little bit and up the back and bring out the curls a little bit.
Speaker 3: Okay, love it. Yeah, on board.
Speaker 2: I wish I could've got a few more days in, look.
Speaker 1: In July, like, discounts? Like, it's like, like after Valentine's Day? Do I get an undercut at fifty percent off?
Speaker 3: Now?
Speaker 2: Right? Exactly.
Speaker 3: Emily, how about you? What's something from your search history?
Speaker 4: So forgive the poor pronunciation of this and the rest of the story, because Spanish is not one of my languages. But champurrado. Oh yeah, it's something I searched.
Speaker 3: Yeah.
Speaker 4: So I was in Mexico City for a conference last week, and at one of the coffee breaks, they had coffee and decaf coffee, and then...
Speaker 1: They had...
Speaker 3: And the Spanish doesn't give us much about that.
Speaker 2: What do you see when you see that?
Speaker 3: Wait... champurrado: Mexican hot chocolate.
Speaker 2: So yeah, your literal Google results.
Speaker 4: So the, the labels all had, like, translations into English, and so it was champurrado with Oaxacan chocolate, and I'm like, yeah, I got that.
Speaker 5: What's champurrado?
Speaker 4: And so I look it up, because I want to know what I'm consuming before I consume it, and it's basically a corn-flour-based thick drink. Chocolate corn soup.
Speaker 5: It was amazing.
Speaker 3: Chocolate corn... you had me until chocolate corn soup. But the corn is just a thickener, yeah, chocolate drink.
Speaker 4: Yeah, slight corn flavor, like, think corn tortilla, not on the cob.
Speaker 3: Yeah, yeah, oh yeah, yeah, yeah, that sounds amazing.
Speaker 5: Yeah, it was really good.
Speaker 3: I love some corn flakes in a chocolate bar. Yeah, corn chocolate. There you go. Yeah, you got it.
Speaker 2: You gotta arrive at it your own way.
Speaker 3: Corn chocolate.
Speaker 4: It was really good, and just awesome that it was there. Like, you know, the, the coffee breaks had, like, the Mexican sweet breads and stuff like that, but otherwise it was pretty standard, like, coffee break stuff, and all of a sudden there's this wonderful mystery drink.
Speaker 5: In the big urns. It was lovely.
Speaker 3: That sounds great. What is something you think is underrated, Emily?
Speaker 4: I think Seattle's weather is underrated. Oh yeah, everyone makes fun of our weather, and, like, you know, fine, believe that, we don't need lots of people coming here. And it's true, it gets dark in the winter. But, like, almost any day, you can be outside and you are not in physical danger because you are outside.
Speaker 1: I guess that's, that's... I mean, if you're going for, yeah, that's interesting, but I mean, it's, I mean, the winters are just so punishing, though. It's so gray.
Speaker 2: It's dark, but the weather... it's dark, it looks, it looks like shit, but experientially not bad for you. I mean, I, yeah, I know. It doesn't get all gloomy, I imagine, in the summer, right? You have wonderful blue skies, and you can...
Speaker 4: Enjoy the gorgeous... fire season aside. But yeah, from sort of mid-October to early January, it can be pretty, like, it's gray, and so, like, when the sun is technically above the horizon, it's...
Speaker 5: A little hard to tell.
Speaker 4: Yeah, right, right. So, but, you know, compared to, like, Chicago, where you have maybe four livable weeks a year between the too hot and the too cold.
Speaker 1: Wow, wow. I'll take that, because my thing was going to be Chicago, because I was just there, and I was going to say my answer was going to be, Chicago is the best American city. I stand on this one.
Speaker 3: No, absolutely not true.
Speaker 1: No, I'll even deal... I'll even deal with, I'll deal with the winter. I mean, if I, okay, I'll be honest. If I didn't, you know, if the weather in Chicago, if, if I could bring Bay Area weather to Chicago, I would live in Chicago. I mean, there's other reasons, but I mean, it's, it's, look: the vibes immaculate, street festivals, the neighborhoods, probably the food. It's the one place that's still comparatively affordable compared...
Speaker 3: To the coasts.
Speaker 1: Radical history, you know, just, you know, some of the best politics. Yeah, you know, I would say... The Fugitive. They shot, shot The Fugitive there. Oh, I did. That's a deep cut. Yeah, I mean, they, I think they've shot a lot of Batman movies there, because, you know, the iconic kind of Lower Wacker Drive, and they call it Gotham, and it's, yeah, yeah, that's a pretty great city.
Speaker 4: Crappy weather, right? If you're going to dump on weather somewhere... everyone makes fun of Seattle's weather.
Speaker 1: Honestly, Emily, this is a hot take: I'd rather take Chicago's weather than Seattle's weather.
Speaker 3: I can't, I can't do gray. I can do... I feel like I'm in a crossfire.
Speaker 1: I can bridge it.
Speaker 3: I cannot do gray.
Speaker 1: It's super depressing.
Speaker 4: Well, this is why I say, like, don't move to Seattle if you can't handle our weather. Like, the people who move here and then complain about...
Speaker 2: The weather. What do you expect? All of this, what they say, is true about it being gray. Like, oh, I didn't expect it to be that gray?
Speaker 3: Right, I think people talk about it like that. All right, Alex, let's stay with you. What is, you guys', overrated? And please do it in a point-counterpoint style, also, that contradicts one another.
Speaker 1: Well, I got to think about what's, what's overrated these days. I just don't know what's in the... I know the name of the show is The Daily Zeitgeist, but I don't really know what's in the zeitgeist. I mean, I guess Taylor Swift. I mean, I don't really have... maybe that's controversial, I'm saying something that's a hot take, but I guess that's maybe not controversial to people of, of our, you know, our generation.
Speaker 2: No, so joining Dave Grohl on the, on the attack this weekend. Yeah, wait, what happened with Dave Grohl? Like, implying that she's, like... He's like, well, we play our music live, like raw, live rock and roll, you know, unlike the Eras Tour, you know, we're the Errors Tour. And then everyone's like, fuck you, Dave, or other people were being like, exactly, exactly. Yeah.
Speaker 1: It's just like, yeah, yeah, I mean, Dave Grohl is also overrated, I guess. But, like, I mean, I enjoyed... look, I enjoy "Everlong" like the next, like, middle-aged sort of, like, dad figure, like, but, I, you know, I'm sure, like, I'm glad that he played every part in that song, it sounds good, but, you know, yeah, it doesn't make you an authority on Taylor Swift. So, yeah, so I think I'm undercutting my own point.
Speaker 3: But now, let's go... yeah, you did undercut your own overrated.
Speaker 4: Yeah, that's excellent, because I don't even have an opinion about...
Speaker 3: I saw Tucker Carlson do that... was that, that was called Crossfire?
Speaker 1: Or was that... the Crossfire was with what's-his-face?
Speaker 3: Carlson and Paul... That was, wasn't that the one that, the one that Jon Stewart came on and was like, yeah, it was like, this show is bad, and then, like, they canceled it a couple of weeks later.
Speaker 1: Well, then, but then there was, at once, the show Hannity & Colmes, where Sean, you know, Sean Hannity was supposed to be a conservative voice, and then, you know, Colmes, where, like, I don't, I don't even know the guy's first name, they just, they kind of just had him as a token, like, liberal on, and then they just... it was on Fox News, so he just attacked him, you know, relentlessly.
Speaker 3: He wasn't allowed to read the news. He's like, you argue the liberal points, but you're actually not permitted to leave this room. We're gonna keep you in here, Oldboy style.
Speaker 2: Oh, that was the end of Sixty Minutes that Andy Rooney would do. There was, part of Sixty Minutes was Point/Counterpoint, and it would be Andy Rooney, if that's what you're thinking, Jack.
Speaker 3: No, no, there was a show. Yeah, yeah, it was right when I got out of college and worked for ABC News, and so everybody was always watching news, and at that time there was a big show on CNN called Crossfire. It was Tucker, yeah, Tucker Carlson was the conservative, Paul Begala was the liberal, and they just, like, got on and yelled at each other.
Speaker 1: I'm looking it up now.
Speaker 3: Apparently good.
Speaker 1: Apparently there, there was a, they, they had a revival in twenty thirteen and fourteen. On the left was Stephanie Cutter and Van Jones, and then Newt Gingrich and S.E. Cupp on the right, and then, whatever. And then whenever they needed breaking news, they'd bring in Wolf Blitzer for some reason.
Speaker 5: Because Wolf, the Situation Room.
Speaker 1: Yeah, yeah, they released him from the cryogenics.
Speaker 3: Who they helicopter-lifted from the Situation Room three rooms over to the Crossfire set. Just with a deadpan, you know, we need you, go. No emotion on his face, ever. You guys ever seen the Wolf Blitzer episode of Celebrity Jeopardy? No? Do yourself a favor. It hasn't been scrubbed yet.
Speaker 1: Is it as good as, like, the SNL parody? Is the Celebrity Jeopardy with, like, the Sean Connery... it's just, yeah, no.
Speaker 3: No. And also incorrect, just one after another, like, negative, went into the, into the red. He's in there pretty quickly, so, well, Wolf, we're gonna spot you three thousand, because we can't have somebody be in negative numbers going into, going into Final Jeopardy. And I think Andy Richter was on with him and just destroyed. Was so good.
Speaker 1: That's so funny. Andy Richter, like, this is the kind of crossover I didn't know I needed.
Speaker 2: Yeah, it's still up there, mostly, from what I could tell. It's incredible.
Speaker 3: Yeah, I am an old person. All right, we still have Emily's overrated. What do you think is overrated?
Speaker 5: Big cars are overrated.
Speaker 6: Oh?
Speaker 4: Totally. Sort of half-heartedly looking for our next car and can't find anything that is, like, small. And the other day I was in the parking lot for a grocery store near here, mostly I can walk for groceries, but occasionally have to drive to this other store, and half the spots were labeled compact, and, like, all of those spots were taken up two at a time by what we now have as regular cars, because somebody's decided that people in this country don't deserve normal-sized cars.
Speaker 3: Yeah, they're so...
Speaker 2: I mean, it's at the point where, like, even the people who design parking lots are like, we have to tell the automobile manufacturers, like, the standard we've set as people who, like, create parking lots, like, they're pushing the boundaries of what we can actually do, or how we measure things, because the cars are so fucking big.
Speaker 4: And our streets around here in Seattle, we have a lot of neighborhood streets where there's, like, parking on both sides and then sort of just barely enough space for two normal cars to go through, right? Or sometimes you have to, like, pull over to let the car by, and the bigger the car is, like, the harder that gets.
Speaker 3: I love that thing.
Speaker 2: I remember one of the times I went to Seattle, seeing how everybody just parks on whatever side of the street in whatever direction they want.
Speaker 3: I was like, all right. I'm like, all right, I was not familiar.
Speaker 1: That's funny.
Speaker 5: Yeah, yeah, a little bit of chaos.
Speaker 4: It totally offends my spouse, who's like, that's not how parking works.
Speaker 1: That's it.
Speaker 3: Yeah, yeah, I love it. Automakers just seem to be going bigger and heavier. They won't stop until they make a car that, like, is legally required to have a foghorn on it.
Speaker 4: Right. So, the Cybertruck. Yeah, Cybertruck.
Speaker 3: I was going to ask, have you, have you considered the Cybertruck?
Speaker 5: I've seen one in person.
Speaker 4: They are hilarious, like, you can't not laugh, exactly.
Speaker 3: It's such... it is an experience, seeing one in the wild.
Speaker 4: It's like, wow. I just want to say that what we really need is functional public transit, but, like, short of that, we also need to not be doing bigger and bigger cars.
Speaker 1: Yeah, yeah, yeah, no, I just, I mean, I have a truck. I have a twenty-twenty truck, and I wish, I really wish it was much smaller, because it's hard to park. It's way too big. I mean, I think that the peak of truck design was, like, you know, a nineteen eighty-seven Toyota Tacoma long cab, you know, just, you know, where, like, yeah, you had to bunch up your knees in the back if you wanted to fit four people in it, but you actually had a long, you know, you actually had a truck bed that actually had, you know, some carrying capacity, you know, and...
Speaker 2: It was a car you could absolutely run into the...
Speaker 3: Ground with nobody.
Speaker 1: Yeah, oh yeah, oh yes.
Speaker 2: Now people are like, my new Ford Lightning needs a software update.
Speaker 1: Oh god, well, that's the thing, is, like, yeah, I mean, like, I mean, that's a big deal. I know in, like, Oregon, which is, like, where they had a right to repair, a right-to-repair bill, and, I mean, in some ways the people that were kind of into it, weirdly, were, like... Google actually came out kind of into it. There was a good 404 Media podcast where they talked about this, and Apple, because they have such a closed ecosystem, was so against right to repair, or, you know, even if you have right to repair, they actually, they'd add on all these things where you, you'd still have to send them to an authorized dealer because of firmware issues or whatever. And then John Deere, like, John Deere is this kind of thing where they have, you know, so much of their tractors are computerized, and so there's, like, a lot of, like, these John Deere hacking kinds of things, where people who are outside of the US are, you know, programming these kinds of hacks for, for people running these tractors who run up against their firmware.
Speaker 3: Yeah, the farmers have all the good, the GPS stuff. But did...
Speaker 4: You hear about how the GPS was out for a while? This is actually, again, a 404 Media podcast, but the way those tractors work for, like, planting is so precise with the GPS. With the GPS off, they basically couldn't plant, because then the seeds wouldn't be in the right spot for the next process, and so they had to wait. And there's a really narrow window, currently, with our, you know, currently genetically modified, like, very, very specific corn that Monsanto owns, and so it was actually looking pretty bad for a while. I didn't hear any follow-ups, so maybe the solar flare was short enough and the GPS came back online, but apparently...
Speaker 5: That was a big thing.
Speaker 3: It has to be a brief window, because it goes from corn, corn seed planted in the ground, to, like, popcorn in the movie theater in two and a half weeks.
Speaker 4: Yeah... most of it doesn't even go to popcorn in the movie theater, right? Most of it goes to animal feed or...
Speaker 3: Yeah, I think, yeah, right, right, yeah. All right, well, let's take a quick break, and we're going to come back and dive into why Miles and I are excited about the future of AI. We'll be right back. Crossfire!
Speaker 3: And we're back. We're back. So, just to, for people who haven't listened to your previous appearance in a while, I feel like a broad overgeneralization, but it feels like the stuff that AI is actually being used for and capable of is not what we're being told about through the mainstream media. Like, it is not an autonomous intelligence that is going to be the bad guy in a Mission Impossible movie. I mean, it is a bad guy in a Mission Impossible movie, but it's not going to be a bad guy in, in reality, the way it is, yeah, the way that our actual president believes. That was an amazing reveal, that Joe Biden basically watched Mission Impossible and was like, we gotta, gotta worry about this AI stuff, Jack, it's gonna know my next move. But it is more like the large language models are basically more sophisticated autocomplete that is telling you what its data set indicates you want to hear, or what its data set indicates will make you think it is thinking, talking like a person. In many cases that means what they call hallucinating, which is actually just making shit up.
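(A minimal, purely illustrative sketch of that "sophisticated autocomplete" framing, not anything built or discussed on the show: a toy bigram predictor in Python that chooses the next word only from co-occurrence counts in its training text. The corpus and function name below are invented for the example; the point is that nothing in the loop models truth, only "what word tends to come next.")

```python
# Toy illustration of "autocomplete on steroids": a bigram model that
# predicts the next word purely from co-occurrence counts in its training
# text. No understanding, no facts, just "what usually comes next."
# (Corpus and names below are made up for illustration.)
from collections import Counter, defaultdict

corpus = "the raccoon washed the food and the raccoon washed the kibble".split()

# Count which word follows which word in the training text.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def autocomplete(word: str) -> str:
    """Return the most frequent next word seen in training, if any."""
    counts = next_word_counts.get(word)
    if not counts:
        return "<unknown>"  # no data: any fluent-sounding guess would be made up
    return counts.most_common(1)[0][0]

print(autocomplete("raccoon"))  # -> "washed": plausible-sounding, not "known"
```

Large language models do the same kind of next-token prediction at vastly larger scale, which is the sense in which "hallucination" is just the system doing what it always does, with no check against reality built in.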
Speaker 2: In what other jobs could you say that? You're like, sorry, I was hallucinating, and they're like, okay.
Speaker 1: Oh, but, well, you wouldn't last long as a precog.
Speaker 3: Yeah, I would be the worst precog on...
Speaker 2: The IRS: I was hallucinating on that last tax return.
Speaker 3: I think that's probably...
Speaker 4: About using this to do your tax return.
Speaker 1: Yeah, right. There's actually, yeah, there's actually a, I mean, in California, there's a, whatever, the Department of Tax and Revenue. They've been... there was some great reporting in CalMatters by Khari Johnson, and he was talking about how they were using this thing, some language model, to effectively advise the people who call in, or advise the agents who respond to people who call into the California Franchise Tax Board, and, you know, and they're like, well, they're, they're, and they're like, well, you know, the, the agents are still going to, you know, have the last word. But I'm just like, yeah, yeah, but they might, they're overworked. Like, are they going to just, are they just going to read this stuff to them, you know?
Speaker 3: Right, exactly. Oh, you're going to use this as an extra thing, just an extra expense so you can do the product, do your job even better? That doesn't sound like a company, necessarily.
Speaker 3: Yeah. Yeah. So, an interesting thing that we're seeing happen. You know, we pay attention when there's a, an AI story that captures the zeitgeist. We covered the B-minus version of a George Carlin routine that came out, they were like, AI just brought George Carlin back from the dead. We covered Amazon Fresh having that store where the cameras know what you've taken, and so even if you try and shoplift, like, the cameras know, they're gonna catch it, and then you don't even have to check out, you just walk out and it, like, charges your account, because of AI. And then what we're seeing is that when the truth emerges, it does not enter the zeitgeist, because you guys cover it on your show, which is why we're so thrilled to have you back. But, you know, we have updates on those two stories. Carlin: that was just written by a person. The Amazon Fresh: those videos were being fed to people working in India to try to track where everything was going, which was why there was, like, a weird pause, like, as people were, where they're like, oh, I think we got... okay, yeah, we're, we're just gonna do a best guess. But it's straight up, like, Mechanical Turk, like, it's... which, again, Amazon named one of their companies the Mechanical Turk, so they, they know what's going on. They knew what they were planning to do here all along, maybe. But is that kind of the model you're seeing: big, flashy announcement, this is what AI integration can do, and then when it falls short, people just kind of ignore it? Or how does it seem from where you're sitting?
Speaker 4: Yeah, we haven't seen really good implosions yet, and it's surprising, because, like, the stuff that goes wrong goes, like, really, really wrong, and people are like, yeah, well, it's just in its infancy, which is a really, really annoying metaphor, because it first of all suggests that this is something that is, you know, like a human, like an animal at least, that's, that's a baby and can grow, something that is learning over time. And also sort of, like, pulls on this idea that we should be kind to these systems because they're just little babies, right? And so if something goes wrong, it's like, well, no, that's, that's just, it's, it's still learning. And we get all of these appeals to the future, like how good it is going to be in the future. And there is, at this point, I think, so much money sunk into this that people aren't ready to, like, let go and own up to the fact that, yeah... So, and it is, I guess, too easy to hire exploited workers for poor pay, usually overseas, to, like, backstop the stuff. There's also, so you gave us the Amazon Go stores actually being monitored by people in India. There was, one of the self-driving car companies admitted that their cars were being supervised by workers in Mexico.
Speaker 2: Do you remember the stats on that, Alex? Yeah, it was, it was...
And then he was like, wait, 622 00:32:39,040 --> 00:32:41,520 Speaker 1: wait, wait, wait, you're really blowing this out 623 00:32:41,520 --> 00:32:45,360 Speaker 1: of proportion. We only use it something like three to 624 00:32:45,520 --> 00:32:48,800 Speaker 1: five percent of the time. Like, that's a huge, it's 625 00:32:48,840 --> 00:32:52,400 Speaker 1: a huge amount. And he posted this 626 00:32:52,720 --> 00:32:55,920 Speaker 1: himself on Hacker News, which is this, you know, 627 00:32:56,120 --> 00:32:59,000 Speaker 1: kind of like, I don't know, 4chan for tech bros, 628 00:32:59,000 --> 00:33:01,080 Speaker 1: I guess. Well, I guess 4chan is 629 00:33:01,160 --> 00:33:03,520 Speaker 1: 4chan for tech bros. But I mean, it's, you know, 630 00:33:03,800 --> 00:33:08,200 Speaker 1: like, with a little less overt racism, I guess. Yeah, 631 00:33:09,240 --> 00:33:12,440 Speaker 1: especially, yeah. But we're seeing 632 00:33:12,480 --> 00:33:14,200 Speaker 1: this in a lot of different industries. At the end 633 00:33:14,240 --> 00:33:17,479 Speaker 1: of the day, this is just outsourcing to humans. Janet 634 00:33:17,520 --> 00:33:22,400 Speaker 1: Vertesi is a sociologist at Princeton. She has a 635 00:33:22,560 --> 00:33:25,880 Speaker 1: piece in Tech Policy Press where the title is something 636 00:33:26,000 --> 00:33:29,760 Speaker 1: like, AI is just outsourcing two 637 00:33:29,760 --> 00:33:32,720 Speaker 1: point zero, effectively. And yeah, we're seeing a 638 00:33:32,760 --> 00:33:35,720 Speaker 1: lot of the same patterns that we saw in the 639 00:33:35,840 --> 00:33:40,640 Speaker 1: early nineties when these business process outsourcing, or BPO, organizations 640 00:33:40,720 --> 00:33:42,720 Speaker 1: were really becoming all the rage in the US. 641 00:33:43,680 --> 00:33:45,640 Speaker 2: Right. The other thing that I see a lot too 642 00:33:45,800 --> 00:33:48,080 Speaker 2: is, like, I felt early on, especially when we were 643 00:33:48,120 --> 00:33:50,440 Speaker 2: talking about it, the thing that intrigued us 644 00:33:50,600 --> 00:33:52,880 Speaker 2: was when everyone's like, dude, this thing is gonna fucking 645 00:33:53,080 --> 00:33:56,920 Speaker 2: end the world, that's how powerful AI is. I 646 00:33:57,000 --> 00:34:00,360 Speaker 2: have a whole plan to take myself off this mortal coil 647 00:34:00,440 --> 00:34:02,800 Speaker 2: if I have to, the moment in which AI becomes 648 00:34:02,840 --> 00:34:05,920 Speaker 2: sentient and takes over. And, like, I think it felt 649 00:34:05,960 --> 00:34:07,920 Speaker 2: like maybe the markets were like, hey, man, you're scaring 650 00:34:07,960 --> 00:34:09,680 Speaker 2: the kids, man, do we have another way to talk 651 00:34:09,719 --> 00:34:12,279 Speaker 2: about this? And I feel like recently I see more 652 00:34:12,360 --> 00:34:17,400 Speaker 2: of, like, together, when we harness human intelligence with AI, 653 00:34:17,920 --> 00:34:21,320 Speaker 2: we can achieve a new level of existence and ideation 654 00:34:21,600 --> 00:34:23,719 Speaker 2: that has not been seen ever in the course of 655 00:34:23,840 --> 00:34:26,719 Speaker 2: human history. And I saw that in the Netflix 656 00:34:26,880 --> 00:34:30,000 Speaker 2: J.Lo movie, where, like, the entire crux of the film 657 00:34:30,239 --> 00:34:35,080 Speaker 2: was this AI
skeptic having to embrace the AI 658 00:34:35,360 --> 00:34:38,719 Speaker 2: in order to overcome the main conflict in the film. 659 00:34:39,120 --> 00:34:42,320 Speaker 2: Or just even now, like with the CTO of Open 660 00:34:42,400 --> 00:34:45,480 Speaker 2: AI also doing a similar thing when talking about how, 661 00:34:45,960 --> 00:34:48,239 Speaker 2: like, some creative jobs are just gonna vanish, but that's 662 00:34:48,320 --> 00:34:52,200 Speaker 2: because when the human mind harnesses the power of the AI, 663 00:34:52,280 --> 00:34:54,360 Speaker 2: we're gonna come up with such new things. That feels 664 00:34:54,400 --> 00:34:56,520 Speaker 2: like the new thing, which is more like, we've got 665 00:34:56,640 --> 00:34:59,640 Speaker 2: to embrace it so we can evolve into this next 666 00:34:59,800 --> 00:35:04,120 Speaker 2: level of thinking, et cetera, computation, or whatever. You guys? 667 00:35:04,840 --> 00:35:07,200 Speaker 3: I was just gonna say, Mystery AI Hype Theater three thousand 668 00:35:07,280 --> 00:35:10,360 Speaker 3: reads the research papers so that we don't have to, 669 00:35:10,560 --> 00:35:13,160 Speaker 3: and Miles watches the J.Lo movies so that 670 00:35:13,440 --> 00:35:14,040 Speaker 5: you don't have to. 671 00:35:14,400 --> 00:35:15,319 Speaker 3: Got to know what they're saying. 672 00:35:15,600 --> 00:35:17,960 Speaker 1: But I'm glad you're watching them, 673 00:35:18,040 --> 00:35:21,040 Speaker 1: because there's so many different cultural touchstones of this. 674 00:35:21,440 --> 00:35:21,640 Speaker 3: Yeah. 675 00:35:21,680 --> 00:35:23,320 Speaker 1: I had to look, because I thought the 676 00:35:23,400 --> 00:35:26,359 Speaker 1: movie you were talking about was, like, sort 677 00:35:26,400 --> 00:35:29,760 Speaker 1: of the autobiography, This Is Me... Now: A Love Story, 678 00:35:29,960 --> 00:35:32,480 Speaker 1: and I'm like, there's a film? And I was like, 679 00:35:32,600 --> 00:35:37,160 Speaker 1: there's an AI subplot in that? I didn't know that 680 00:35:37,640 --> 00:35:42,240 Speaker 1: J.Lo's life was, you know, a complete cautionary tale 681 00:35:42,320 --> 00:35:46,400 Speaker 1: about, you know, AI and the inevitability of it. 682 00:35:46,600 --> 00:35:49,920 Speaker 1: But yeah, sorry, Emily was about to say something. 683 00:35:50,200 --> 00:35:54,839 Speaker 4: I just want to start by saying, so our colleagues Timnit 684 00:35:54,880 --> 00:35:58,879 Speaker 4: Gebru and Émile Torres coined this acronym TESCREAL, which 685 00:35:58,920 --> 00:36:01,160 Speaker 4: stands for a bundle of ideologies that are all very 686 00:36:01,200 --> 00:36:03,680 Speaker 4: closely related to each other. And what's interesting about the transition 687 00:36:03,760 --> 00:36:07,320 Speaker 4: you notice is that they've basically moved from one part of 688 00:36:07,520 --> 00:36:10,960 Speaker 4: the TESCREAL acronym to another. So it's all the 689 00:36:11,000 --> 00:36:14,640 Speaker 4: stuff that's based on these ideas of eugenics and of, 690 00:36:14,880 --> 00:36:19,360 Speaker 4: sort of, real disinterest in any actual current humans in 691 00:36:19,480 --> 00:36:23,920 Speaker 4: the service of these imagined people living as uploaded simulations 692 00:36:23,960 --> 00:36:28,160 Speaker 4: in the far future. It's utilitarianism made even more 693 00:36:28,239 --> 00:36:31,120 Speaker 4: ridiculous by being taken to an extreme endpoint.
So 694 00:36:31,680 --> 00:36:33,520 Speaker 4: this thing, like, it's going to kill us all, comes 695 00:36:34,040 --> 00:36:36,279 Speaker 4: partially from, like, the longtermism part of this, 696 00:36:36,480 --> 00:36:38,440 Speaker 4: where people are fixated on this idea of, we have to, 697 00:36:39,040 --> 00:36:42,120 Speaker 4: it's ridiculous. They have a specific number, which is ten 698 00:36:42,160 --> 00:36:44,480 Speaker 4: to the fifty-eighth, of future humans who 699 00:36:44,520 --> 00:36:48,640 Speaker 4: are going to live as uploaded simulations in computer systems 700 00:36:48,719 --> 00:36:51,759 Speaker 4: installed all over the galaxy. And these are people who 701 00:36:51,800 --> 00:36:55,239 Speaker 4: clearly have never worked in IT support, because somehow the 702 00:36:55,280 --> 00:36:56,279 Speaker 4: computers just keep running. 703 00:36:56,320 --> 00:36:57,719 Speaker 2: Yeah yeah, yeah yeah. 704 00:36:58,080 --> 00:37:01,520 Speaker 4: And the idea is that if we don't make sure 705 00:37:01,719 --> 00:37:05,759 Speaker 4: that future comes about, then we collectively are missing out 706 00:37:05,840 --> 00:37:07,680 Speaker 4: on the happiness of those ten to the fifty-eighth 707 00:37:07,760 --> 00:37:09,439 Speaker 4: humans, and that's such a big number that it doesn't 708 00:37:09,480 --> 00:37:11,719 Speaker 4: matter what happens now, all right. And I always say, 709 00:37:11,760 --> 00:37:13,680 Speaker 4: when I relate this story, that I wish I were 710 00:37:13,760 --> 00:37:16,160 Speaker 4: making this up. Yeah, but there are actually people who 711 00:37:16,239 --> 00:37:18,200 Speaker 4: believe this. And so that's where the sort of, like, 712 00:37:18,280 --> 00:37:20,000 Speaker 4: oh no, it's, you know, it's going to end us 713 00:37:20,040 --> 00:37:22,440 Speaker 5: all, stuff lives, and that's 714 00:37:22,280 --> 00:37:24,400 Speaker 4: the L, the longtermism part of TESCREALism. But this 715 00:37:24,520 --> 00:37:28,240 Speaker 4: idea that we should join with the computers and become 716 00:37:28,280 --> 00:37:31,600 Speaker 4: a better thing, that's the T, that's the transhumanism. And 717 00:37:31,680 --> 00:37:34,279 Speaker 4: it's all part of sort of the same bundle and 718 00:37:34,360 --> 00:37:36,360 Speaker 4: way of thinking. And there's this great paper out in 719 00:37:36,440 --> 00:37:39,160 Speaker 4: the publication called First Monday by Émile Torres and Timnit 720 00:37:39,200 --> 00:37:42,360 Speaker 4: Gebru, sort of documenting the way that all of 721 00:37:42,440 --> 00:37:45,680 Speaker 4: these different ideologies are linked one to the next. There's 722 00:37:45,719 --> 00:37:47,800 Speaker 4: overlaps in the people working on them, there's overlaps in 723 00:37:47,840 --> 00:37:50,120 Speaker 4: the ideas, and it all goes back to eugenics and 724 00:37:50,160 --> 00:37:50,680 Speaker 4: race science. 725 00:37:51,360 --> 00:37:53,680 Speaker 3: Wow, okay, and it just, uh... 726 00:37:53,760 --> 00:37:57,480 Speaker 1: I mean, it's really, so the doomerism and the boosterism 727 00:37:57,520 --> 00:38:00,919 Speaker 1: are, you know, two sides of the same coin, even 728 00:38:00,960 --> 00:38:02,960 Speaker 1: though they kind of pose themselves to be different.
729 00:38:03,040 --> 00:38:05,719 Speaker 1: So if you imagine, there was an article, and 730 00:38:06,000 --> 00:38:08,760 Speaker 1: there was a very funny chart that accompanied 731 00:38:08,760 --> 00:38:11,160 Speaker 1: this article, an article in the Washington Post by 732 00:38:12,440 --> 00:38:15,279 Speaker 1: Natasha Tiku, and she had this kind of grid, and 733 00:38:15,360 --> 00:38:18,160 Speaker 1: it was really funny, because it was like, on one 734 00:38:18,360 --> 00:38:22,120 Speaker 1: end was this guy Eliezer Yudkowsky, who's like a big doomer. 735 00:38:22,280 --> 00:38:24,319 Speaker 1: He had this thing in Time, where he wrote an 736 00:38:24,320 --> 00:38:28,560 Speaker 1: op-ed in Time magazine, and he was like, you know, basically, 737 00:38:28,880 --> 00:38:31,400 Speaker 1: if we need to, we have to be willing to 738 00:38:31,440 --> 00:38:34,600 Speaker 1: do air strikes on data centers, which he 739 00:38:34,680 --> 00:38:37,880 Speaker 1: actually wrote and he actually said. And speaking of Tucker Carlson, 740 00:38:37,920 --> 00:38:40,319 Speaker 1: I think Tucker Carlson also was like, oh geez, maybe 741 00:38:40,320 --> 00:38:42,239 Speaker 1: we should do that. And on the other end you 742 00:38:42,320 --> 00:38:47,720 Speaker 1: have, you know, Sam Altman, who was like, and Mira Murati. 743 00:38:47,600 --> 00:38:48,800 Speaker 3: You got a cyber truck? 744 00:38:48,800 --> 00:38:52,160 Speaker 2: Sorry, no, it's okay, I just want a cyber truck. 745 00:38:52,400 --> 00:38:55,319 Speaker 1: You just want a cyber truck. I've got to enter 746 00:38:55,360 --> 00:38:59,440 Speaker 1: my Social Security number where? Yeah, yeah, it's fine, I'll 747 00:38:59,440 --> 00:39:01,000 Speaker 1: give you mine, I'll give it to you too. 748 00:39:01,640 --> 00:39:04,720 Speaker 1: And so, you know, they are actually 749 00:39:04,760 --> 00:39:07,560 Speaker 1: two sides of the same coin, because they basically want 750 00:39:07,680 --> 00:39:09,439 Speaker 1: to, and I mean, in the middle was Timnit, 751 00:39:09,520 --> 00:39:12,359 Speaker 1: who's on this TESCREAL paper. And she's also 752 00:39:12,440 --> 00:39:15,160 Speaker 1: my boss at DAIR, and I'm 753 00:39:15,160 --> 00:39:18,120 Speaker 1: not saying that to, you know, kiss ass, 754 00:39:18,400 --> 00:39:20,960 Speaker 1: we actually get along quite well. But 755 00:39:21,080 --> 00:39:24,320 Speaker 1: it's, you know, posed as, you know, someone in this, 756 00:39:24,600 --> 00:39:27,279 Speaker 1: in this grid. But so doomers and boosters, they both 757 00:39:27,360 --> 00:39:32,680 Speaker 1: basically see AI as this kind of inevitability. We're gonna get there, 758 00:39:33,440 --> 00:39:36,520 Speaker 1: you know. And to me, I think the metaphor, you know, 759 00:39:36,600 --> 00:39:38,960 Speaker 1: you allude to, Emily, is kind of like thinking about 760 00:39:38,960 --> 00:39:41,279 Speaker 1: this like a kid that needs to be formed. And 761 00:39:41,680 --> 00:39:43,880 Speaker 1: in some ways I think it's, I mean, 762 00:39:43,920 --> 00:39:47,360 Speaker 1: it's colonialism. It's manifest destiny. You know, 763 00:39:47,560 --> 00:39:51,320 Speaker 1: it's always five years away, you know.
And so but 764 00:39:51,440 --> 00:39:54,520 Speaker 1: they but they both see development of AI to be 765 00:39:54,680 --> 00:39:58,520 Speaker 1: really really critical to whatever is happening next when we 766 00:39:58,600 --> 00:40:00,680 Speaker 1: can be like no one is asking for this ship 767 00:40:01,280 --> 00:40:05,120 Speaker 1: and it's taking up you know, and I and we're 768 00:40:05,120 --> 00:40:09,080 Speaker 1: seeing a lot of memes online are like, you know, 769 00:40:10,120 --> 00:40:12,839 Speaker 1: I'm so glad that we are draining a lake every 770 00:40:13,160 --> 00:40:16,319 Speaker 1: you know, to generate this image of and there's one 771 00:40:16,320 --> 00:40:20,080 Speaker 1: of the images was like a toddler holding you know, 772 00:40:20,400 --> 00:40:24,640 Speaker 1: a crucifix and with a helmet that says the police 773 00:40:24,760 --> 00:40:27,439 Speaker 1: on it and their neck deep in water and they've 774 00:40:27,480 --> 00:40:30,959 Speaker 1: got big kawhi you know, emoji eyes, right. 775 00:40:31,520 --> 00:40:34,799 Speaker 2: And if you blur your eyes, it's Jesus's face. Yeah, 776 00:40:34,880 --> 00:40:36,440 Speaker 2: because that's another huge thing. 777 00:40:36,520 --> 00:40:36,799 Speaker 3: I see. 778 00:40:36,840 --> 00:40:40,040 Speaker 2: There's so much AI nonsense art out there too. 779 00:40:41,160 --> 00:40:44,879 Speaker 4: And it's it's super environmentally disastrous, right, this is this thing, 780 00:40:45,200 --> 00:40:47,759 Speaker 4: and a lot of it is non consensual in the 781 00:40:47,800 --> 00:40:49,239 Speaker 4: sense that, like, if you try to use Google these 782 00:40:49,320 --> 00:40:51,480 Speaker 4: days to do a search, the first thing you get, 783 00:40:51,760 --> 00:40:54,360 Speaker 4: depending on search with many searches, is the AI overview, 784 00:40:54,400 --> 00:40:56,200 Speaker 4: which is the output of one of these text to 785 00:40:56,280 --> 00:40:59,720 Speaker 4: text models, taking way more processing power than just returning 786 00:40:59,719 --> 00:41:01,279 Speaker 4: the townlu links, which is that they used to do, 787 00:41:01,840 --> 00:41:03,200 Speaker 4: and you can't turn it off. 788 00:41:03,760 --> 00:41:04,840 Speaker 5: So if you use Google do. 789 00:41:04,840 --> 00:41:06,520 Speaker 4: A search, you're stuck with that. You're stuck with its 790 00:41:06,560 --> 00:41:07,400 Speaker 4: environmental impact. 791 00:41:08,040 --> 00:41:11,160 Speaker 3: Yeah. The one positive I'd say that I've seen more 792 00:41:11,200 --> 00:41:15,200 Speaker 3: and more since we last spoke is people being like, wait, 793 00:41:15,280 --> 00:41:19,880 Speaker 3: who the fuck is this for? Exactly? Who is asking 794 00:41:20,080 --> 00:41:22,879 Speaker 3: for this? Which I think is ultimately going to become 795 00:41:22,920 --> 00:41:26,080 Speaker 3: a louder and louder question. Is this seems to be 796 00:41:26,880 --> 00:41:32,800 Speaker 3: mainly for tech CEOs. Yeah, we're very wealthy, but especially 797 00:41:32,880 --> 00:41:36,719 Speaker 3: like in countries that aren't, where corporations are not more 798 00:41:36,800 --> 00:41:39,840 Speaker 3: powerful and valued more as humans than actual humans, like 799 00:41:39,880 --> 00:41:42,000 Speaker 3: the United States. 
I feel like it will it will 800 00:41:42,040 --> 00:41:45,040 Speaker 3: become more and more of an issue and then be 801 00:41:45,120 --> 00:41:49,000 Speaker 3: a matter of figuring out if that message actually gets 802 00:41:49,080 --> 00:41:49,959 Speaker 3: in in the US. 803 00:41:50,160 --> 00:41:52,680 Speaker 4: But yeah, we can keep shouting it, and that's the 804 00:41:52,719 --> 00:41:53,319 Speaker 4: great question, like. 805 00:41:53,360 --> 00:41:54,320 Speaker 5: Who asked for this? 806 00:41:54,800 --> 00:41:59,080 Speaker 3: Wait? Who asked for this? And who likes where it's going? Yeah? 807 00:41:59,320 --> 00:42:02,239 Speaker 2: Yeah, because of the luster has worn off from the 808 00:42:02,360 --> 00:42:05,600 Speaker 2: early you know days of those early chat GPT models 809 00:42:05,640 --> 00:42:06,360 Speaker 2: that came out like this. 810 00:42:06,480 --> 00:42:08,560 Speaker 3: Thing could be a doctor, this thing is could be 811 00:42:08,640 --> 00:42:09,160 Speaker 3: a lawyer. 812 00:42:09,560 --> 00:42:11,120 Speaker 2: And then most people kind of like the people that 813 00:42:11,200 --> 00:42:13,880 Speaker 2: I know who are first impressed, like yeah, Like it 814 00:42:14,040 --> 00:42:16,239 Speaker 2: helped me write an email. That's about the most I 815 00:42:16,280 --> 00:42:18,800 Speaker 2: can do with it, because I hate writing formally and 816 00:42:19,000 --> 00:42:21,719 Speaker 2: so it helped that. But beyond that, I don't know 817 00:42:21,880 --> 00:42:24,200 Speaker 2: many like as personally, but and I know people that 818 00:42:24,280 --> 00:42:27,719 Speaker 2: work in many different fields. It sort of ends up 819 00:42:27,760 --> 00:42:29,440 Speaker 2: being like, yeah, I don't really have a use for 820 00:42:29,560 --> 00:42:32,040 Speaker 2: it aside from like making funny songs that I share 821 00:42:32,080 --> 00:42:35,120 Speaker 2: with my girlfriend that you know, we have a song 822 00:42:35,120 --> 00:42:37,719 Speaker 2: about how she loves Chipotle, and it was in the 823 00:42:37,800 --> 00:42:41,680 Speaker 2: style of you know, Crystal Waters. Yeah, and now that's it. 824 00:42:42,360 --> 00:42:44,640 Speaker 1: Yeah, yeah, I mean I think that's I mean, I 825 00:42:44,880 --> 00:42:48,360 Speaker 1: like this phrase that Emily coined, which is, you know, 826 00:42:48,480 --> 00:42:51,360 Speaker 1: resist the urge to be impressed. Yeah, where it's like 827 00:42:51,520 --> 00:42:54,359 Speaker 1: this this kind of thing where you know, you've got 828 00:42:54,480 --> 00:42:57,120 Speaker 1: this proof of concept and it's got a little little 829 00:42:57,239 --> 00:43:01,160 Speaker 1: g whiz to it, but beyond that, I mean, you know, 830 00:43:01,320 --> 00:43:02,920 Speaker 1: are you going to use this in any kind of 831 00:43:03,040 --> 00:43:08,280 Speaker 1: real I don't know, product cycle or ideation or whatever. 832 00:43:08,480 --> 00:43:11,400 Speaker 1: And and there's just been I mean, we've had a 833 00:43:11,560 --> 00:43:14,279 Speaker 1: tech lash for the last couple of years here and 834 00:43:14,640 --> 00:43:18,120 Speaker 1: and I mean I think it's really the only people 835 00:43:18,239 --> 00:43:20,600 Speaker 1: I ever see praising this. 
And I mean, maybe this 836 00:43:20,760 --> 00:43:24,000 Speaker 1: is just a virtue of my timeline that it's creating, 837 00:43:24,080 --> 00:43:27,600 Speaker 1: and Elon Musk hasn't absolutely destroyed it yet, But the 838 00:43:27,680 --> 00:43:30,200 Speaker 1: only people I see praising it are typically people who 839 00:43:30,280 --> 00:43:34,359 Speaker 1: are very very much in the industry. You know, they're 840 00:43:34,360 --> 00:43:37,160 Speaker 1: the same people who are like I live in San 841 00:43:37,239 --> 00:43:41,239 Speaker 1: Francisco and I support you know, London Breed saying she 842 00:43:41,400 --> 00:43:43,719 Speaker 1: wants to clean up our streets from homeless people. And 843 00:43:43,840 --> 00:43:46,359 Speaker 1: it's just the people. And I mean some of them 844 00:43:46,400 --> 00:43:48,920 Speaker 1: are not out and out that fascist, but you know, 845 00:43:49,000 --> 00:43:51,880 Speaker 1: they're they're they're they're brushing, They're brushing along with it 846 00:43:52,600 --> 00:43:56,040 Speaker 1: everyone else I see, especially also a lot of people 847 00:43:56,200 --> 00:43:59,120 Speaker 1: in technology. They're like, I work in technology, and I 848 00:43:59,360 --> 00:44:03,280 Speaker 1: hate this bubble and I'm so tired about talking about 849 00:44:03,320 --> 00:44:05,560 Speaker 1: this and i just wanted to go away. 850 00:44:05,640 --> 00:44:07,200 Speaker 3: I mean, this is you know. 851 00:44:07,719 --> 00:44:10,080 Speaker 1: And then I see teachers, you know, we see a 852 00:44:10,120 --> 00:44:14,279 Speaker 1: lot of people from the professions, teachers, nurses, doctor. My 853 00:44:14,400 --> 00:44:16,360 Speaker 1: sister is a nurse. He's like, I hate this stuff. 854 00:44:16,719 --> 00:44:19,239 Speaker 1: My sister's a lawyer, and she's like, yeah, I've used 855 00:44:19,280 --> 00:44:21,719 Speaker 1: this to sort of start a brief, but he gets 856 00:44:21,800 --> 00:44:25,000 Speaker 1: so many things wrong. I need to double check everything. 857 00:44:25,160 --> 00:44:27,239 Speaker 1: And at the end of the day, yeah, does it 858 00:44:27,360 --> 00:44:29,360 Speaker 1: actually help. Yeah, probably not. 859 00:44:29,719 --> 00:44:32,759 Speaker 2: But but it's just a little baby, now, Alex, it's 860 00:44:32,840 --> 00:44:33,720 Speaker 2: just a little baby. 861 00:44:34,040 --> 00:44:36,640 Speaker 3: It's just a little baby, A little baby. 862 00:44:36,760 --> 00:44:40,400 Speaker 1: Would you want to take a little defenseless baby. 863 00:44:40,640 --> 00:44:42,320 Speaker 3: It is so interesting too, because it's like and it 864 00:44:42,400 --> 00:44:44,840 Speaker 3: could be a lawyer or a doctor, which is also 865 00:44:45,000 --> 00:44:50,560 Speaker 3: a baby. I'm so impressed by my little baby. Like 866 00:44:50,640 --> 00:44:52,920 Speaker 3: what they just said, they could be a lawyer or 867 00:44:53,000 --> 00:44:55,000 Speaker 3: a doctor one day to. 868 00:44:55,160 --> 00:44:58,120 Speaker 4: Watch for the phrase in its infancy in the coverage 869 00:44:58,160 --> 00:44:59,360 Speaker 4: of this, it's all over the place. 870 00:45:00,480 --> 00:45:02,640 Speaker 1: Yeah, shout up, shout shout up. 
And I just want 871 00:45:02,680 --> 00:45:04,600 Speaker 1: to give a shout out to Anna Lauren Hoffmann, who 872 00:45:04,840 --> 00:45:08,600 Speaker 1: is a professor at UW, a really good 873 00:45:08,600 --> 00:45:11,080 Speaker 1: friend of mine and colleague of Emily's, and, uh, so 874 00:45:11,560 --> 00:45:13,319 Speaker 1: she's done a lot of work sort of talking 875 00:45:13,320 --> 00:45:16,680 Speaker 1: about, like, these metaphors of this kind of babyness 876 00:45:16,719 --> 00:45:19,840 Speaker 1: and how it absolves AI of, like, truly being, like, 877 00:45:20,080 --> 00:45:23,480 Speaker 1: racist and sexist and absolutely fucking up. It's like, it's 878 00:45:23,480 --> 00:45:25,440 Speaker 1: just a little baby. Yeah, it's a baby 879 00:45:25,800 --> 00:45:29,359 Speaker 1: with a billion dollar valuation or whatever, you know, however 880 00:45:29,560 --> 00:45:30,000 Speaker 1: much it is. 881 00:45:30,560 --> 00:45:33,840 Speaker 3: By having AI, like, mentioned in the 882 00:45:33,920 --> 00:45:34,520 Speaker 3: earnings call. 883 00:45:34,840 --> 00:45:37,600 Speaker 2: But if we're going to keep consistency with that metaphor, 884 00:45:37,719 --> 00:45:39,720 Speaker 2: you'd be like, then, why are you, as the parent, 885 00:45:39,800 --> 00:45:42,520 Speaker 2: wheeling this child out to do labor that it's wholly 886 00:45:42,640 --> 00:45:43,279 Speaker 2: unprepared for? 887 00:45:43,600 --> 00:45:47,399 Speaker 3: That would look fucked up. Actually, so, Miles, I have kids 888 00:45:47,480 --> 00:45:49,680 Speaker 3: that are a little older than yours, and so I 889 00:45:49,719 --> 00:45:52,359 Speaker 3: can say that is actually good parenting, to wheel them out 890 00:45:52,400 --> 00:45:55,960 Speaker 3: and have them bring in the cheddar. 891 00:45:56,640 --> 00:46:01,480 Speaker 1: Yeah, Jack coming on the podcast with a hot take: child labor? 892 00:46:01,640 --> 00:46:03,120 Speaker 1: Actually, yes, child labor. 893 00:46:03,200 --> 00:46:08,240 Speaker 3: And when it's your kid, come on. Those are called chores, 894 00:46:08,360 --> 00:46:11,280 Speaker 3: and yes, if they're doing the chores for a multinational 895 00:46:11,400 --> 00:46:13,799 Speaker 3: corporation it gets a little hazy, but. 896 00:46:14,000 --> 00:46:17,040 Speaker 2: Yeah, take on an eight-hour chore shift, get that check. 897 00:46:16,960 --> 00:46:18,880 Speaker 3: Hazy because of all that money coming in. 898 00:46:19,280 --> 00:46:19,480 Speaker 2: You know. 899 00:46:20,760 --> 00:46:23,200 Speaker 4: So on this point about, like, nobody wants this, I 900 00:46:23,280 --> 00:46:25,200 Speaker 4: have a talk that I've been doing since summer of 901 00:46:25,280 --> 00:46:28,560 Speaker 4: twenty-three called ChatGP-Why, and then the subtitle 902 00:46:28,640 --> 00:46:32,040 Speaker 4: is: when, if ever, is synthetic text safe, appropriate and desirable? 903 00:46:32,400 --> 00:46:34,640 Speaker 4: And it's meant for sort of non-specialist audiences. I 904 00:46:34,680 --> 00:46:36,520 Speaker 4: basically go through, okay, what's a language model, what's the 905 00:46:36,560 --> 00:46:38,880 Speaker 4: technology for in its original use cases? 906 00:46:38,920 --> 00:46:40,080 Speaker 5: How did we get to the current thing? 907 00:46:40,440 --> 00:46:42,160 Speaker 4: And then what would have to be true for you 908 00:46:42,200 --> 00:46:44,360 Speaker 4: to actually want to use the output of one of 909 00:46:44,400 --> 00:46:44,799 Speaker 4: these things?
910 00:46:44,880 --> 00:46:45,000 Speaker 2: Right? 911 00:46:45,040 --> 00:46:46,680 Speaker 4: So, the first thing is you'd want something that is 912 00:46:46,960 --> 00:46:49,759 Speaker 4: ethically produced, so not based on stolen data, not based 913 00:46:49,800 --> 00:46:54,240 Speaker 4: on exploited labor. Basically, don't have those second layers box. Okay, 914 00:46:55,080 --> 00:46:57,800 Speaker 4: you also want something that somehow is not an environmental disaster. 915 00:46:58,400 --> 00:46:59,200 Speaker 5: Also don't have that. 916 00:47:01,040 --> 00:47:03,440 Speaker 4: Assuming we somehow got past those two hurdles, and it's 917 00:47:03,440 --> 00:47:05,640 Speaker 4: things like, okay, well, you need a case where it's 918 00:47:05,719 --> 00:47:10,080 Speaker 4: not going to be misleading anybody. It's got to be 919 00:47:10,120 --> 00:47:12,360 Speaker 4: a case where you don't actually care about the stuff 920 00:47:12,480 --> 00:47:15,720 Speaker 4: being accurate or truthful what comes out of it, including 921 00:47:16,400 --> 00:47:19,080 Speaker 4: you know, being able to recognize and mitigate any biases. 922 00:47:19,760 --> 00:47:22,640 Speaker 4: You also don't care about the output being original, so 923 00:47:22,760 --> 00:47:25,279 Speaker 4: plagiarism's okay. And like by the time you've done all that, 924 00:47:25,719 --> 00:47:28,680 Speaker 4: it's like, yeah, this use case of helping you draft 925 00:47:28,719 --> 00:47:32,359 Speaker 4: an email because that's tedious, right, and is that worth 926 00:47:32,480 --> 00:47:34,520 Speaker 4: the two that we started with there? Right, you know, 927 00:47:34,800 --> 00:47:37,960 Speaker 4: the labor exploitation, data theft and environmental impacts. 928 00:47:38,239 --> 00:47:41,760 Speaker 3: Right now we've just broken down all of human meaning 929 00:47:42,200 --> 00:47:46,759 Speaker 3: for the past, like for civilizations, Like you know, none 930 00:47:46,800 --> 00:47:50,080 Speaker 3: of the things that we've always seemed to think matter 931 00:47:50,719 --> 00:47:54,680 Speaker 3: matter anymore. Because billion dollar companies needed like a new 932 00:47:54,760 --> 00:47:57,760 Speaker 3: toy to hype up for the stock market. It sounds 933 00:47:57,840 --> 00:48:00,440 Speaker 3: like that's very frustrating. Let's take a quick let's take 934 00:48:00,480 --> 00:48:02,400 Speaker 3: a quick break. We'll come back, because I do want 935 00:48:02,440 --> 00:48:05,279 Speaker 3: to talk about what it is being used for. I mean, 936 00:48:05,360 --> 00:48:08,240 Speaker 3: AI is a very loose term the way it's being applied, 937 00:48:08,320 --> 00:48:12,080 Speaker 3: But people want to use, you know, these new advances 938 00:48:12,160 --> 00:48:16,080 Speaker 3: in computing to make themselves seem more profitable, and so 939 00:48:16,400 --> 00:48:18,440 Speaker 3: I want to talk about what it is actually being 940 00:48:18,560 --> 00:48:31,960 Speaker 3: used for. We'll be right back, and we're back. 
We're 941 00:48:32,760 --> 00:48:36,879 Speaker 3: we're back, and on your show, you've done some good 942 00:48:36,920 --> 00:48:41,680 Speaker 3: stuff on just the surveillance side of AI, which I 943 00:48:41,760 --> 00:48:45,800 Speaker 3: mean that turns out a lot of the technology that 944 00:48:45,880 --> 00:48:51,080 Speaker 3: we initially thought was promising was just eventually used for 945 00:48:51,360 --> 00:48:54,719 Speaker 3: the purposes of marketing and surveillance in the end, and 946 00:48:54,880 --> 00:48:57,960 Speaker 3: it seems like AI skipped all the promising stuff. And 947 00:48:58,040 --> 00:49:00,160 Speaker 3: it's just like, what if we just went right to 948 00:49:00,280 --> 00:49:05,000 Speaker 3: the right there harming harming people. 949 00:49:05,480 --> 00:49:08,440 Speaker 1: Yeah, I will, I will say that kind of. I mean, 950 00:49:08,880 --> 00:49:12,279 Speaker 1: you had mentioned that this term AI is kind of 951 00:49:12,360 --> 00:49:15,960 Speaker 1: being used, Lucy Goosey, and you know, I mean we 952 00:49:16,280 --> 00:49:20,560 Speaker 1: we AI is kind of synonymous with large language models 953 00:49:20,600 --> 00:49:23,640 Speaker 1: and image generators. But you know, things that have been 954 00:49:23,760 --> 00:49:30,840 Speaker 1: called AI also encompass things like biometric surveillance, like like 955 00:49:31,040 --> 00:49:35,560 Speaker 1: different different systems which use this technology called quot unquote 956 00:49:35,600 --> 00:49:39,320 Speaker 1: machine learning, and which is kind of this large scale 957 00:49:39,440 --> 00:49:44,000 Speaker 1: patterned recognition. So a lot of it's being used, especially 958 00:49:44,440 --> 00:49:48,720 Speaker 1: at the border, so doing things like trying to detect 959 00:49:49,200 --> 00:49:53,920 Speaker 1: verify identities by voices or by faces. You probably see 960 00:49:53,960 --> 00:49:56,680 Speaker 1: this if you've been in the airport. The TSA has 961 00:49:56,719 --> 00:49:59,920 Speaker 1: been using this, and you can still voluntarily opt out 962 00:50:00,320 --> 00:50:03,160 Speaker 1: for now, but they're really incentivizing it. I saw that 963 00:50:03,280 --> 00:50:06,960 Speaker 1: TSA has this touchless thing now, which is a facial recognition, 964 00:50:07,560 --> 00:50:09,880 Speaker 1: so you don't have to present your ID, you can 965 00:50:10,000 --> 00:50:11,320 Speaker 1: just scan your face. 966 00:50:11,160 --> 00:50:14,640 Speaker 4: And go and and like, don't do that, like take 967 00:50:14,800 --> 00:50:17,040 Speaker 4: every option to opt out, and that the fact that 968 00:50:17,120 --> 00:50:20,879 Speaker 4: those signs are there saying that this is optional. Penny, 969 00:50:20,960 --> 00:50:24,200 Speaker 4: somebody actually Petty. Yeah, yeah, the only reason we had 970 00:50:24,239 --> 00:50:26,800 Speaker 4: that science is because of her activism saying like this 971 00:50:26,960 --> 00:50:28,840 Speaker 4: has to be clear to the travelers that it's actually 972 00:50:28,880 --> 00:50:31,279 Speaker 4: optional and you can opt out, so it's posted there 973 00:50:31,320 --> 00:50:31,920 Speaker 4: that you don't have. 974 00:50:32,000 --> 00:50:34,480 Speaker 3: To do this. Yeah, all right, then I'm going to 975 00:50:34,520 --> 00:50:37,040 Speaker 3: feel you up. Sorry, those are just the rules. 
976 00:50:37,520 --> 00:50:40,439 Speaker 1: Yeah, it's just, it's absolutely, but I mean, it gets, 977 00:50:40,560 --> 00:50:45,040 Speaker 1: you know, leveraged against people who fly to 978 00:50:45,120 --> 00:50:48,600 Speaker 1: a lesser degree, but I mean folks who are refugees 979 00:50:48,719 --> 00:50:51,239 Speaker 1: or asylees, you know, I mean people on the 980 00:50:51,400 --> 00:50:55,719 Speaker 1: move really encounter this stuff in incredibly violent ways. You know, 981 00:50:55,880 --> 00:50:59,200 Speaker 1: they do things like, they take their blood 982 00:50:59,440 --> 00:51:03,040 Speaker 1: and say that, well, we can, 983 00:51:03,600 --> 00:51:06,920 Speaker 1: we're gonna, you know, sequence your genome and see if you're 984 00:51:07,000 --> 00:51:09,560 Speaker 1: actually from the country you say you're from, which, first, 985 00:51:09,600 --> 00:51:13,840 Speaker 1: it's pseudoscience. I mean, basically all biologists have been like, 986 00:51:13,960 --> 00:51:16,759 Speaker 1: you can't use this to determine if someone is X, 987 00:51:16,880 --> 00:51:22,480 Speaker 1: Y, Z nationality, because nationalities are, one, political entities, 988 00:51:22,520 --> 00:51:27,000 Speaker 1: they're not biological ones, and so, like, we can sort 989 00:51:27,040 --> 00:51:29,360 Speaker 1: of pinpoint you to a region, but it says nothing 990 00:51:29,520 --> 00:51:33,640 Speaker 1: about the political borders of a country. 991 00:51:34,520 --> 00:51:37,760 Speaker 1: There's a great book I started reading by Petra Molnar, 992 00:51:38,640 --> 00:51:42,000 Speaker 1: which is called The Walls Have Eyes, which is about 993 00:51:42,040 --> 00:51:47,719 Speaker 1: this kind of intense surveillance state, or intense surveillance architecture. 994 00:51:48,239 --> 00:51:51,040 Speaker 1: You know, it's being used, you know, typically at 995 00:51:51,640 --> 00:51:56,239 Speaker 1: the border, the US-Mexico border, but also at, you know, 996 00:51:56,360 --> 00:52:00,880 Speaker 1: the various points of entry in Europe where African migrants 997 00:52:01,040 --> 00:52:05,600 Speaker 1: are fleeing to, you know, fleeing places like Sudan and 998 00:52:05,719 --> 00:52:09,920 Speaker 1: Congo and the Tigray region of Ethiopia. So, just, like, 999 00:52:10,680 --> 00:52:12,799 Speaker 1: this is just some of the most violent kind 1000 00:52:12,800 --> 00:52:15,560 Speaker 1: of stuff you can imagine, and it's 1001 00:52:15,680 --> 00:52:18,719 Speaker 1: way far away from, you know, this kind of, oh, 1002 00:52:18,880 --> 00:52:21,880 Speaker 1: here's like a fake little child, you know, or a 1003 00:52:22,000 --> 00:52:26,520 Speaker 1: Jesus holding twelve thousand babies riding in an American truck with 1004 00:52:26,600 --> 00:52:28,759 Speaker 1: the American flag on it, you know what I mean, right? 1005 00:52:29,200 --> 00:52:33,360 Speaker 1: So the reality is, yeah, much more stark. 1006 00:52:34,320 --> 00:52:37,000 Speaker 4: And you see that with the one-to-many 1007 00:52:37,760 --> 00:52:40,800 Speaker 4: image matching, so you get all these false arrests of 1008 00:52:40,880 --> 00:52:43,520 Speaker 4: people, because the AI said that they matched the image 1009 00:52:43,520 --> 00:52:46,359 Speaker 4: from the grainy surveillance video.
And it's one of those 1010 00:52:46,360 --> 00:52:50,560 Speaker 4: things where it's bad if it works, because you have 1011 00:52:50,680 --> 00:52:53,239 Speaker 4: this like increased surveillance power of the state, and it's 1012 00:52:53,280 --> 00:52:54,880 Speaker 4: bad if it doesn't work, because you get all these 1013 00:52:54,920 --> 00:52:55,479 Speaker 4: false arrests. 1014 00:52:55,520 --> 00:52:57,320 Speaker 5: Like it's it's just a bad idea. It's just a 1015 00:52:57,440 --> 00:52:58,279 Speaker 5: don't and. 1016 00:52:58,320 --> 00:53:01,080 Speaker 4: It's not just image stuff. So we read a while 1017 00:53:01,200 --> 00:53:04,640 Speaker 4: back about a situation in Germany, I think, where asylum 1018 00:53:04,680 --> 00:53:08,520 Speaker 4: seekers were being vetted as to whether or not they 1019 00:53:08,680 --> 00:53:12,040 Speaker 4: spoke the right language using you know, so one of 1020 00:53:12,040 --> 00:53:14,200 Speaker 4: the things you can do with pattern matching is okay, 1021 00:53:14,400 --> 00:53:18,560 Speaker 4: language identification that this string what languages that come from 1022 00:53:18,920 --> 00:53:21,960 Speaker 4: but it was being done based on completely inadequate data 1023 00:53:22,000 --> 00:53:24,319 Speaker 4: sets by people who don't speak the languages, who are 1024 00:53:24,400 --> 00:53:27,120 Speaker 4: not in a position to actually vet the output of 1025 00:53:27,160 --> 00:53:28,839 Speaker 4: the machine. And so you have these folks who are 1026 00:53:28,880 --> 00:53:31,840 Speaker 4: in the worst imaginable situation, like you don't go seeking 1027 00:53:31,880 --> 00:53:33,719 Speaker 4: asylum on a. 1028 00:53:33,760 --> 00:53:35,480 Speaker 5: Lark, right, these other people? 1029 00:53:37,280 --> 00:53:43,399 Speaker 4: Yeah, yeah, and then they're getting denied because some algorithms said, oh, 1030 00:53:43,440 --> 00:53:45,359 Speaker 4: you don't speak the language from the place you claim 1031 00:53:45,400 --> 00:53:48,560 Speaker 4: to be coming from where the person your accent is 1032 00:53:48,560 --> 00:53:50,680 Speaker 4: wrong or your variety is wrong or whatever. And the 1033 00:53:50,719 --> 00:53:53,960 Speaker 4: person who's who's run this computer system has no way 1034 00:53:54,040 --> 00:53:56,080 Speaker 4: of actually checking its output, but they believe it, and 1035 00:53:56,120 --> 00:53:57,759 Speaker 4: then they get these asylum seekers turned away. 1036 00:53:58,080 --> 00:53:59,759 Speaker 3: Yeah, so how does that? You know? 1037 00:54:00,160 --> 00:54:03,879 Speaker 2: Everything you said? How should we feel that open AI 1038 00:54:04,160 --> 00:54:09,080 Speaker 2: recently welcome to their board the eighteenth director of the NSA, 1039 00:54:09,440 --> 00:54:14,879 Speaker 2: Paul nakasone, is that bad? Or what should we take 1040 00:54:14,920 --> 00:54:15,399 Speaker 2: from that one? 1041 00:54:15,640 --> 00:54:17,880 Speaker 4: How should we feel not at all surprised? 1042 00:54:18,040 --> 00:54:18,120 Speaker 2: Right? 1043 00:54:18,239 --> 00:54:19,960 Speaker 4: How should we feel when open AI it's like, okay, 1044 00:54:20,080 --> 00:54:21,280 Speaker 4: bad is whatever the rest. 1045 00:54:21,160 --> 00:54:21,919 Speaker 5: Of that is is bad? 1046 00:54:22,200 --> 00:54:26,839 Speaker 2: Yeah? 
It seems bad, man, Yeah, it seems like there's 1047 00:54:26,840 --> 00:54:30,000 Speaker 2: again we're talking like this technology to mass surveillance pipeline, 1048 00:54:30,040 --> 00:54:33,480 Speaker 2: and who better than someone who ran the fucking NSSA, Like. 1049 00:54:33,880 --> 00:54:35,280 Speaker 3: And I know the way it's being spun. 1050 00:54:35,320 --> 00:54:37,800 Speaker 2: It's like, you know, this is a part of cyber command, 1051 00:54:37,880 --> 00:54:41,440 Speaker 2: Like he inherently knows like how what the what the 1052 00:54:41,520 --> 00:54:43,759 Speaker 2: guardbrails need to be in terms of keeping us safe. 1053 00:54:43,800 --> 00:54:45,640 Speaker 2: But to me it just feels like, No, you brought 1054 00:54:45,680 --> 00:54:49,279 Speaker 2: in a surveillance pro, not someone who understands inherently like 1055 00:54:49,719 --> 00:54:52,080 Speaker 2: what this specific technology is, but more someone who's like 1056 00:54:52,520 --> 00:54:56,080 Speaker 2: learns how to harness technology for this other specific aim. 1057 00:54:57,080 --> 00:55:00,600 Speaker 4: Yeah, so surveillance is not synonymous with safe. Like the 1058 00:55:00,680 --> 00:55:03,000 Speaker 4: one the one kind of one use case for the 1059 00:55:03,040 --> 00:55:05,840 Speaker 4: word surveillance that I think actually was pro public safety 1060 00:55:06,280 --> 00:55:08,680 Speaker 4: is there's a study the long term seting in Seattle 1061 00:55:08,719 --> 00:55:12,080 Speaker 4: called the Seattle Flu Study, and they are doing what 1062 00:55:12,120 --> 00:55:15,000 Speaker 4: they call surveillance testing for flu viruses. So they get 1063 00:55:15,080 --> 00:55:17,080 Speaker 4: volunteers to come in and get swabbed, and they are 1064 00:55:17,200 --> 00:55:20,040 Speaker 4: keeping track of what viruses are circulating in our community. Right, 1065 00:55:20,560 --> 00:55:23,200 Speaker 4: I'm all for surveilling the viruses, especially if you can 1066 00:55:23,239 --> 00:55:24,000 Speaker 4: keep the people out of it. 1067 00:55:24,200 --> 00:55:26,360 Speaker 1: Yeah, I would, I would. I would add a wrinkle 1068 00:55:26,440 --> 00:55:28,439 Speaker 1: to that just because I think that, I mean, there's 1069 00:55:28,480 --> 00:55:30,480 Speaker 1: a lot of surveillance. I mean that's the kind of technology, 1070 00:55:30,520 --> 00:55:32,840 Speaker 1: that's the kind of terminology they use of health surveillance 1071 00:55:33,280 --> 00:55:36,759 Speaker 1: to detect kind of virus rates and whatnot. I would 1072 00:55:36,760 --> 00:55:38,919 Speaker 1: also add the wrinkle that like a lot of those 1073 00:55:39,920 --> 00:55:43,759 Speaker 1: you know, organizations are really trusted by distrusted by marginalized people, 1074 00:55:43,840 --> 00:55:44,920 Speaker 1: Like what are you going to do what to me? 1075 00:55:45,120 --> 00:55:48,799 Speaker 1: You know, like especially thinking like you know, like lots 1076 00:55:48,840 --> 00:55:52,760 Speaker 1: of lots of transfolks and like especially like under housed 1077 00:55:52,840 --> 00:55:54,920 Speaker 1: or unhoused transfers, and just like you're going to do 1078 00:55:55,000 --> 00:55:56,920 Speaker 1: what you want this day upon me for who you know. 1079 00:55:57,239 --> 00:56:02,080 Speaker 4: Right, So, yeah, understandably, especially because because surveillance in general 1080 00:56:02,200 --> 00:56:05,040 Speaker 4: like is not a safety thing, right, It's not. 
It 1081 00:56:05,200 --> 00:56:09,200 Speaker 4: is maybe a like safety for people within the walls 1082 00:56:09,239 --> 00:56:11,600 Speaker 4: of the walled garden thing, but that's not safety, right. 1083 00:56:11,640 --> 00:56:16,040 Speaker 4: That's the other thing about this is that what we 1084 00:56:16,160 --> 00:56:19,799 Speaker 4: call AI these days is predicated on enormous data collection, right, 1085 00:56:19,920 --> 00:56:21,799 Speaker 4: And so to one eccent, it's just sort of an 1086 00:56:21,840 --> 00:56:24,879 Speaker 4: excuse to go about claiming access to all that data. 1087 00:56:25,440 --> 00:56:27,160 Speaker 4: And once you have access to all that data, you 1088 00:56:27,200 --> 00:56:28,640 Speaker 4: can do things with it that have nothing to do 1089 00:56:28,840 --> 00:56:31,920 Speaker 4: with the large language models, and so there's you know, 1090 00:56:32,000 --> 00:56:35,480 Speaker 4: this is I think less typically less immediately like threatening 1091 00:56:35,560 --> 00:56:38,360 Speaker 4: to life and limb than the applications that Alex was 1092 00:56:38,400 --> 00:56:41,400 Speaker 4: starting with. But there's a lot of stuff where it's like, actually, 1093 00:56:41,440 --> 00:56:44,360 Speaker 4: we would be better off without all that information about 1094 00:56:44,440 --> 00:56:47,560 Speaker 4: us being out there. And there's an example that came 1095 00:56:47,640 --> 00:56:49,200 Speaker 4: up recently. So did you see this thing about the 1096 00:56:49,280 --> 00:56:53,840 Speaker 4: system called recall that came out with Windows eleven? So 1097 00:56:54,360 --> 00:56:56,320 Speaker 4: this thing, Oh god, this is such a mess. So 1098 00:56:56,960 --> 00:56:58,719 Speaker 4: initially it was going to be by default turned on. 1099 00:56:59,040 --> 00:57:01,160 Speaker 2: Oh yes, right, yeah, this is kind of like the 1100 00:57:01,200 --> 00:57:02,120 Speaker 2: Adobe story too. 1101 00:57:02,239 --> 00:57:03,560 Speaker 5: Yeah, yeah, every five. 1102 00:57:03,480 --> 00:57:05,920 Speaker 4: Seconds it takes a picture of your screen and then 1103 00:57:06,040 --> 00:57:08,840 Speaker 4: you can use that to like using AI search for 1104 00:57:08,880 --> 00:57:11,240 Speaker 4: stuff that you've sort of and their example is something stupid. 1105 00:57:11,239 --> 00:57:12,560 Speaker 4: It's like, yeah, I saw a recipe, but I don't 1106 00:57:12,560 --> 00:57:13,759 Speaker 4: remember where I saw it, So you want to be 1107 00:57:13,760 --> 00:57:16,240 Speaker 4: able to search back through your activity and like zero 1108 00:57:16,400 --> 00:57:19,520 Speaker 4: thought to what this means for people who are victims 1109 00:57:19,720 --> 00:57:24,800 Speaker 4: of intimate partner violence, right that they have this surveillance 1110 00:57:24,880 --> 00:57:27,200 Speaker 4: going on in their computer that eventually it ended up 1111 00:57:27,440 --> 00:57:30,720 Speaker 4: being shipped as off by default because the cybersecurity folks 1112 00:57:30,880 --> 00:57:33,000 Speaker 4: pushed back really hard. And by folks, I don't mean 1113 00:57:33,040 --> 00:57:34,560 Speaker 4: the people at Microsoft, I mean the people out in 1114 00:57:34,600 --> 00:57:37,040 Speaker 4: the world or saw this coming. 
Yeah, but that's another 1115 00:57:37,120 --> 00:57:40,960 Speaker 4: example of like surveillance in the name of AI that's 1116 00:57:40,960 --> 00:57:43,560 Speaker 4: supposed to be the sort of, you know, helpful little 1117 00:57:43,640 --> 00:57:45,440 Speaker 4: thing for you, but like no thought to what that 1118 00:57:45,560 --> 00:57:46,840 Speaker 4: means for people. And it's like, yeah, we're just going 1119 00:57:46,880 --> 00:57:49,760 Speaker 4: to turn this on by default because everybody wants this obviously, right. 1120 00:57:50,080 --> 00:57:52,920 Speaker 2: It's like, no, I know how to look through my history. 1121 00:57:53,160 --> 00:57:56,680 Speaker 2: Actually I've developed that skill. Yeah, I don't need you 1122 00:57:56,800 --> 00:57:59,640 Speaker 2: to take snapshots of my desktop every three seconds. 1123 00:58:00,080 --> 00:58:03,320 Speaker 3: Your show's covered so many kind of upsetting ways that 1124 00:58:04,120 --> 00:58:07,360 Speaker 3: it doesn't seem like it's people implementing AI. It's companies 1125 00:58:07,440 --> 00:58:11,280 Speaker 3: implementing AI in a lot of cases to do jobs 1126 00:58:11,360 --> 00:58:14,840 Speaker 3: that it's not capable of doing. There there's been incorrect 1127 00:58:14,880 --> 00:58:20,000 Speaker 3: obituaries Grock, the Elon Musk won the Twitter one made 1128 00:58:20,040 --> 00:58:23,880 Speaker 3: up fake headlines about Iran attacking Israel, and like public 1129 00:58:24,080 --> 00:58:27,080 Speaker 3: like put them out as like a major trending story. 1130 00:58:27,720 --> 00:58:31,000 Speaker 3: You have this great anecdote about a Facebook chatbot AI 1131 00:58:31,560 --> 00:58:35,120 Speaker 3: like responding to someone had this like very specific question 1132 00:58:35,280 --> 00:58:38,160 Speaker 3: they have, like a gifted disabled child. They were like, he, 1133 00:58:38,320 --> 00:58:42,400 Speaker 3: does anybody have experience with a gifted disabled like two 1134 00:58:42,480 --> 00:58:46,720 Speaker 3: E child with like this specific New York public school 1135 00:58:47,000 --> 00:58:51,200 Speaker 3: program And the chatbot responds, yes, I have experience with that, 1136 00:58:51,280 --> 00:58:53,840 Speaker 3: and just like made up because they knew that's what 1137 00:58:54,040 --> 00:58:57,360 Speaker 3: that's what they wanted to hear. And fortunately it was 1138 00:58:57,400 --> 00:58:59,919 Speaker 3: like clearly labeled as an AI chatbot. So the person 1139 00:59:00,160 --> 00:59:05,920 Speaker 3: was like, what the black mirror, but World Health Organization, 1140 00:59:06,080 --> 00:59:10,960 Speaker 3: you know, eating disorder institutions replacing therapists with AI, Like 1141 00:59:11,320 --> 00:59:17,240 Speaker 3: you just have all these examples of this going being 1142 00:59:17,960 --> 00:59:22,280 Speaker 3: used where it shouldn't be and things going badly, and 1143 00:59:23,280 --> 00:59:25,880 Speaker 3: like that. There's a detail that I think we talked 1144 00:59:25,880 --> 00:59:32,280 Speaker 3: about last time about dual Lingo, where the model where 1145 00:59:32,440 --> 00:59:35,360 Speaker 3: they let AI take over some of the stuff that 1146 00:59:35,480 --> 00:59:39,840 Speaker 3: like human teachers and translators were doing before. 
And you 1147 00:59:39,960 --> 00:59:42,720 Speaker 3: made the point that people who are learning the language, 1148 00:59:42,720 --> 00:59:45,080 Speaker 3: who are beginners, are not in a position to notice 1149 00:59:45,200 --> 00:59:48,560 Speaker 3: that the quality has dropped. Yeah. And I feel like 1150 00:59:48,680 --> 00:59:52,360 Speaker 3: that's what we're seeing basically everywhere now, is just, the 1151 00:59:52,480 --> 00:59:55,960 Speaker 3: Internet is so big, they're using it in so many 1152 00:59:56,040 --> 01:00:00,120 Speaker 3: different places that it's hard to catch them all, and 1153 01:00:00,240 --> 01:00:04,960 Speaker 3: then there is not an appetite to report on all 1154 01:00:05,040 --> 01:00:08,920 Speaker 3: the ways it's fucking up, and so everything 1155 01:00:09,080 --> 01:00:14,840 Speaker 3: is kind of getting slightly to drastically shittier at once. Yeah, 1156 01:00:15,440 --> 01:00:17,440 Speaker 3: and I don't know what to do with that. 1157 01:00:18,680 --> 01:00:21,400 Speaker 1: I would say, yeah, well, go ahead, Emily. 1158 01:00:21,840 --> 01:00:23,280 Speaker 5: What you do with that is you make fun of it. 1159 01:00:23,400 --> 01:00:25,760 Speaker 4: That's one of our things, is ridicule as praxis, to, like, 1160 01:00:26,200 --> 01:00:28,360 Speaker 4: you know, try to keep the mood up, 1161 01:00:28,440 --> 01:00:29,480 Speaker 4: but also just show it for 1162 01:00:29,520 --> 01:00:30,800 Speaker 5: how ridiculous it is. 1163 01:00:31,600 --> 01:00:33,600 Speaker 4: And then the other thing is to really seek out 1164 01:00:34,000 --> 01:00:36,160 Speaker 4: the good journalism on this topic, because so much of 1165 01:00:36,200 --> 01:00:39,200 Speaker 4: it is either fake journalism output by a large language 1166 01:00:39,200 --> 01:00:43,480 Speaker 4: model these days, or journalists who are basically practicing access 1167 01:00:43,560 --> 01:00:46,160 Speaker 4: journalism, who are doing the gee-whiz thing, who are reproducing 1168 01:00:46,200 --> 01:00:48,520 Speaker 4: press releases, and so finding the people who are doing 1169 01:00:48,600 --> 01:00:50,720 Speaker 4: really good critical work and, like, supporting them, I think, 1170 01:00:50,800 --> 01:00:51,520 Speaker 4: is super important. 1171 01:00:52,360 --> 01:00:53,760 Speaker 1: You were going to say? Well, I was, well, though, 1172 01:00:53,920 --> 01:00:55,600 Speaker 1: you just teed me up really well, because I was 1173 01:00:55,640 --> 01:00:57,840 Speaker 1: actually going to say, you know, some of the people 1174 01:00:57,840 --> 01:00:59,680 Speaker 1: who are doing some of the best work on it 1175 01:00:59,720 --> 01:01:02,920 Speaker 1: are, like, 404 Media, and, you know, I 1176 01:01:03,040 --> 01:01:04,560 Speaker 1: want to give a shout out to them, because 1177 01:01:04,720 --> 01:01:08,880 Speaker 1: you know, these folks are basically, you know, they were 1178 01:01:08,920 --> 01:01:14,200 Speaker 1: at Motherboard, and Motherboard, you know, uh, the whole Vice 1179 01:01:15,240 --> 01:01:20,000 Speaker 1: empire was basically, you know, sunset, and so 1180 01:01:20,120 --> 01:01:22,840 Speaker 1: they laid off a bunch of people.
So they started 1181 01:01:22,880 --> 01:01:25,880 Speaker 1: this kind of journalist-owned-and-operated place, and, you 1182 01:01:26,000 --> 01:01:29,960 Speaker 1: know, their focus is specifically on tech and AI, and 1183 01:01:30,080 --> 01:01:33,120 Speaker 1: these folks have been kind of in the game for 1184 01:01:33,280 --> 01:01:36,560 Speaker 1: so long, they know how to talk about this 1185 01:01:36,640 --> 01:01:40,480 Speaker 1: stuff without really being bowled over. 1186 01:01:41,200 --> 01:01:44,800 Speaker 1: You know, there's people who play that access journalism game, 1187 01:01:45,440 --> 01:01:48,560 Speaker 1: like Kara Swisher, who, like, kind of poses herself as 1188 01:01:48,680 --> 01:01:52,440 Speaker 1: this person who is very antagonistic, but, like, you know. 1189 01:01:52,520 --> 01:01:55,120 Speaker 2: Right, just, like, fawning over, like, AI people, and. 1190 01:01:55,920 --> 01:02:00,160 Speaker 1: Like, all the time. Well, I trusted Elon Musk 1191 01:02:00,320 --> 01:02:01,840 Speaker 1: until... and I was like, well, why did you trust this 1192 01:02:02,040 --> 01:02:04,720 Speaker 1: man in the first place? Like, you know, I 1193 01:02:04,840 --> 01:02:09,160 Speaker 1: was reading the, uh, the Peter Thiel biography, The Contrarian, 1194 01:02:09,800 --> 01:02:12,960 Speaker 1: and, you know, it's a very, it's 1195 01:02:13,000 --> 01:02:16,240 Speaker 1: a very harrowing read. I mean, it was fascinating, but 1196 01:02:16,400 --> 01:02:19,200 Speaker 1: it was very harrowing. It was pretty, 1197 01:02:19,320 --> 01:02:23,920 Speaker 1: like, critical, but, like, you know, they discuss the PayPal 1198 01:02:24,080 --> 01:02:27,480 Speaker 1: days, you know, twenty-four years ago, when, you know, 1199 01:02:27,680 --> 01:02:31,000 Speaker 1: Elon Musk was like, well, I want to rename PayPal 1200 01:02:31,120 --> 01:02:34,840 Speaker 1: to X, and then everybody was like, why 1201 01:02:34,920 --> 01:02:37,320 Speaker 1: the fuck would you do that? People are already 1202 01:02:37,760 --> 01:02:40,960 Speaker 1: using PayPal as a verb. You know, that's 1203 01:02:41,200 --> 01:02:44,120 Speaker 1: effectively the same thing he did with Twitter. Like, people 1204 01:02:44,160 --> 01:02:46,560 Speaker 1: are using tweet as a verb, why would you... 1205 01:02:46,600 --> 01:02:51,000 Speaker 1: You know, he's just been, like, an absolutely vapid human 1206 01:02:51,080 --> 01:02:55,240 Speaker 1: being with no business sense. Anyways, that was a 1207 01:02:55,360 --> 01:02:58,800 Speaker 1: very long way of saying Kara Swisher sucks, and then 1208 01:02:59,080 --> 01:03:02,720 Speaker 1: also saying that there's lots 1209 01:03:02,720 --> 01:03:05,320 Speaker 1: of folks, there's a number of folks doing great stuff. 1210 01:03:05,320 --> 01:03:08,160 Speaker 1: So, I mean, folks at 404, Karen Hao, 1211 01:03:08,240 --> 01:03:12,280 Speaker 1: who's independent but had been at The Atlantic and 1212 01:03:12,360 --> 01:03:15,280 Speaker 1: MIT Tech Review and the Wall Street Journal. Khari Johnson, who 1213 01:03:15,360 --> 01:03:19,080 Speaker 1: was at Wired, is now at CalMatters.
There's a 1214 01:03:19,120 --> 01:03:22,520 Speaker 1: lot of people that really report on AI from the 1215 01:03:22,560 --> 01:03:25,680 Speaker 1: perspective of like the people who it's harming, rather than 1216 01:03:25,760 --> 01:03:29,240 Speaker 1: starting from, well, this tool can do X, Y and Z, right, 1217 01:03:29,520 --> 01:03:32,480 Speaker 1: you know, where we just take these groups at their claims. 1218 01:03:32,840 --> 01:03:34,520 Speaker 1: But yeah, I mean the larger part of it is 1219 01:03:34,600 --> 01:03:37,240 Speaker 1: I mean, there's just so much stuff out there, you know, 1220 01:03:37,360 --> 01:03:39,919 Speaker 1: and it's so hard, and it is like whack- 1221 01:03:40,000 --> 01:03:44,400 Speaker 1: a-mole. And I mean, we're not journalists by training. 1222 01:03:44,560 --> 01:03:47,920 Speaker 1: I mean we're sort of doing a journalistic thing right now. 1223 01:03:50,280 --> 01:03:53,880 Speaker 1: I think, I would not say we are journalists. 1224 01:03:53,920 --> 01:03:56,040 Speaker 1: I always say we are doing a journalistic thing. 1225 01:03:58,120 --> 01:03:59,200 Speaker 2: We're doing journalism. 1226 01:03:59,600 --> 01:04:03,160 Speaker 1: We are not doing original reporting. Sure, but it 1227 01:04:03,360 --> 01:04:06,880 Speaker 1: is, well, and you know, I'm not, 1228 01:04:06,960 --> 01:04:09,040 Speaker 1: I don't know, I'm not, I don't 1229 01:04:09,040 --> 01:04:11,720 Speaker 1: know who decides this is the court of journalism. But 1230 01:04:11,880 --> 01:04:14,440 Speaker 1: you know, reporting insofar as looking at original 1231 01:04:14,560 --> 01:04:18,280 Speaker 1: papers and effectively being like, okay, this is marketing. 1232 01:04:18,400 --> 01:04:22,000 Speaker 3: This is why it's marketing. No, they're there, yeah, rather 1233 01:04:22,120 --> 01:04:22,680 Speaker 3: than, you. 1234 01:04:22,720 --> 01:04:28,760 Speaker 1: Know, a whizbang CNET article or something that comes 1235 01:04:28,800 --> 01:04:32,840 Speaker 1: out of a content mill and says Google just published 1236 01:04:33,040 --> 01:04:36,400 Speaker 1: this tool that says you can, you know, find eighteen 1237 01:04:36,480 --> 01:04:39,560 Speaker 1: million materials that are, you know, complete. Almost 1238 01:04:39,600 --> 01:04:42,400 Speaker 1: like, okay, well, let's look at those claims, and upon 1239 01:04:42,600 --> 01:04:45,880 Speaker 1: what grounds do those claims stand? And, you know, 1240 01:04:46,040 --> 01:04:47,560 Speaker 1: that's a pretty. 1241 01:04:48,360 --> 01:04:50,800 Speaker 4: I think what we're doing is, first of all, 1242 01:04:50,880 --> 01:04:53,240 Speaker 4: sharing our expertise in our specific fields, but also like 1243 01:04:53,480 --> 01:04:55,280 Speaker 4: modeling for people how to be critical. 1244 01:04:55,000 --> 01:04:56,600 Speaker 5: Consumers of journalism. 1245 01:04:57,960 --> 01:05:02,680 Speaker 4: So journalism-adjacent, but yeah, definitely without training in journalism, totally. 1246 01:05:02,760 --> 01:05:04,680 Speaker 3: Yeah. But I think we want 1247 01:05:04,720 --> 01:05:07,439 Speaker 3: to do the M&M's article. I mean, oh 1248 01:05:07,680 --> 01:05:11,760 Speaker 3: my gosh, there's this article that has like done our 1249 01:05:11,880 --> 01:05:15,520 Speaker 3: brains in because it just has this series of sentences.
1250 01:05:15,560 --> 01:05:18,840 Speaker 2: That, I don't know, because everything is degrading, like journalism. 1251 01:05:18,960 --> 01:05:20,840 Speaker 2: You know, there's that story about like the Daily Mail 1252 01:05:20,960 --> 01:05:23,360 Speaker 2: was like Natalie Portman was hooked on cocaine when she 1253 01:05:23,480 --> 01:05:25,160 Speaker 2: was at Harvard. You're like, no, that was from that 1254 01:05:25,360 --> 01:05:28,440 Speaker 2: rap she did on SNL and that was like a bit. 1255 01:05:28,640 --> 01:05:31,720 Speaker 2: But because this thing is street and then the Daily 1256 01:05:31,800 --> 01:05:33,840 Speaker 2: Mail had to be like at the end they corrected it. 1257 01:05:33,920 --> 01:05:36,480 Speaker 2: They're like, uh, she was not. That was obviously 1258 01:05:36,640 --> 01:05:39,360 Speaker 2: satirical and that was due to human error. Like, they 1259 01:05:39,480 --> 01:05:40,520 Speaker 2: really leaned into that. 1260 01:05:40,760 --> 01:05:42,280 Speaker 4: Like, no, yeah, of course. Did I tell you about the 1261 01:05:42,320 --> 01:05:44,880 Speaker 4: time that the fabricated quote of mine came out of 1262 01:05:44,920 --> 01:05:46,479 Speaker 4: one of these things and was printed as news? 1263 01:05:46,680 --> 01:05:46,720 Speaker 2: No? 1264 01:05:46,960 --> 01:05:50,320 Speaker 4: No? So I also, like Alex, search my own 1265 01:05:50,400 --> 01:05:51,920 Speaker 4: name, because I talk to journalists, and I 1266 01:05:52,000 --> 01:05:54,040 Speaker 4: like to see what's happening. And there was something in 1267 01:05:54,120 --> 01:05:57,400 Speaker 4: an outfit called Bihar Prabha that attributed this quote to me, 1268 01:05:57,440 --> 01:05:59,960 Speaker 4: which is not something I'd ever said, and not anybody 1269 01:06:00,200 --> 01:06:03,240 Speaker 4: I remember talking to. So I emailed the editor and I said, 1270 01:06:03,520 --> 01:06:06,360 Speaker 4: please take down this fabricated quote and print a retraction, 1271 01:06:06,440 --> 01:06:08,880 Speaker 4: because I never said that, and they did. So the 1272 01:06:08,960 --> 01:06:11,760 Speaker 4: article got updated to remove the thing attributed to me, and 1273 01:06:11,840 --> 01:06:13,680 Speaker 4: then there was a thing at the bottom saying we've 1274 01:06:13,680 --> 01:06:16,120 Speaker 4: retracted this. But what they didn't put publicly, but he 1275 01:06:16,200 --> 01:06:18,360 Speaker 4: told me over email, is that the whole thing came 1276 01:06:18,400 --> 01:06:19,040 Speaker 4: out of Gemini. 1277 01:06:19,920 --> 01:06:20,360 Speaker 3: Oh wow. 1278 01:06:20,400 --> 01:06:22,840 Speaker 4: And they posted it as a news article, of course. 1279 01:06:22,840 --> 01:06:25,400 Speaker 4: And you know, the only reason I discovered it was 1280 01:06:25,440 --> 01:06:27,640 Speaker 4: it was my own name and, like, I never said 1281 01:06:27,680 --> 01:06:27,960 Speaker 4: that thing. 1282 01:06:28,320 --> 01:06:31,560 Speaker 2: Well, I need your expertise here to decipher this Food 1283 01:06:31,600 --> 01:06:35,000 Speaker 2: and Wine article that was talking about how M&M's was 1284 01:06:35,080 --> 01:06:39,760 Speaker 2: coming out with a pumpkin pie flavored M&M, but very early. 1285 01:06:40,000 --> 01:06:42,720 Speaker 2: Normally pumpkin pie flavored things don't enter the market till 1286 01:06:42,720 --> 01:06:45,360 Speaker 2: around August, like around when fall comes.
But M&M's is, 1287 01:06:45,400 --> 01:06:51,160 Speaker 2: which is why we were covering it, because we are journalists, in May, 1288 01:06:51,880 --> 01:06:55,520 Speaker 2: pumpkin spice already? No. But again, they were saying this 1289 01:06:55,640 --> 01:06:59,440 Speaker 2: is because apparently gen Z and millennial consumers are celebrating 1290 01:06:59,560 --> 01:07:03,280 Speaker 2: Halloween earlier. But there is this one section that completely, 1291 01:07:03,520 --> 01:07:08,320 Speaker 2: wait, wait, yeah, I don't know. That's what they're saying, 1292 01:07:08,440 --> 01:07:12,640 Speaker 2: according to their analysis, that we, apparently. 1293 01:07:12,800 --> 01:07:13,800 Speaker 2: So let me read this for you. 1294 01:07:14,120 --> 01:07:14,400 Speaker 3: Quote. 1295 01:07:14,840 --> 01:07:17,520 Speaker 2: The pre-seasonal launch of the Milk Chocolate Pumpkin Pie 1296 01:07:17,520 --> 01:07:20,880 Speaker 2: M&M's is a strategic move that taps into Mars' market research. 1297 01:07:21,160 --> 01:07:24,320 Speaker 2: This research indicates that gen Z and millennials plan to 1298 01:07:24,440 --> 01:07:27,960 Speaker 2: celebrate Halloween by dressing up and planning for the holiday 1299 01:07:28,280 --> 01:07:32,200 Speaker 2: about six point eight weeks beforehand. Well, six point eight 1300 01:07:32,240 --> 01:07:35,000 Speaker 2: weeks from Memorial Day is the fourth of July, so 1301 01:07:35,120 --> 01:07:37,320 Speaker 2: you still have plenty of time to latch onto a 1302 01:07:37,400 --> 01:07:40,200 Speaker 2: pop culture trend and turn it into a creative costume. 1303 01:07:41,680 --> 01:07:46,760 Speaker 3: I don't, right, it doesn't make any sense. 1304 01:07:46,800 --> 01:07:53,600 Speaker 1: I know, I'm fixating on six point eight. 1305 01:07:55,640 --> 01:07:57,400 Speaker 6: What does that even mean? What does that mean? 1306 01:07:57,440 --> 01:07:59,160 Speaker 3: And where did Memorial Day come from? 1307 01:07:59,280 --> 01:07:59,480 Speaker 1: And that? 1308 01:07:59,720 --> 01:08:02,320 Speaker 3: And what is six point eight weeks from Memorial Day? 1309 01:08:02,320 --> 01:08:05,080 Speaker 3: Because it's not any of the days that they said 1310 01:08:05,120 --> 01:08:05,360 Speaker 3: it was. 1311 01:08:05,800 --> 01:08:07,080 Speaker 5: They said July fourth. 1312 01:08:07,600 --> 01:08:09,240 Speaker 3: Wait, and also six point. 1313 01:08:09,040 --> 01:08:11,520 Speaker 2: Eight. Six point eight weeks isn't a real amount of time. That's 1314 01:08:11,640 --> 01:08:15,120 Speaker 2: forty seven point six days? Yeah, what 1315 01:08:15,280 --> 01:08:16,679 Speaker 2: even is six point eight weeks? 1316 01:08:17,240 --> 01:08:21,960 Speaker 4: So if this were real, it's possible that they surveyed 1317 01:08:22,000 --> 01:08:23,720 Speaker 4: a bunch of people and they said when do you 1318 01:08:23,760 --> 01:08:26,439 Speaker 4: start planning your Halloween costume? And those people gave dates 1319 01:08:26,520 --> 01:08:28,320 Speaker 4: and then they averaged that, and that's how you could 1320 01:08:28,320 --> 01:08:28,600 Speaker 4: get to it. 1321 01:08:28,640 --> 01:08:28,960 Speaker 3: I get that. 1322 01:08:29,439 --> 01:08:32,680 Speaker 5: I get that that's fair, but also it totally.
1323 01:08:32,400 --> 01:08:35,840 Speaker 4: Sounds like someone put into a large language model, write 1324 01:08:35,880 --> 01:08:40,160 Speaker 4: an article about why millennials and gen Z are planning 1325 01:08:40,200 --> 01:08:41,320 Speaker 4: their Halloween costumes earlier. 1326 01:08:41,360 --> 01:08:43,120 Speaker 5: So, like, it sounds like that. 1327 01:08:43,479 --> 01:08:45,599 Speaker 2: But also just so odd to say, well, six point 1328 01:08:45,640 --> 01:08:48,240 Speaker 2: eight weeks from Memorial Day is the fourth of July. 1329 01:08:48,520 --> 01:08:50,840 Speaker 2: This article didn't even come out, like, it came out 1330 01:08:50,920 --> 01:08:55,280 Speaker 2: after Memorial Day, and yeah, the fourth. It's just, nothing made sense. 1331 01:08:55,360 --> 01:08:57,720 Speaker 2: And I was like, I don't fucking understand what they're 1332 01:08:57,760 --> 01:09:00,519 Speaker 2: doing to me right now. But again, this is 1333 01:09:00,600 --> 01:09:01,960 Speaker 2: like the insidious part for me. 1334 01:09:02,080 --> 01:09:04,479 Speaker 5: But this appeared in Food and Wine. 1335 01:09:04,720 --> 01:09:06,719 Speaker 2: This is in Food and Wine magazine with a human 1336 01:09:07,600 --> 01:09:10,160 Speaker 2: like in the byline, and I actually DMed this 1337 01:09:10,280 --> 01:09:12,160 Speaker 2: person on Instagram and I said, do you mind just 1338 01:09:12,240 --> 01:09:15,200 Speaker 2: clarifying this part, like, I'm a little bit confused, and 1339 01:09:15,800 --> 01:09:17,519 Speaker 2: I've gotten no response. 1340 01:09:18,160 --> 01:09:20,559 Speaker 1: I'm wondering if it's because, I know that, I mean, 1341 01:09:20,600 --> 01:09:23,920 Speaker 1: there was some good coverage in Futurism and they were 1342 01:09:24,000 --> 01:09:28,360 Speaker 1: talking about this company called AdVon Commerce and the 1343 01:09:28,439 --> 01:09:32,120 Speaker 1: way that basically this company has been 1344 01:09:32,760 --> 01:09:37,680 Speaker 1: making AI-generated articles for a lot of different publications, 1345 01:09:38,360 --> 01:09:42,880 Speaker 1: usually on products, like product placement, right. And so it 1346 01:09:43,000 --> 01:09:46,000 Speaker 1: makes me think it's sort of like, because Food and Wine, 1347 01:09:46,640 --> 01:09:48,439 Speaker 1: you know, may have been one of theirs. I 1348 01:09:48,600 --> 01:09:51,240 Speaker 1: forgot the article, but they had a bunch, they had 1349 01:09:51,320 --> 01:09:54,560 Speaker 1: like, you know, Better Homes and Gardens and, you know, 1350 01:09:54,640 --> 01:09:56,800 Speaker 1: kind of these legacy outlets like that. So I don't 1351 01:09:56,840 --> 01:09:59,880 Speaker 1: know if it's something like that, or this journalist kind 1352 01:09:59,880 --> 01:10:02,080 Speaker 1: of said, write me this thing and I'm just gonna 1353 01:10:02,120 --> 01:10:04,320 Speaker 1: drop it and then go with God, you know. 1354 01:10:05,479 --> 01:10:07,320 Speaker 2: Yeah. Yeah. 1355 01:10:07,520 --> 01:10:10,840 Speaker 3: My other favorite example is this headline I 1356 01:10:10,920 --> 01:10:13,760 Speaker 3: saw somewhere, it's no big secret why Van Vaught isn't 1357 01:10:13,800 --> 01:10:16,800 Speaker 3: around anymore, with a picture of Vince Vaughn, but 1358 01:10:16,880 --> 01:10:23,400 Speaker 3: they just like got his name completely wrong. It's no 1359 01:10:23,520 --> 01:10:27,479 Speaker 3: big secret why Van Vaught isn't around anymore.
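For what it's worth, the hosts are right that the article's math doesn't hold up. As a quick illustration only, assuming the 2024 calendar with Memorial Day on May 27, 2024 (none of which comes from the Food and Wine piece itself): 6.8 weeks is 47.6 days, which lands around July 13, while July 4 is only about 5.4 weeks after Memorial Day. A minimal Python sketch of that date math:

```python
from datetime import date, timedelta

# Assumption for illustration: "Memorial Day" means May 27, 2024,
# since the episode aired June 25, 2024.
memorial_day = date(2024, 5, 27)
weeks = 6.8

days = round(weeks * 7, 1)                      # 6.8 weeks -> 47.6 days
landing = memorial_day + timedelta(days=days)   # date arithmetic ignores the fractional day

print(days)      # 47.6
print(landing)   # 2024-07-13 -- mid-July, not the fourth of July
print((date(2024, 7, 4) - memorial_day).days / 7)   # ~5.43 weeks until July 4
```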
1360 01:10:29,920 --> 01:10:33,400 Speaker 1: I'm like, you know, if I was just scrolling 1361 01:10:33,600 --> 01:10:35,800 Speaker 1: and I'd say like, yeah, you just 1362 01:10:35,840 --> 01:10:37,920 Speaker 1: like, you know, I liked Van Vaught in The Internship, 1363 01:10:38,400 --> 01:10:41,720 Speaker 1: and then, yeah, but then I 1364 01:10:41,760 --> 01:10:43,000 Speaker 1: would have looked at it, and then I would have 1365 01:10:43,040 --> 01:10:45,760 Speaker 1: done a double take. I'm like, wait, wait, wait, did he co- 1366 01:10:46,000 --> 01:10:49,040 Speaker 1: star with, oh, Owen Wilson or something? 1367 01:10:49,280 --> 01:10:52,120 Speaker 2: Yeah, yeah, yeah, Russell Wilson was in that. 1368 01:10:53,000 --> 01:10:54,960 Speaker 4: I think it was the Ad Week reporting that you're thinking of, 1369 01:10:55,040 --> 01:10:56,479 Speaker 4: Alex. Futurism did a bunch of it. But then 1370 01:10:56,520 --> 01:10:58,240 Speaker 4: Ad Week had the whole thing about AdVon and 1371 01:10:58,360 --> 01:10:59,720 Speaker 4: I can't quite, no, no. 1372 01:11:00,360 --> 01:11:02,840 Speaker 1: No, it was Futurism. It was Futurism, yeah, 1373 01:11:02,880 --> 01:11:05,879 Speaker 1: because Ad Week had the thing on this program 1374 01:11:06,760 --> 01:11:09,000 Speaker 1: that Google was offering and they didn't have a name. 1375 01:11:09,120 --> 01:11:13,600 Speaker 4: Oh right, yeah, it was Futurism. Yeah, but it totally sounds. 1376 01:11:13,360 --> 01:11:14,559 Speaker 3: Like what is happening. Yeah. 1377 01:11:14,720 --> 01:11:16,320 Speaker 5: Yeah, I thought you were going to. 1378 01:11:16,320 --> 01:11:19,320 Speaker 4: Talk about the surveillance by M&M's thing, since we said M&M's. 1379 01:11:19,479 --> 01:11:22,679 Speaker 4: So this was somewhere in Canada. There was an M&M's 1380 01:11:22,760 --> 01:11:25,599 Speaker 4: vending machine that was like taking pictures of the students 1381 01:11:26,000 --> 01:11:27,880 Speaker 4: while they were making their purchases. And I forget what 1382 01:11:27,960 --> 01:11:30,240 Speaker 4: the, like, ostensible purpose was, but the students found 1383 01:11:30,280 --> 01:11:32,160 Speaker 4: out and got it removed. 1384 01:11:32,680 --> 01:11:40,040 Speaker 3: Wow, probably freaked out and made a big deal about it, right? Well, 1385 01:11:40,439 --> 01:11:42,639 Speaker 3: I feel like we could talk to you guys once 1386 01:11:42,640 --> 01:11:46,400 Speaker 3: again for three hours. There's so much interesting stuff to 1387 01:11:46,479 --> 01:11:48,800 Speaker 3: talk about. Your show is so great. Thank you both 1388 01:11:48,880 --> 01:11:53,000 Speaker 3: for joining. Yeah, where can people find you? Follow you, 1389 01:11:53,280 --> 01:11:55,560 Speaker 3: all that good stuff? Emily, we'll start with you. 1390 01:11:56,240 --> 01:11:59,400 Speaker 4: Well, first, there's the podcast Mystery AI Hype Theater three thousand, 1391 01:11:59,520 --> 01:12:02,600 Speaker 4: wherever you find any podcast you can find ours. And 1392 01:12:02,720 --> 01:12:05,840 Speaker 4: we've also started a newsletter. If you just search Mystery 1393 01:12:05,880 --> 01:12:08,320 Speaker 4: AI Hype Theater three thousand newsletter, I think it'll turn up.
1394 01:12:08,880 --> 01:12:11,600 Speaker 4: And that's an irregular newsletter where we basically took the 1395 01:12:11,680 --> 01:12:14,360 Speaker 4: things that used to be sort of little tweet storms, 1396 01:12:14,880 --> 01:12:17,559 Speaker 4: and since the social media stuff has gotten fragmented, we're 1397 01:12:17,600 --> 01:12:20,240 Speaker 4: now creating newsletter posts with them. So it's, you know, 1398 01:12:20,640 --> 01:12:26,000 Speaker 4: off the cuff discussions of things. On Twitter, X, and on 1399 01:12:26,400 --> 01:12:30,479 Speaker 4: Mastodon and Bluesky, I'm Emily M. Bender, and I'm 1400 01:12:30,479 --> 01:12:34,160 Speaker 4: also reluctantly using LinkedIn as social media these days. 1401 01:12:34,520 --> 01:12:38,320 Speaker 3: So that's gonna be the one that survives them all, 1402 01:12:38,400 --> 01:12:40,760 Speaker 3: because I know some people kind of need it. 1403 01:12:41,080 --> 01:12:47,080 Speaker 1: Really, the cock, really the cockroaches of, yeah, yeah, yeah. 1404 01:12:47,120 --> 01:12:50,720 Speaker 1: I'm at Alex Hanna, Alex, Hannah, H A N 1405 01:12:50,840 --> 01:12:56,559 Speaker 1: N A, on Twitter, Bluesky. I barely use 1406 01:12:56,600 --> 01:12:58,880 Speaker 1: Bluesky or Mastodon, but Twitter is the best 1407 01:12:58,920 --> 01:13:02,800 Speaker 1: place to find me. Also check out DAIR, DAIR, D 1408 01:13:03,040 --> 01:13:07,080 Speaker 1: A I R hyphen institute dot org. And we're also 1409 01:13:07,240 --> 01:13:13,040 Speaker 1: DAIR underscore institute on Twitter, Mastodon, and, uh, we're not 1410 01:13:13,120 --> 01:13:16,000 Speaker 1: on Bluesky yet, but we're on LinkedIn. But that's, 1411 01:13:16,680 --> 01:13:18,280 Speaker 1: you know, where you can learn a lot about what our 1412 01:13:18,360 --> 01:13:23,599 Speaker 1: institute's doing. Lots of good stuff, amazing colleagues and whatnot. 1413 01:13:24,160 --> 01:13:28,320 Speaker 3: Yeah, amazing. And is there a work of media that 1414 01:13:28,800 --> 01:13:29,679 Speaker 3: you've been enjoying? 1415 01:13:30,320 --> 01:13:31,479 Speaker 5: Yes, I've got one for you. 1416 01:13:31,600 --> 01:13:33,080 Speaker 4: This I think started off as a tweet, but I 1417 01:13:33,080 --> 01:13:35,160 Speaker 4: saw it as a screencap on Mastodon. So it's by 1418 01:13:35,439 --> 01:13:38,439 Speaker 4: Llama in a Tux, and the text is, don't you 1419 01:13:38,560 --> 01:13:41,000 Speaker 4: understand that the human race is an endless number of 1420 01:13:41,080 --> 01:13:43,599 Speaker 4: monkeys, and every day we produce an endless number of words, 1421 01:13:43,800 --> 01:13:45,400 Speaker 4: and one of us already wrote Hamlet. 1422 01:13:47,800 --> 01:13:48,559 Speaker 3: That's really good. 1423 01:13:49,080 --> 01:13:52,840 Speaker 1: That's such a hyper-specific piece of media. 1424 01:13:53,520 --> 01:13:56,320 Speaker 1: I think last time I was 1425 01:13:56,360 --> 01:13:59,080 Speaker 1: on this, I was plugging Worlds Beyond Number, which is 1426 01:13:59,120 --> 01:14:02,360 Speaker 1: a podcast which I'm just absolutely in love with, which 1427 01:14:02,439 --> 01:14:06,080 Speaker 1: is a Dungeons and Dragons actual play podcast, but 1428 01:14:06,200 --> 01:14:09,559 Speaker 1: it's got amazing sound production. I would just like to plug 1429 01:14:09,640 --> 01:14:12,320 Speaker 1: everything on dropout dot tv. I mean, it's 1430 01:14:12,320 --> 01:14:17,320 Speaker 1: a streaming service.
Honestly, it's, uh, you know, Sam Reich, 1431 01:14:17,400 --> 01:14:21,320 Speaker 1: who is the son of Robert Reich, 1432 01:14:21,560 --> 01:14:25,280 Speaker 1: kind of a liberal darling and former Secretary of Labor 1433 01:14:25,360 --> 01:14:29,960 Speaker 1: in the Clinton administration, has turned CollegeHumor into an 1434 01:14:30,000 --> 01:14:34,920 Speaker 1: area of, like, really great comedians. So they're putting 1435 01:14:34,960 --> 01:14:38,360 Speaker 1: out a lot of great stuff. So I'd say, you know, 1436 01:14:38,680 --> 01:14:41,120 Speaker 1: Make Some Noise is coming out with a new season today, 1437 01:14:41,240 --> 01:14:44,519 Speaker 1: which is, it's a really great improv comedy thing, 1438 01:14:44,760 --> 01:14:47,439 Speaker 1: and yeah, let's just go with that. 1439 01:14:47,680 --> 01:14:47,960 Speaker 2: So just. 1440 01:14:49,800 --> 01:14:50,320 Speaker 3: Hilarious. 1441 01:14:50,439 --> 01:14:54,400 Speaker 1: Those Very Important People interviews, Vic Michaelis. I named one of 1442 01:14:54,439 --> 01:14:59,560 Speaker 1: my chickens Vehicular Manslaughter after an inside joke there, and 1443 01:14:59,720 --> 01:15:04,439 Speaker 1: another one Thomas Shrinkley. So yeah, just incredible, incredible stuff. 1444 01:15:04,840 --> 01:15:09,280 Speaker 3: Yeah, shout out to Sam, he's one of the best. Miles. Yes, 1445 01:15:09,880 --> 01:15:12,280 Speaker 3: where can people find you? Is there a work of media you 1446 01:15:12,439 --> 01:15:12,960 Speaker 3: can enjoy? 1447 01:15:13,160 --> 01:15:15,960 Speaker 2: Wherever they have at symbols, look for at Miles of Gray. 1448 01:15:16,200 --> 01:15:19,640 Speaker 2: I'm probably there. You can find Jack and I on 1449 01:15:19,760 --> 01:15:23,639 Speaker 2: our basketball podcast, Miles and Jack Got Mad, where we've 1450 01:15:23,880 --> 01:15:28,200 Speaker 2: wrapped up the NBA season, and I have tears streaming 1451 01:15:28,320 --> 01:15:30,960 Speaker 2: down my face with pain and anger as the Celtics 1452 01:15:31,000 --> 01:15:33,519 Speaker 2: win again. And also, if you want to hear me 1453 01:15:33,560 --> 01:15:36,360 Speaker 2: talk about very serious stuff, I'm talking about ninety Day 1454 01:15:36,400 --> 01:15:39,800 Speaker 2: Fiancé on my other show, four twenty Day Fiancé, which 1455 01:15:39,840 --> 01:15:42,880 Speaker 2: you can check out wherever they have podcasts. A tweet 1456 01:15:42,960 --> 01:15:47,760 Speaker 2: I like, first one is from a past guest, Josh 1457 01:15:47,840 --> 01:15:51,479 Speaker 2: Gondelman, who tweeted, I bet the best part of being in 1458 01:15:51,560 --> 01:15:53,800 Speaker 2: a throuple is you have someone to do all three 1459 01:15:53,920 --> 01:15:58,519 Speaker 2: Beastie Boys parts in karaoke. That's, I guess, one way to 1460 01:15:58,560 --> 01:16:01,760 Speaker 2: look at that. And then the one from another past guest, 1461 01:16:01,800 --> 01:16:05,360 Speaker 2: Demi Adejuyigbe, at electrolemon, got his account hacked, 1462 01:16:05,400 --> 01:16:07,280 Speaker 2: and he tweeted, Hi, hello, it's Demi. I got my 1463 01:16:07,320 --> 01:16:09,720 Speaker 2: account back. I feel the need to clarify that under 1464 01:16:09,800 --> 01:16:13,320 Speaker 2: no circumstances should you ever believe that I or anybody 1465 01:16:13,400 --> 01:16:17,800 Speaker 2: on this website is selling cheap MacBooks for charity or otherwise. 1466 01:16:18,120 --> 01:16:20,280 Speaker 2: And what benefit would my signature do
1467 01:16:20,400 --> 01:16:22,640 Speaker 3: To a laptop. Oh yeah, thank you for. 1468 01:16:23,240 --> 01:16:26,040 Speaker 1: I actually remember, because I follow Demi, and I remember 1469 01:16:26,120 --> 01:16:29,120 Speaker 1: when his account got hacked and I thought, man, that's 1470 01:16:29,240 --> 01:16:31,439 Speaker 1: really, and at first I thought it was a 1471 01:16:31,520 --> 01:16:35,080 Speaker 1: bit, because Demi is hilarious. But then I'm just like, 1472 01:16:35,240 --> 01:16:36,519 Speaker 1: what the hell, it's funny. 1473 01:16:36,800 --> 01:16:39,439 Speaker 2: His follow-up tweet was, for anyone who thought I 1474 01:16:39,600 --> 01:16:42,240 Speaker 2: was doing a bit, what's the punchline? 1475 01:16:43,840 --> 01:16:45,680 Speaker 3: My jokes are never so obtuse. 1476 01:16:45,920 --> 01:16:48,200 Speaker 2: I love that he wants you to know, it 1477 01:16:48,320 --> 01:16:50,360 Speaker 2: wasn't all that funny, and I want you to know quick. 1478 01:16:50,840 --> 01:16:51,040 Speaker 3: Yeah. 1479 01:16:51,120 --> 01:16:53,320 Speaker 1: No, I was also trying to find out what the 1480 01:16:53,360 --> 01:16:54,840 Speaker 1: punchline was, right, right? 1481 01:16:55,080 --> 01:16:55,280 Speaker 3: Yeah? 1482 01:16:55,600 --> 01:16:59,800 Speaker 2: Wait, was it so funny that part of you wants 1483 01:16:59,840 --> 01:17:01,599 Speaker 2: to be like, well, hold on, what are you doing here? 1484 01:17:01,840 --> 01:17:03,599 Speaker 3: Yeah? Like, what's the deal here? 1485 01:17:03,640 --> 01:17:05,960 Speaker 2: You don't want to immediately just dismiss Demi because he's 1486 01:17:05,960 --> 01:17:07,000 Speaker 2: such a great comedic mind. 1487 01:17:07,160 --> 01:17:09,839 Speaker 1: Yeah, but yeah, if you do want good Demi content, 1488 01:17:10,040 --> 01:17:13,559 Speaker 1: the Who's Welcome at the Cookout? You can find 1489 01:17:13,640 --> 01:17:15,880 Speaker 1: that, that's some Dropout content that you can get for free 1490 01:17:16,040 --> 01:17:16,599 Speaker 1: on YouTube. 1491 01:17:17,080 --> 01:17:21,120 Speaker 3: There you go. Tweet I've been enjoying, Sleepy, at Sleepy 1492 01:17:21,160 --> 01:17:24,439 Speaker 3: Underscore Nice, tweeted, it's absurd that Diddy Kong wears a 1493 01:17:24,560 --> 01:17:28,600 Speaker 3: hat that says Nintendo. Patently ridiculous. There's no way he 1494 01:17:28,800 --> 01:17:32,599 Speaker 3: understands the significance. It would be like me unknowingly wearing 1495 01:17:32,640 --> 01:17:36,040 Speaker 3: a hat that coincidentally depicts the true form of the universe. 1496 01:17:36,479 --> 01:17:37,760 Speaker 3: Take it off, Kong. 1497 01:17:39,479 --> 01:17:40,680 Speaker 1: That's incredible. 1498 01:17:41,479 --> 01:17:47,120 Speaker 6: Oh my god, it's so fucking good, because, yeah, 1499 01:17:47,200 --> 01:17:50,879 Speaker 6: the second he showed up, you're like, I don't know, yeah, Brandon, 1500 01:17:51,280 --> 01:17:52,120 Speaker 6: he likes Nintendo. 1501 01:17:53,000 --> 01:17:56,360 Speaker 3: You can find me on Twitter at Jack Underscore O'Brien, 1502 01:17:56,400 --> 01:17:59,000 Speaker 3: you can find us on Twitter at Daily Zeitgeist. We're 1503 01:17:59,000 --> 01:18:01,799 Speaker 3: at The Daily Zeitgeist on Instagram.
We have a Facebook 1504 01:18:01,880 --> 01:18:05,240 Speaker 3: fan page and a website, Daily Zeitgeist dot com, where 1505 01:18:05,280 --> 01:18:08,360 Speaker 3: we post our episodes and our footnotes, where 1506 01:18:08,360 --> 01:18:10,559 Speaker 3: we link off to the information that we talked about 1507 01:18:10,960 --> 01:18:12,800 Speaker 3: in today's episode, as well as a song that we 1508 01:18:12,840 --> 01:18:15,639 Speaker 3: think you might enjoy. Miles, what song do you think 1509 01:18:15,720 --> 01:18:16,800 Speaker 3: people might enjoy? 1510 01:18:17,479 --> 01:18:21,880 Speaker 2: I came across this track from like the fifties that 1511 01:18:22,240 --> 01:18:24,760 Speaker 2: is like not really popular. Like, it was playing on 1512 01:18:24,800 --> 01:18:26,559 Speaker 2: the radio, and when 1513 01:18:26,600 --> 01:18:27,960 Speaker 2: I heard it, I was like, wait, what is this 1514 01:18:28,120 --> 01:18:30,840 Speaker 2: song? Because I thought it was like maybe a, like, 1515 01:18:31,000 --> 01:18:33,360 Speaker 2: newer artist doing sort of a send-up of, like, 1516 01:18:33,439 --> 01:18:36,280 Speaker 2: fifties music, like surf music. It's called Out in the 1517 01:18:36,439 --> 01:18:40,200 Speaker 2: Sun, parenthetical Hey-O, and it is a bit like 1518 01:18:40,320 --> 01:18:44,240 Speaker 2: Belafonte's Day-O, and kind of has this, like, sort 1519 01:18:44,240 --> 01:18:47,360 Speaker 2: of similar sort of cadence to the verse. But it's 1520 01:18:47,520 --> 01:18:49,639 Speaker 2: just, like, when I heard it, I'm like, this sounds 1521 01:18:49,720 --> 01:18:52,639 Speaker 2: like the kind of song Tarantino would pluck 1522 01:18:52,680 --> 01:18:55,719 Speaker 2: from obscurity and then put under, like, a really dark scene, 1523 01:18:56,080 --> 01:18:57,920 Speaker 2: and it's just got, like, it's like a beat song, 1524 01:18:58,000 --> 01:19:00,479 Speaker 2: but there's this like darkness to it that I really love. 1525 01:19:00,840 --> 01:19:04,400 Speaker 2: But anyway, this is The Beach Nuts with Out in the Sun, 1526 01:19:05,240 --> 01:19:06,400 Speaker 2: so yeah, check this song out. 1527 01:19:06,479 --> 01:19:08,880 Speaker 3: When did that actually come out? Is it recent? 1528 01:19:09,080 --> 01:19:09,320 Speaker 2: You know? 1529 01:19:09,360 --> 01:19:10,080 Speaker 3: It's from the fifties? 1530 01:19:10,120 --> 01:19:11,880 Speaker 2: No, it's from the fifties. Like, they're an actual band. 1531 01:19:12,000 --> 01:19:15,479 Speaker 3: Like the lyrics are like, hey there, girls, where are 1532 01:19:15,600 --> 01:19:16,200 Speaker 3: you going? 1533 01:19:16,520 --> 01:19:18,880 Speaker 2: And they're like, down to the beach is 1534 01:19:18,960 --> 01:19:19,960 Speaker 3: where we're going? 1535 01:19:21,040 --> 01:19:24,880 Speaker 2: The lyrics are so literal, but there's this, like, charm to 1536 01:19:25,000 --> 01:19:27,479 Speaker 2: it, and the instrumentation is cool. So anyway, this is 1537 01:19:27,560 --> 01:19:30,679 Speaker 2: The Beach Nuts with Out in the Sun, parenthetical. 1538 01:19:30,200 --> 01:19:33,560 Speaker 3: Fail. All right, well, we will link off to that 1539 01:19:33,760 --> 01:19:36,760 Speaker 3: in the footnotes. The Daily Zeitgeist is a production of iHeartRadio.
1540 01:19:36,800 --> 01:19:38,880 Speaker 3: For more podcasts from iHeartRadio, visit the iHeart 1541 01:19:38,960 --> 01:19:42,120 Speaker 3: Radio app, Apple Podcasts, or wherever fine podcasts are given 1542 01:19:42,120 --> 01:19:44,040 Speaker 3: away for free. That's going to do it for us 1543 01:19:44,280 --> 01:19:46,720 Speaker 3: this morning. We're back this afternoon to tell you 1544 01:19:46,800 --> 01:19:48,880 Speaker 3: what is trending, and we will talk to y'all then, 1545 01:19:49,040 --> 01:19:51,360 Speaker 3: bye bye, bye bye