1 00:00:09,920 --> 00:00:15,640 Speaker 1: There's this TikTok that somebody posted where she was getting waxed. 2 00:00:16,200 --> 00:00:19,000 Speaker 1: I had a Brazilian wax done about three weeks ago, 3 00:00:19,280 --> 00:00:21,759 Speaker 1: and it's been haunting me ever since. 4 00:00:22,720 --> 00:00:25,840 Speaker 2: The girl that was giving 5 00:00:25,560 --> 00:00:31,319 Speaker 1: me the wax, she was wearing Meta glasses, and she's like, 6 00:00:31,880 --> 00:00:36,800 Speaker 1: they're not charged, they're not on, like, I promise. Wearable 7 00:00:36,840 --> 00:00:40,640 Speaker 1: technology is having a moment. There's been this wave building 8 00:00:40,680 --> 00:00:43,680 Speaker 1: for a while where the stuff that seemed like science fiction, 9 00:00:43,920 --> 00:00:48,240 Speaker 1: like having a computer on your wrist, is completely mainstream now, 10 00:00:48,440 --> 00:00:52,159 Speaker 1: and now the next step of smart glasses is starting 11 00:00:52,200 --> 00:00:54,960 Speaker 1: to get there too. The best example of this is 12 00:00:55,040 --> 00:00:58,600 Speaker 1: Meta's Ray-Bans. These glasses blend in so well that 13 00:00:58,680 --> 00:01:01,240 Speaker 1: people are starting to wear them as their normal, everyday 14 00:01:01,320 --> 00:01:05,080 Speaker 1: prescription glasses, with the added benefit of being able to, 15 00:01:05,319 --> 00:01:09,959 Speaker 1: among other things, record audio and video. And this brings 16 00:01:09,959 --> 00:01:14,120 Speaker 1: up a lot of questions about privacy. These aren't new questions. 17 00:01:14,120 --> 00:01:15,960 Speaker 1: We've had well over a decade to think about this, 18 00:01:16,560 --> 00:01:19,399 Speaker 1: but it's got to be uncomfortable to have this societal 19 00:01:19,480 --> 00:01:23,600 Speaker 1: quandary about surveillance thrown at you when you're not wearing 20 00:01:23,640 --> 00:01:28,640 Speaker 1: any pants. I could not stop thinking, like, could this 21 00:01:28,680 --> 00:01:31,720 Speaker 1: girl be filming me right now? Like, could she be filming? 22 00:01:34,480 --> 00:01:36,679 Speaker 3: Meta will tell you that they have an LED indicator 23 00:01:36,760 --> 00:01:41,360 Speaker 3: light that goes on when you are recording. But someone's 24 00:01:41,400 --> 00:01:42,720 Speaker 3: going to figure out a way to hack that. 25 00:01:43,560 --> 00:01:46,080 Speaker 1: Victoria Song is a senior reviewer at The Verge. 26 00:01:46,440 --> 00:01:48,960 Speaker 3: Someone's going to figure out a sticker to buy so 27 00:01:49,000 --> 00:01:51,000 Speaker 3: that you can put it over the case. Like, what 28 00:01:51,160 --> 00:01:54,520 Speaker 3: is going to be the social etiquette going forward for 29 00:01:54,600 --> 00:01:57,920 Speaker 3: these glasses? Are they going to be banned from certain professions? 30 00:01:58,280 --> 00:02:00,760 Speaker 1: Victoria's been testing and reporting on wearables for over 31 00:02:00,800 --> 00:02:04,120 Speaker 1: a decade, so she's seen the technology evolve from fitness 32 00:02:04,120 --> 00:02:06,920 Speaker 1: trackers that count your steps, to rings that track your 33 00:02:06,920 --> 00:02:11,960 Speaker 1: heart rate, to smartwatches, to pendants and glasses. But this 34 00:02:12,120 --> 00:02:13,680 Speaker 1: time feels a little different. 35 00:02:14,080 --> 00:02:17,640 Speaker 3: For whatever reason, the big tech powers that be have 36 00:02:17,760 --> 00:02:19,119 Speaker 3: convened to decide
37 00:02:18,800 --> 00:02:19,680 Speaker 4: that wearables are it. 38 00:02:20,200 --> 00:02:23,400 Speaker 3: Wearables are how we're going to get AI to really 39 00:02:23,639 --> 00:02:27,040 Speaker 3: take off. Like, we're gonna have AI leave the computer 40 00:02:27,320 --> 00:02:30,519 Speaker 3: and be on your body, be on you as a person, 41 00:02:30,560 --> 00:02:32,480 Speaker 3: and you're going to take AI with you in every 42 00:02:32,560 --> 00:02:36,440 Speaker 3: aspect of your life. Is that the future that we're 43 00:02:36,480 --> 00:02:38,440 Speaker 3: heading towards? Is that the future we want to head towards? 44 00:02:38,440 --> 00:02:40,080 Speaker 3: These are the conversations that we're going to have to 45 00:02:40,080 --> 00:02:47,880 Speaker 3: start having. 46 00:02:50,160 --> 00:02:56,359 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Kill Switch. 47 00:02:57,120 --> 00:03:37,280 Speaker 5: I'm Dexter Thomas. 48 00:03:41,960 --> 00:03:44,440 Speaker 1: This isn't the first time we've run into questions about 49 00:03:44,480 --> 00:03:48,400 Speaker 1: privacy and smart glasses. Most people first started thinking about 50 00:03:48,440 --> 00:03:51,400 Speaker 1: this a little over a decade ago, in twenty thirteen, 51 00:03:51,720 --> 00:03:56,080 Speaker 1: with the release of Google Glass. When I first started 52 00:03:56,080 --> 00:03:58,960 Speaker 1: thinking about wearables as maybe something that I might be 53 00:03:59,040 --> 00:04:01,040 Speaker 1: interested in, it was Google Glass. 54 00:04:01,440 --> 00:04:02,680 Speaker 4: Oh yes, Google Glass. 55 00:04:03,200 --> 00:04:05,760 Speaker 1: I see your face. So we obviously got to talk 56 00:04:05,800 --> 00:04:09,280 Speaker 1: about Google Glass. When you first heard about Google Glass, 57 00:04:09,560 --> 00:04:10,560 Speaker 1: what did you think about it? 58 00:04:11,400 --> 00:04:14,920 Speaker 3: Ah, it was sort of straight out of science fiction. 59 00:04:15,240 --> 00:04:19,360 Speaker 3: There was this marketing sizzle reel showing what Google Glass 60 00:04:19,440 --> 00:04:20,240 Speaker 3: was supposed to be. 61 00:04:20,920 --> 00:04:24,279 Speaker 6: You're riding right there. OK, Glass, take a picture. 62 00:04:26,920 --> 00:04:39,200 Speaker 3: Okay, Glass. And you know, it's an augmented reality device 63 00:04:39,360 --> 00:04:44,599 Speaker 3: that shows you notifications of stuff that you need based 64 00:04:44,640 --> 00:04:48,719 Speaker 3: on the world around you in real time. It's a total 65 00:04:48,720 --> 00:04:52,040 Speaker 3: game changer, if you really think about it. Could this actually work? 66 00:04:52,360 --> 00:04:55,320 Speaker 3: A lot of people probably were thinking of things like 67 00:04:55,400 --> 00:04:59,440 Speaker 3: Iron Man and Tony Stark's smart glasses. There's just a 68 00:04:59,480 --> 00:05:03,160 Speaker 3: lot of science fiction imagery that exists with heads up displays, 69 00:05:03,320 --> 00:05:06,280 Speaker 3: even though this technology actually dates all the way back 70 00:05:06,279 --> 00:05:09,160 Speaker 3: to World War Two, with fighter pilots and heads up 71 00:05:09,160 --> 00:05:12,440 Speaker 3: displays in the windshields. If you were to really trace 72 00:05:12,440 --> 00:05:16,720 Speaker 3: the genesis of it, it's a really old concept in technology.
73 00:05:16,760 --> 00:05:19,480 Speaker 3: But I think Google Glass was the first that kind 74 00:05:19,520 --> 00:05:22,360 Speaker 3: of plucked it from science fiction into the realm of, like, hey, 75 00:05:22,400 --> 00:05:24,919 Speaker 3: we have a product, and it's like, I think, what 76 00:05:24,960 --> 00:05:27,640 Speaker 3: was it, like fifteen hundred dollars, somewhere in that ballpark, 77 00:05:27,680 --> 00:05:30,159 Speaker 3: and you can buy it if you have the money. 78 00:05:30,200 --> 00:05:33,600 Speaker 3: And people did wear the Google Glass Explorer Edition out 79 00:05:33,640 --> 00:05:37,280 Speaker 3: into the world, and it was like a really surreal 80 00:05:37,720 --> 00:05:39,080 Speaker 3: thing that happened. 81 00:05:39,720 --> 00:05:43,440 Speaker 1: Google Glass was not subtle. It barely looked like glasses. 82 00:05:43,960 --> 00:05:46,479 Speaker 1: It was this bar that went across your forehead and 83 00:05:46,680 --> 00:05:49,400 Speaker 1: sat on the bridge of your nose, and instead of lenses, 84 00:05:49,720 --> 00:05:52,800 Speaker 1: there was this metal square thing and this inch long 85 00:05:52,880 --> 00:05:55,760 Speaker 1: transparent camera that stuck out over your right eye. 86 00:05:56,400 --> 00:05:59,120 Speaker 3: It reminded me of the Dragon Ball Z scouters that 87 00:05:59,200 --> 00:06:05,839 Speaker 3: Vegeta wore, the power level scouters. Yeah, it reminded me kind 88 00:06:05,839 --> 00:06:08,479 Speaker 3: of like that, because there weren't actually, like, glasses for 89 00:06:08,560 --> 00:06:13,039 Speaker 3: people to see through. What does the scouter say about 90 00:06:13,040 --> 00:06:16,400 Speaker 3: his power level? It's over nine... 91 00:06:17,920 --> 00:06:19,280 Speaker 1: nine thousand! 92 00:06:19,440 --> 00:06:20,479 Speaker 6: There's no way 93 00:06:20,360 --> 00:06:25,200 Speaker 1: that could be right. Yo, I hadn't thought about that. 94 00:06:25,520 --> 00:06:29,320 Speaker 1: It does look like the Dragon Ball Z scouter, that... 95 00:06:29,360 --> 00:06:30,000 Speaker 1: oh my god. 96 00:06:30,360 --> 00:06:31,839 Speaker 5: Yeah, you know. 97 00:06:32,680 --> 00:06:38,240 Speaker 1: Google Glass really leaned into the science fiction aesthetic of it, right? 98 00:06:38,320 --> 00:06:40,320 Speaker 1: It wasn't supposed to blend in. You were supposed to 99 00:06:40,360 --> 00:06:41,760 Speaker 1: stand out when you wore it. 100 00:06:42,080 --> 00:06:42,640 Speaker 4: Yeah, it was 101 00:06:42,760 --> 00:06:45,719 Speaker 3: very distinctive when you saw someone wearing it. You'd see 102 00:06:45,720 --> 00:06:47,320 Speaker 3: it and you'd immediately clock it. 103 00:06:48,520 --> 00:06:51,440 Speaker 1: People would immediately clock it, and in a lot of 104 00:06:51,480 --> 00:06:54,120 Speaker 1: cases they'd make fun of it, not just because of 105 00:06:54,160 --> 00:06:57,040 Speaker 1: surveillance, because they didn't want to be recorded. Sometimes the 106 00:06:57,080 --> 00:07:00,760 Speaker 1: criticism was just more surface level and basic, like, why 107 00:07:00,800 --> 00:07:02,880 Speaker 1: would you want to wear that thing on your face 108 00:07:03,080 --> 00:07:06,600 Speaker 1: and show everybody that you're a weirdo who records everything? 109 00:07:07,160 --> 00:07:09,560 Speaker 1: Back when it came out, The Daily Show had a 110 00:07:09,600 --> 00:07:12,120 Speaker 1: whole segment that was making fun of it.
Our nation 111 00:07:12,240 --> 00:07:15,640 Speaker 1: has long been haunted by discrimination, and while we've made 112 00:07:15,640 --> 00:07:19,160 Speaker 1: great strides over the years overcoming the challenges, there are 113 00:07:19,200 --> 00:07:23,000 Speaker 1: still those that suffer from the barbs of injustice. I 114 00:07:23,040 --> 00:07:26,200 Speaker 1: was denied admission and service at various establishments. I was 115 00:07:26,320 --> 00:07:27,520 Speaker 1: mugged in the Mission District. 116 00:07:27,680 --> 00:07:29,840 Speaker 2: I was asked to leave a coffee shop, and the 117 00:07:30,360 --> 00:07:32,280 Speaker 2: reason why was because, we don't do that 118 00:07:32,560 --> 00:07:32,880 Speaker 5: here. 119 00:07:33,080 --> 00:07:35,240 Speaker 1: Hold on a second. What the... have you all got 120 00:07:35,280 --> 00:07:36,000 Speaker 1: on your faces? 121 00:07:37,320 --> 00:07:39,040 Speaker 6: Google Glass. Google what? 122 00:07:39,320 --> 00:07:42,800 Speaker 1: Google Glass. Google. And that's what this is all about. 123 00:07:43,400 --> 00:07:48,720 Speaker 2: Yes, yes, it seems even in this day and age, 124 00:07:48,760 --> 00:07:51,640 Speaker 2: you can still be treated differently just because of how 125 00:07:51,680 --> 00:07:55,080 Speaker 2: you look wearing a fifteen hundred dollar face computer. 126 00:07:55,840 --> 00:07:59,320 Speaker 3: So you know, that's why it caused the controversies that 127 00:07:59,400 --> 00:08:02,480 Speaker 3: it did. I think the most famous example was just 128 00:08:02,560 --> 00:08:05,880 Speaker 3: a woman wearing the Google Glass while she was out 129 00:08:05,920 --> 00:08:08,280 Speaker 3: in San Francisco and someone ripping it off her face. 130 00:08:12,440 --> 00:08:13,440 Speaker 2: It's on video now. 131 00:08:18,800 --> 00:08:21,480 Speaker 1: Sarah Slocum was a social media consultant, and she was 132 00:08:21,520 --> 00:08:24,000 Speaker 1: wearing Google Glass at a bar in San Francisco. 133 00:08:24,480 --> 00:08:27,640 Speaker 1: She was showing some friends how Google Glass works, and 134 00:08:27,720 --> 00:08:30,080 Speaker 1: some other people at the bar saw it and got 135 00:08:30,120 --> 00:08:32,319 Speaker 1: mad at her and started yelling at her because they 136 00:08:32,440 --> 00:08:35,920 Speaker 1: didn't want to be recorded. So she started recording a 137 00:08:36,000 --> 00:08:39,000 Speaker 1: video of them yelling at her, which maybe is kind 138 00:08:39,000 --> 00:08:42,120 Speaker 1: of ironic, but maybe also a preview of what the 139 00:08:42,160 --> 00:08:45,240 Speaker 1: future was going to look like. It's kind of 140 00:08:45,280 --> 00:08:48,200 Speaker 1: hard to tell because everyone's yelling, but there's a moment 141 00:08:48,200 --> 00:08:50,800 Speaker 1: in the footage where you can hear someone say, get 142 00:08:50,840 --> 00:08:54,040 Speaker 1: your Google Glasses out of here. They start yelling back 143 00:08:54,080 --> 00:08:56,439 Speaker 1: and forth, and she also flips them off, and again, 144 00:08:56,480 --> 00:08:59,440 Speaker 1: this is all on the video that she's recording, and 145 00:08:59,440 --> 00:09:01,839 Speaker 1: then someone grabs the glasses and rips them off of 146 00:09:01,880 --> 00:09:04,840 Speaker 1: her face. Sarah Slocum was featured in the local news 147 00:09:04,840 --> 00:09:07,199 Speaker 1: about this, and she also told her story in a 148 00:09:07,280 --> 00:09:09,319 Speaker 1: Daily Show segment about what happened.
149 00:09:10,040 --> 00:09:15,160 Speaker 6: I was at a bar and people started verbally accosting me. 150 00:09:15,320 --> 00:09:18,160 Speaker 6: They started getting physical immediately when I started recording. They 151 00:09:18,240 --> 00:09:20,400 Speaker 6: ripped them off my face, basically, and they ran outside. 152 00:09:20,880 --> 00:09:23,520 Speaker 6: It was a hate crime. The only thing is that 153 00:09:23,520 --> 00:09:26,480 Speaker 6: they're going to be wearing these things probably in a year. 154 00:09:27,920 --> 00:09:30,680 Speaker 1: I'm going to reserve my comments about calling this a 155 00:09:30,720 --> 00:09:33,080 Speaker 1: hate crime, and instead I'm just going to move on 156 00:09:33,120 --> 00:09:37,160 Speaker 1: to the facts and say that Sarah Slocum's prediction was incorrect. 157 00:09:37,559 --> 00:09:40,440 Speaker 1: Pretty much nobody was wearing these things in a year. 158 00:09:40,880 --> 00:09:43,160 Speaker 3: When you looked at it, you couldn't help but wonder, 159 00:09:43,240 --> 00:09:44,959 Speaker 3: oh, was this person recording me? And I mean, there 160 00:09:45,000 --> 00:09:47,040 Speaker 3: was a recording light and all of that stuff. But 161 00:09:47,520 --> 00:09:48,920 Speaker 3: it really kind 162 00:09:48,760 --> 00:09:50,880 Speaker 4: of forced the whole sci-fi 163 00:09:50,640 --> 00:09:54,120 Speaker 3: aspect of it into current everyday life in a way 164 00:09:54,160 --> 00:09:57,439 Speaker 3: that I don't think society was really prepared for at 165 00:09:57,480 --> 00:09:58,679 Speaker 3: that point in time. 166 00:09:59,360 --> 00:10:04,480 Speaker 1: So I also remember when Google Glass was announced, and 167 00:10:04,520 --> 00:10:06,320 Speaker 1: actually thinking, oh, you know what, this would actually be 168 00:10:06,400 --> 00:10:11,280 Speaker 1: kind of cool. My thought was, I could record concerts 169 00:10:11,720 --> 00:10:13,600 Speaker 1: or something like that. That might be kind of cool, 170 00:10:13,640 --> 00:10:17,199 Speaker 1: because using your phone wasn't really a great experience either. 171 00:10:17,360 --> 00:10:20,880 Speaker 1: You're in people's way. So I go to shows of artists 172 00:10:20,880 --> 00:10:24,880 Speaker 1: who I know. Sometimes they ask me to film their sets, 173 00:10:25,000 --> 00:10:27,280 Speaker 1: but it's kind of weird being a guy up on 174 00:10:27,320 --> 00:10:30,600 Speaker 1: the stage with the camera. But I remember going from 175 00:10:30,679 --> 00:10:35,320 Speaker 1: being kind of interested in the Google Glass to realizing, oh, 176 00:10:35,559 --> 00:10:38,800 Speaker 1: I think maybe people hate this, and then it was gone. 177 00:10:38,920 --> 00:10:42,360 Speaker 1: The life cycle of it was so fast. Did it 178 00:10:42,400 --> 00:10:43,080 Speaker 1: feel like that for you? 179 00:10:43,760 --> 00:10:45,760 Speaker 3: Oh, it absolutely was really fast. And there's a lot of 180 00:10:45,760 --> 00:10:48,920 Speaker 3: factors that go into that. One, the price was prohibitive 181 00:10:49,520 --> 00:10:52,560 Speaker 3: compared to the things that it could actually do. Like, the 182 00:10:52,679 --> 00:10:57,520 Speaker 3: video that they showed was truly sci-fi revolutionary. It couldn't 183 00:10:57,520 --> 00:11:00,520 Speaker 3: actually deliver on that, right? So it was much more 184 00:11:00,600 --> 00:11:03,880 Speaker 3: limited in what it could actually do.
And then it 185 00:11:04,000 --> 00:11:07,760 Speaker 3: was so ostentatious when you wore it that there's no 186 00:11:07,840 --> 00:11:11,840 Speaker 3: way of being discreet, right? So you're basically becoming an 187 00:11:11,920 --> 00:11:15,240 Speaker 3: ambassador of future technology for the average person. Do you 188 00:11:15,280 --> 00:11:17,080 Speaker 3: want to have people come up to you on the 189 00:11:17,080 --> 00:11:18,960 Speaker 3: street and be like, oh, what is it that you're wearing? 190 00:11:19,000 --> 00:11:20,920 Speaker 3: Are you filming me? Are you recording me? Nah, you're 191 00:11:20,960 --> 00:11:23,000 Speaker 3: out here just trying to have a nice time. And 192 00:11:23,240 --> 00:11:25,880 Speaker 3: you know, in twenty thirteen, we were not in the 193 00:11:25,960 --> 00:11:28,800 Speaker 3: TikTok era. We were not in an era where everyone 194 00:11:28,920 --> 00:11:31,480 Speaker 3: is used to people just whipping out their phones and 195 00:11:31,559 --> 00:11:34,600 Speaker 3: filming life. The definition of a public space and a 196 00:11:34,600 --> 00:11:38,440 Speaker 3: private space was different ten years ago than it is now. 197 00:11:38,920 --> 00:11:42,200 Speaker 3: I think Google Glass was like a really unfortunate example 198 00:11:42,360 --> 00:11:46,120 Speaker 3: of a good idea, or at least like a substantive 199 00:11:46,200 --> 00:11:48,400 Speaker 3: idea, and really poor timing. 200 00:11:49,160 --> 00:11:51,760 Speaker 1: And of course there was the phrase that went along 201 00:11:51,840 --> 00:11:53,440 Speaker 1: with anybody who had it, whether or not they were 202 00:11:53,440 --> 00:11:55,080 Speaker 1: doing anything wrong: glasshole. 203 00:11:55,520 --> 00:11:59,679 Speaker 3: Yeah, glasshole. I kind of feel for Google, because they're 204 00:11:59,480 --> 00:12:00,720 Speaker 4: never going to live that down. 205 00:12:02,760 --> 00:12:04,439 Speaker 1: I don't know if I've ever talked to anybody who 206 00:12:04,440 --> 00:12:07,000 Speaker 1: said they feel bad for Google. This is a new one. 207 00:12:07,200 --> 00:12:08,960 Speaker 3: It's one of those things where it's like, I see 208 00:12:09,000 --> 00:12:13,040 Speaker 3: the vision, you know, pun intended. I see you really 209 00:12:13,120 --> 00:12:15,080 Speaker 3: wanted to go with that. I don't think it was 210 00:12:15,200 --> 00:12:18,640 Speaker 3: necessarily a bad idea, but you kind of made some 211 00:12:18,800 --> 00:12:24,000 Speaker 3: choices that were not in step with reality, and now 212 00:12:24,040 --> 00:12:25,600 Speaker 3: you have to live with the fact that you coined 213 00:12:25,640 --> 00:12:30,680 Speaker 3: the term glasshole, like, indelibly, forever, in perpetuity. 214 00:12:32,679 --> 00:12:36,240 Speaker 1: Google Glass was a massive failure for Google. They did 215 00:12:36,280 --> 00:12:38,680 Speaker 1: try to salvage it by selling it to companies where maybe 216 00:12:38,679 --> 00:12:41,200 Speaker 1: there would be some industrial uses for it. The idea 217 00:12:41,320 --> 00:12:43,720 Speaker 1: was that this whole thing about privacy and people feeling 218 00:12:43,800 --> 00:12:46,520 Speaker 1: uncomfortable wouldn't really matter if it's a worker in a 219 00:12:46,559 --> 00:12:50,120 Speaker 1: factory using Google Glass to check inventory or something.
But 220 00:12:50,440 --> 00:12:53,920 Speaker 1: that didn't really catch on either, and Google basically gave 221 00:12:54,040 --> 00:12:56,960 Speaker 1: up on trying to push smart glasses. It looked like, 222 00:12:57,120 --> 00:13:00,840 Speaker 1: as a society, we just didn't want it. But now, 223 00:13:01,160 --> 00:13:03,600 Speaker 1: all of a sudden, there's been a resurgence in wearables, 224 00:13:03,640 --> 00:13:08,480 Speaker 1: and specifically in smart glasses, and all those same questions 225 00:13:08,920 --> 00:13:13,080 Speaker 1: are coming back up. So maybe saying that society didn't 226 00:13:13,120 --> 00:13:16,160 Speaker 1: want it is the wrong framing. Maybe it's just that 227 00:13:16,240 --> 00:13:19,720 Speaker 1: we didn't want it yet. So, are we ready for 228 00:13:19,760 --> 00:13:30,960 Speaker 1: smart glasses now? That's after the break. Where are we 229 00:13:31,160 --> 00:13:36,079 Speaker 1: right now with wearables, and specifically with smart glasses? 230 00:13:36,880 --> 00:13:37,280 Speaker 4: Right now, I 231 00:13:37,280 --> 00:13:40,240 Speaker 3: would say we are entering the age of AI hardware. 232 00:13:40,840 --> 00:13:44,520 Speaker 3: There are these wearable pendants that are always on you 233 00:13:44,640 --> 00:13:47,120 Speaker 3: at any given point in time, listening to every single conversation 234 00:13:47,240 --> 00:13:50,560 Speaker 3: you have, acting as your quote unquote second memory. That's 235 00:13:50,640 --> 00:13:55,000 Speaker 3: one vein of wearable AI hardware. The others are things 236 00:13:55,120 --> 00:14:00,600 Speaker 3: like headphones and smartwatches, existing popular form factors. And 237 00:14:00,640 --> 00:14:04,080 Speaker 3: then, kind of coming out of left field again, we 238 00:14:04,160 --> 00:14:09,079 Speaker 3: have smart glasses, specifically the Ray-Ban Meta smart glasses, 239 00:14:09,240 --> 00:14:13,560 Speaker 3: emerging as a kind of third track of AI hardware. 240 00:14:14,240 --> 00:14:16,679 Speaker 3: You know, I wrote about the Meta smart glasses, and 241 00:14:17,040 --> 00:14:18,839 Speaker 3: I had their comms people reach out to me and 242 00:14:18,880 --> 00:14:21,880 Speaker 3: be like, actually, could you call them AI glasses? Because 243 00:14:22,000 --> 00:14:27,560 Speaker 3: that is how closely tied together their thesis of 244 00:14:27,640 --> 00:14:32,000 Speaker 3: AI and glasses really is. Like, their CTO, I believe Andrew 245 00:14:32,000 --> 00:14:34,880 Speaker 3: Bosworth, has basically gone on the record saying, like, this 246 00:14:34,920 --> 00:14:37,160 Speaker 3: is how we get AI to really take off, and 247 00:14:37,240 --> 00:14:42,720 Speaker 3: it's smart glasses. The ideal hardware form factor 248 00:14:43,480 --> 00:14:46,640 Speaker 3: for AI going forward will be these smart glasses. 249 00:14:47,040 --> 00:14:49,480 Speaker 1: This is kind of interesting, because if you look at 250 00:14:49,520 --> 00:14:51,880 Speaker 1: some corners of social media, you'll see a lot of 251 00:14:51,880 --> 00:14:55,120 Speaker 1: people get mad about AI features being added to products 252 00:14:55,120 --> 00:14:58,720 Speaker 1: that really would be fine without it. But tech companies 253 00:14:58,760 --> 00:15:02,840 Speaker 1: are still obviously trying to put AI into everything anyway, 254 00:15:03,480 --> 00:15:06,520 Speaker 1: and in wearables, they seem to be betting on us 255 00:15:06,600 --> 00:15:07,240 Speaker 1: accepting it.
256 00:15:07,960 --> 00:15:11,720 Speaker 3: People have had ChatGPT, Claude, all of these AI 257 00:15:11,760 --> 00:15:14,320 Speaker 3: assistants on their computers for a while, and a lot 258 00:15:14,360 --> 00:15:16,600 Speaker 3: of us have come to the same conclusion: that AI 259 00:15:16,680 --> 00:15:20,240 Speaker 3: can be incredibly dumb. But if it's on you... and 260 00:15:20,280 --> 00:15:22,680 Speaker 3: from all the conversations that I've been having with AI 261 00:15:22,760 --> 00:15:27,240 Speaker 3: startups and with big tech investing in the future of AI, 262 00:15:27,840 --> 00:15:29,880 Speaker 3: if it's on you, that's a different proposition. 263 00:15:30,560 --> 00:15:33,560 Speaker 1: And with the Meta Ray-Bans specifically, the AI is 264 00:15:33,640 --> 00:15:36,040 Speaker 1: sort of a bonus feature. It's a way for Meta 265 00:15:36,120 --> 00:15:39,440 Speaker 1: to get people accustomed to AI without explicitly having to 266 00:15:39,480 --> 00:15:40,240 Speaker 1: force it on them. 267 00:15:40,800 --> 00:15:43,560 Speaker 3: These are functional devices where you don't have to use 268 00:15:43,600 --> 00:15:46,960 Speaker 3: the AI. You never have to say, Meta AI; you 269 00:15:47,000 --> 00:15:49,600 Speaker 3: don't ever have to use that wake word. But it's there. 270 00:15:49,920 --> 00:15:52,480 Speaker 3: If you're curious, you could use it. And because there 271 00:15:52,560 --> 00:15:56,000 Speaker 3: are other reasons why you would use these glasses besides AI, 272 00:15:56,760 --> 00:16:01,200 Speaker 3: it becomes a very easy entryway, a door for 273 00:16:01,480 --> 00:16:05,080 Speaker 3: Meta to open and say, hey, you're using it for 274 00:16:05,120 --> 00:16:08,200 Speaker 3: these purposes; wouldn't it be really great if you wanted 275 00:16:08,200 --> 00:16:10,760 Speaker 3: to use it sort of like how you use Siri, 276 00:16:10,960 --> 00:16:13,880 Speaker 3: but it actually can do more? And why don't you experiment 277 00:16:13,920 --> 00:16:17,840 Speaker 3: with these actually very useful use cases? And that's why 278 00:16:18,000 --> 00:16:22,120 Speaker 3: these particular glasses have actually caught on like wildfire in 279 00:16:22,160 --> 00:16:25,760 Speaker 3: the blind and low vision community, because the AI features 280 00:16:26,440 --> 00:16:31,320 Speaker 3: genuinely have created a game changing, life changing technology for them. 281 00:16:31,360 --> 00:16:35,040 Speaker 3: Like, I've spoken with a bunch of blind and low 282 00:16:35,120 --> 00:16:39,600 Speaker 3: vision users who tell me that the Meta glasses enable 283 00:16:39,680 --> 00:16:42,320 Speaker 3: them to live more independent lives. They have this live 284 00:16:42,400 --> 00:16:46,440 Speaker 3: AI mode that allows you to just ask the AI, what's 285 00:16:46,200 --> 00:16:48,240 Speaker 4: going on in my surroundings? It can read a 286 00:16:48,200 --> 00:16:51,720 Speaker 3: menu for you. And for a sighted person, that 287 00:16:51,720 --> 00:16:53,640 Speaker 3: might be like, oh my god, why 288 00:16:53,640 --> 00:16:55,360 Speaker 3: do I need an AI to read a menu for 289 00:16:55,480 --> 00:16:57,920 Speaker 3: me, unless it's in a different language and translating it? 290 00:16:58,240 --> 00:17:01,960 Speaker 3: But for a person who is low vision or blind, just 291 00:17:02,480 --> 00:17:06,760 Speaker 3: asking an AI, hey, you know, my kitchen is really messy, 292 00:17:07,000 --> 00:17:11,320 Speaker 3: where is this particular appliance?
And having the AI be 293 00:17:11,440 --> 00:17:14,680 Speaker 3: able to tell you, to save you some emotional labor, 294 00:17:14,680 --> 00:17:17,359 Speaker 3: to save you some of the toll of having to 295 00:17:17,400 --> 00:17:19,400 Speaker 3: ask a living person to see your 296 00:17:19,440 --> 00:17:20,320 Speaker 4: very messy kitchen. 297 00:17:20,880 --> 00:17:25,600 Speaker 3: It provides an actual life changing service for that particular community. 298 00:17:26,760 --> 00:17:29,720 Speaker 1: Victoria is not kidding. I know someone who's blind, and 299 00:17:29,800 --> 00:17:32,320 Speaker 1: they let me use their smart glasses once, and it 300 00:17:32,440 --> 00:17:35,960 Speaker 1: was genuinely amazing. So we were walking around on this 301 00:17:36,040 --> 00:17:38,800 Speaker 1: college campus, and the glasses had these little speakers in 302 00:17:38,840 --> 00:17:41,800 Speaker 1: them, and it would read signs to me from across 303 00:17:41,840 --> 00:17:43,960 Speaker 1: the quad. It could tell me if there was a 304 00:17:44,040 --> 00:17:47,359 Speaker 1: lamp pole or a bicycle in my path. Basically just 305 00:17:47,440 --> 00:17:49,760 Speaker 1: a whole bunch of things that would help somebody to 306 00:17:49,880 --> 00:17:53,119 Speaker 1: lead an independent life. So if someone ever tells me 307 00:17:53,200 --> 00:17:55,960 Speaker 1: that AI is useless, I got to stop them right there, 308 00:17:56,000 --> 00:17:59,360 Speaker 1: because even just from my very limited peek into what 309 00:17:59,440 --> 00:18:02,159 Speaker 1: this can do for people, it can be life changing. 310 00:18:02,840 --> 00:18:06,640 Speaker 1: But I don't think that reason is why smart glasses 311 00:18:06,760 --> 00:18:10,160 Speaker 1: are taking off now in a way that Google Glass didn't. 312 00:18:10,240 --> 00:18:14,119 Speaker 1: It's not just the technology, it's the timing. We're in an 313 00:18:14,160 --> 00:18:17,560 Speaker 1: age now where people are filming each other all the time, 314 00:18:17,560 --> 00:18:20,479 Speaker 1: whether it's making TikToks on their phone or surveilling their 315 00:18:20,520 --> 00:18:24,160 Speaker 1: neighbors with the Ring camera. We're used to seeing cameras 316 00:18:24,280 --> 00:18:27,560 Speaker 1: outside in a way that we just weren't in twenty thirteen. 317 00:18:28,200 --> 00:18:30,680 Speaker 1: So it sort of makes sense that even though these 318 00:18:30,680 --> 00:18:34,400 Speaker 1: things are conceptually very similar to Google Glass, the Meta 319 00:18:34,480 --> 00:18:37,080 Speaker 1: Ray-Bans are being more widely adopted now. 320 00:18:38,520 --> 00:18:41,520 Speaker 3: I've seen them in the wild quite a bunch lately, 321 00:18:41,640 --> 00:18:44,600 Speaker 3: and it's from non-techies. Like, it makes sense to 322 00:18:44,640 --> 00:18:46,640 Speaker 3: me if I go to CES and there's a bunch 323 00:18:46,680 --> 00:18:48,879 Speaker 3: of people wearing them. Makes sense if I'm at a 324 00:18:48,880 --> 00:18:51,840 Speaker 3: press conference and there's a lot of influencers around. But 325 00:18:51,920 --> 00:18:53,680 Speaker 3: I was just out walking. I was seeing a friend 326 00:18:53,760 --> 00:18:55,520 Speaker 3: that I hadn't seen in ten years, and I looked 327 00:18:55,520 --> 00:18:57,200 Speaker 3: at them and I was like, are those Meta Ray-Ban 328 00:18:57,200 --> 00:18:57,960 Speaker 3: smart glasses? 329 00:18:57,960 --> 00:18:59,600 Speaker 4: And they're like, oh yeah, I love them.
330 00:18:59,640 --> 00:19:03,000 Speaker 3: I love to take concert footage with them. Yo, and 331 00:19:03,080 --> 00:19:05,639 Speaker 3: you know, I was at a concert in June. I 332 00:19:05,720 --> 00:19:08,560 Speaker 3: turned to my side, there's a girl, she has Meta 333 00:19:08,640 --> 00:19:11,600 Speaker 3: Ray-Bans, and I find her TikTok later that day 334 00:19:11,800 --> 00:19:14,480 Speaker 3: from that concert, with the footage taken from these glasses. 335 00:19:14,520 --> 00:19:18,520 Speaker 3: So to your point, that example has become real. It 336 00:19:18,640 --> 00:19:21,080 Speaker 3: is a thing. The technology has finally caught up, 337 00:19:21,080 --> 00:19:23,320 Speaker 3: and it's a discreet form factor, and it's so much 338 00:19:23,320 --> 00:19:26,560 Speaker 3: more affordable. It's around like two hundred to three hundred dollars. 339 00:19:27,040 --> 00:19:29,160 Speaker 3: And here's the other thing that I think is kind 340 00:19:29,160 --> 00:19:31,560 Speaker 3: of a factor in it: it lets you 341 00:19:31,680 --> 00:19:35,960 Speaker 3: use your phone less. And phones are almost twenty years 342 00:19:36,000 --> 00:19:41,399 Speaker 3: old at this point, and we're, culturally speaking, kind of fatigued. 343 00:19:41,920 --> 00:19:46,000 Speaker 3: There's so many products out there about helping you focus more, 344 00:19:46,080 --> 00:19:50,960 Speaker 3: putting down your phone, locking it, graying out the screen so 345 00:19:51,000 --> 00:19:54,400 Speaker 3: that you're not, like, glued to this device anymore. And 346 00:19:54,480 --> 00:19:58,040 Speaker 3: so the proposition now with wearables is that it enables 347 00:19:58,080 --> 00:20:00,040 Speaker 3: you to do that. You can still stay connected and 348 00:20:00,080 --> 00:20:03,679 Speaker 3: still get your notifications and be reachable, but you can 349 00:20:03,720 --> 00:20:05,159 Speaker 3: triage it, and you don't have to look at your 350 00:20:05,160 --> 00:20:08,000 Speaker 3: phone quite as much. The reason why I knew that 351 00:20:08,040 --> 00:20:13,040 Speaker 3: these would not just fall away and be completely Google 352 00:20:13,119 --> 00:20:16,320 Speaker 3: Glass two point zero was: my spouse has 353 00:20:16,400 --> 00:20:21,000 Speaker 3: become a Luddite. They absolutely abhor my job and the wearable 354 00:20:21,040 --> 00:20:24,440 Speaker 3: technology that I test, and they were like, I'm gonna cop 355 00:20:24,280 --> 00:20:25,080 Speaker 4: me a pair of those. 356 00:20:25,840 --> 00:20:27,200 Speaker 1: Really, the Ray-Bans? 357 00:20:26,960 --> 00:20:28,920 Speaker 3: They got their own pair. Yeah, they got their own 358 00:20:28,960 --> 00:20:31,480 Speaker 3: pair of Ray-Bans. That is their main pair of 359 00:20:31,480 --> 00:20:35,200 Speaker 3: glasses that they wear every day. They love cars, so 360 00:20:35,240 --> 00:20:38,440 Speaker 3: they go walk on the street and they can go, oh, hey Meta, 361 00:20:38,480 --> 00:20:41,360 Speaker 3: what model car is that? And, like, in that instant, 362 00:20:41,400 --> 00:20:43,240 Speaker 3: not have to pull out their phone and look all 363 00:20:43,240 --> 00:20:45,520 Speaker 3: that stuff up. That was compelling enough for them, and 364 00:20:45,560 --> 00:20:47,800 Speaker 3: the price point was good enough, and the look of 365 00:20:47,840 --> 00:20:52,960 Speaker 3: the glasses was not dorky. It was just a complete, 366 00:20:53,280 --> 00:20:54,879 Speaker 3: like, whoa moment for me.
367 00:20:55,320 --> 00:20:58,240 Speaker 1: On top of the timing and the price, the look 368 00:20:58,359 --> 00:21:01,320 Speaker 1: of the Meta Ray-Bans is probably another really big 369 00:21:01,359 --> 00:21:03,560 Speaker 1: factor in why these are working in a way that 370 00:21:03,680 --> 00:21:07,159 Speaker 1: Google Glass never did, because you don't have to consciously 371 00:21:07,200 --> 00:21:10,760 Speaker 1: sign up to be, as Victoria called it, an ambassador 372 00:21:10,760 --> 00:21:14,199 Speaker 1: for the future when you wear these things. And in retrospect, 373 00:21:14,240 --> 00:21:17,560 Speaker 1: that's probably another place where Google Glass kind of messed up. 374 00:21:17,800 --> 00:21:20,720 Speaker 1: They were relying on early adopters to be ambassadors, and 375 00:21:20,840 --> 00:21:23,520 Speaker 1: hoping that they would get evangelists that would make everyone 376 00:21:23,600 --> 00:21:27,560 Speaker 1: else want to jump on board. Instead, they got Sarah Slocum, 377 00:21:27,720 --> 00:21:30,080 Speaker 1: the young lady who got her glasses ripped off of 378 00:21:30,080 --> 00:21:34,080 Speaker 1: her face. But you don't need a single ambassador. If 379 00:21:34,119 --> 00:21:37,320 Speaker 1: the product just looks like something that's already out there, 380 00:21:37,840 --> 00:21:41,359 Speaker 1: everyone is already an ambassador. You're just joining the club. 381 00:21:42,119 --> 00:21:45,480 Speaker 3: That's one thing that Meta hit right on the nose. 382 00:21:45,600 --> 00:21:48,680 Speaker 3: They partnered with EssilorLuxottica, which is the biggest 383 00:21:48,680 --> 00:21:51,280 Speaker 3: eyewear brand in the world, has a bunch of different 384 00:21:51,280 --> 00:21:54,639 Speaker 3: brands under an umbrella, including Oakley, which they just released 385 00:21:54,680 --> 00:21:57,160 Speaker 3: a pair of smart glasses with under the Oakley branding, 386 00:21:57,760 --> 00:22:02,199 Speaker 3: and Ray-Ban, which is worldwide. So to be able to 387 00:22:02,240 --> 00:22:03,879 Speaker 3: have those fashionable 388 00:22:03,359 --> 00:22:05,080 Speaker 4: brands, and to say, here you 389 00:22:05,040 --> 00:22:09,680 Speaker 3: go, you're gonna look, if not stylish, normal in them, 390 00:22:10,200 --> 00:22:13,080 Speaker 3: it was a huge thing for smart glasses to go 391 00:22:13,200 --> 00:22:17,400 Speaker 3: truly mainstream. They have to be good looking, because humans 392 00:22:17,400 --> 00:22:21,600 Speaker 3: are vain creatures. We are constantly obsessed with what's gonna 393 00:22:21,600 --> 00:22:23,760 Speaker 3: make us look good, and I don't care if you 394 00:22:23,760 --> 00:22:25,960 Speaker 3: have the coolest piece of tech on the planet that's 395 00:22:25,960 --> 00:22:28,399 Speaker 3: the most convenient; you're putting it on your face, on 396 00:22:28,440 --> 00:22:29,960 Speaker 3: your eyes, all right? 397 00:22:30,000 --> 00:22:32,600 Speaker 1: So, some of y'all have listened this far and thought, yo, 398 00:22:32,840 --> 00:22:36,320 Speaker 1: this sounds terrible. Anybody who wears these things is a 399 00:22:36,359 --> 00:22:40,320 Speaker 1: complete weirdo. But some of you are listening and thinking, 400 00:22:41,119 --> 00:22:42,920 Speaker 1: these actually sound kind of cool. You might be thinking 401 00:22:42,920 --> 00:22:45,240 Speaker 1: about getting a pair, or maybe you already have a pair. 402 00:22:46,400 --> 00:22:49,159 Speaker 1: So here we are again.
We got to realize that 403 00:22:49,320 --> 00:22:52,560 Speaker 1: even though it felt like we societally rejected Google Glass 404 00:22:52,600 --> 00:22:55,600 Speaker 1: ten years ago, maybe all we did was just kick 405 00:22:55,640 --> 00:22:57,600 Speaker 1: the can down the road. We just put off making 406 00:22:57,640 --> 00:23:01,320 Speaker 1: a decision. Maybe when Sarah Slocum said that we'd all 407 00:23:01,359 --> 00:23:05,159 Speaker 1: be wearing these, she wasn't wrong. She was just a 408 00:23:05,160 --> 00:23:08,840 Speaker 1: little too early. Which brings us back to the TikToker 409 00:23:09,080 --> 00:23:13,080 Speaker 1: getting a Brazilian wax, and the question of how much privacy 410 00:23:13,200 --> 00:23:13,960 Speaker 1: do we really want. 411 00:23:15,920 --> 00:23:20,160 Speaker 3: We got a listener question. It was a spicy question 412 00:23:20,440 --> 00:23:24,080 Speaker 3: from a Vergecast listener, where they were like, is 413 00:23:24,080 --> 00:23:26,119 Speaker 3: it okay for me to wear it while I have 414 00:23:26,200 --> 00:23:29,520 Speaker 3: intimate relations with my wife? And I was like, oh, 415 00:23:30,119 --> 00:23:33,080 Speaker 3: that... that is genuinely a question that we have to 416 00:23:33,080 --> 00:23:35,840 Speaker 3: grapple with if these are to become a mainstream piece 417 00:23:35,880 --> 00:23:39,440 Speaker 3: of technology. When is it okay to wear these devices? 418 00:23:39,440 --> 00:23:42,160 Speaker 3: What conversations are you supposed to have when you're wearing 419 00:23:42,200 --> 00:23:44,879 Speaker 3: these devices? Is it just on vibes? Are we going... 420 00:23:45,080 --> 00:23:47,680 Speaker 3: are we treating them like smartphone cameras? Right? 421 00:23:48,000 --> 00:23:48,280 Speaker 5: Right. 422 00:23:48,840 --> 00:23:51,720 Speaker 3: My contention is that with a smartphone, you know when someone's 423 00:23:51,760 --> 00:23:54,879 Speaker 3: recording, because they hold it a specific way. There's just 424 00:23:54,960 --> 00:23:58,600 Speaker 3: kind of like a body language that goes with holding 425 00:23:58,640 --> 00:24:02,040 Speaker 3: a camera and recording. But because of the form factor 426 00:24:02,160 --> 00:24:05,879 Speaker 3: being so discreet, which is a benefit in many ways, 427 00:24:05,920 --> 00:24:08,440 Speaker 3: like for content creators who are trying to do first 428 00:24:08,520 --> 00:24:12,320 Speaker 3: person point of view... but at the same time, is 429 00:24:12,359 --> 00:24:15,679 Speaker 3: someone just adjusting their glasses in a specific way going 430 00:24:15,760 --> 00:24:18,960 Speaker 3: to start looking like you're recording something, even if you 431 00:24:19,040 --> 00:24:22,600 Speaker 3: don't have smart glasses? You know, the Meta Ray-Bans 432 00:24:22,600 --> 00:24:25,440 Speaker 3: do have an LED indicator light, which tells you 433 00:24:25,480 --> 00:24:28,840 Speaker 3: when someone is recording. In very bright outdoor lighting, I 434 00:24:28,880 --> 00:24:31,159 Speaker 3: would say most people would not notice it, and that 435 00:24:31,280 --> 00:24:33,760 Speaker 3: is a thing that I test for every single iteration 436 00:24:33,840 --> 00:24:34,840 Speaker 3: that comes out. 437 00:24:36,320 --> 00:24:38,520 Speaker 1: Living in twenty twenty five means you know that you 438 00:24:38,680 --> 00:24:42,120 Speaker 1: could be recorded at any time. But these glasses add 439 00:24:42,160 --> 00:24:45,040 Speaker 1: another layer.
When someone pulls out a phone and it starts recording, 440 00:24:45,080 --> 00:24:47,400 Speaker 1: at least you have a visual indication that it's happening. 441 00:24:48,000 --> 00:24:50,560 Speaker 1: But what happens when you don't know you're being recorded? 442 00:24:51,200 --> 00:25:01,800 Speaker 1: That's after the break. So, if you're outside while you're 443 00:25:01,840 --> 00:25:04,200 Speaker 1: listening to this, maybe you're going shopping, maybe you're 444 00:25:04,200 --> 00:25:06,159 Speaker 1: going for a run, taking a walk in the park, 445 00:25:06,800 --> 00:25:10,199 Speaker 1: I want you to enjoy this moment. Just take it 446 00:25:10,240 --> 00:25:14,280 Speaker 1: in. Seriously, because it might not be that much longer that 447 00:25:14,440 --> 00:25:17,800 Speaker 1: you can do this privately, without having another person looking 448 00:25:17,840 --> 00:25:20,399 Speaker 1: you up. What I mean by that is that the 449 00:25:20,440 --> 00:25:23,359 Speaker 1: privacy issue of the Meta Ray-Bans goes beyond just 450 00:25:23,400 --> 00:25:26,600 Speaker 1: being recorded without your consent. Last year, a couple of 451 00:25:26,600 --> 00:25:29,159 Speaker 1: Harvard students were able to combine the recording and the 452 00:25:29,240 --> 00:25:32,920 Speaker 1: AI functionality of the Meta glasses to dox people in real 453 00:25:33,040 --> 00:25:36,320 Speaker 1: time as they walked by. We built glasses that let 454 00:25:36,359 --> 00:25:37,919 Speaker 1: you identify anybody on the street. 455 00:25:38,840 --> 00:25:40,200 Speaker 5: Cambridge Community Foundation. 456 00:25:42,200 --> 00:25:46,080 Speaker 1: Oh, hi, ma'am, are you... that's Sam? Yes. 457 00:25:46,359 --> 00:25:48,560 Speaker 1: Oh okay, I think I met you through, like, the 458 00:25:48,600 --> 00:25:52,080 Speaker 1: Cambridge Community Foundation, right? Yeah, yeah, it's great to meet you. 459 00:25:52,200 --> 00:25:52,679 Speaker 5: Yeah, I'm Ken. 460 00:25:53,440 --> 00:25:57,160 Speaker 3: Terrifying, just absolutely terrifying. And they are college students who 461 00:25:57,160 --> 00:25:59,320 Speaker 3: were able to put that together. They just were able 462 00:25:59,320 --> 00:26:02,840 Speaker 3: to jerry-rig it. And ironically, several months later, they are 463 00:26:02,880 --> 00:26:05,400 Speaker 3: now coming out with their own smart glasses product. So 464 00:26:05,680 --> 00:26:11,080 Speaker 3: it's just kind of... it's a whole reversal of 465 00:26:11,240 --> 00:26:13,040 Speaker 3: just, you know, on the one hand, they were raising 466 00:26:13,040 --> 00:26:15,119 Speaker 3: awareness, like, oh my god, this could be abused, but 467 00:26:15,160 --> 00:26:16,720 Speaker 3: also, we have a product now. Cool. 468 00:26:17,320 --> 00:26:19,720 Speaker 1: Now, to be fair, this new smart glasses product that 469 00:26:19,720 --> 00:26:22,399 Speaker 1: they're selling is not the facial recognition lookup that 470 00:26:22,440 --> 00:26:25,520 Speaker 1: they demoed on campus. Their pitch is that their new 471 00:26:25,560 --> 00:26:29,560 Speaker 1: glasses go further than Meta's glasses. Meta's glasses just turn 472 00:26:29,640 --> 00:26:32,560 Speaker 1: on and record when you tell them to. Their product 473 00:26:32,680 --> 00:26:35,399 Speaker 1: will always be on.
One of the founders told the 474 00:26:35,440 --> 00:26:39,640 Speaker 1: magazine Futurism that, quote, we aim to literally record everything 475 00:26:39,720 --> 00:26:42,600 Speaker 1: in your life, and we think that will unlock just 476 00:26:42,760 --> 00:26:45,840 Speaker 1: way more power to the AI to help you on 477 00:26:45,880 --> 00:26:50,600 Speaker 1: a hyper personal level, end quote. And their glasses won't 478 00:26:50,600 --> 00:26:53,720 Speaker 1: have an indicator light that tells you it's recording, because, again, 479 00:26:54,200 --> 00:26:58,680 Speaker 1: it's always recording. This could bring up some legal issues, which, 480 00:26:58,840 --> 00:27:01,560 Speaker 1: by the way, are issues that other recording wearables are 481 00:27:01,640 --> 00:27:04,160 Speaker 1: probably also going to run into at some point. 482 00:27:04,359 --> 00:27:07,680 Speaker 3: Some people live in two party consent recording states, so 483 00:27:08,119 --> 00:27:11,439 Speaker 3: you have companies making tech that could be illegal in 484 00:27:11,480 --> 00:27:12,800 Speaker 3: some respects. 485 00:27:13,240 --> 00:27:17,840 Speaker 1: Yeah, in California specifically, both parties have to consent to 486 00:27:17,880 --> 00:27:19,040 Speaker 1: being recorded. 487 00:27:19,359 --> 00:27:22,160 Speaker 3: So I live in New Jersey and work in New York, 488 00:27:22,240 --> 00:27:24,879 Speaker 3: which are both one party consent states. So technically, I 489 00:27:24,880 --> 00:27:27,800 Speaker 3: can walk out into public spaces with the thing on 490 00:27:27,960 --> 00:27:30,760 Speaker 3: and it's recording, and I don't require anybody else's consent. 491 00:27:30,800 --> 00:27:33,639 Speaker 3: But if I'm going to California, is it okay for 492 00:27:33,720 --> 00:27:37,080 Speaker 3: me to wear an always-on recording device while I'm 493 00:27:37,080 --> 00:27:42,560 Speaker 3: on public transit? It suddenly becomes a very strange, murky 494 00:27:42,720 --> 00:27:46,720 Speaker 3: gray area. If you look at Meta's privacy policy for 495 00:27:46,840 --> 00:27:50,359 Speaker 3: the smart glasses, what they say are the best practices 496 00:27:50,440 --> 00:27:52,840 Speaker 3: for using these devices out in the world, and they 497 00:27:52,840 --> 00:27:56,040 Speaker 3: can basically say, hey, we published this, we've told people, 498 00:27:56,359 --> 00:28:01,520 Speaker 3: don't be a jerk, we're good, right? When people inevitably 499 00:28:01,600 --> 00:28:04,800 Speaker 3: are jerks using their technology, their defense is that, well, 500 00:28:04,800 --> 00:28:07,840 Speaker 3: we never intended them to use it that way. But 501 00:28:07,920 --> 00:28:10,639 Speaker 3: you can think about AirTags in that respect. Apple 502 00:28:10,680 --> 00:28:13,080 Speaker 3: came out and were like, we made this incredibly 503 00:28:13,119 --> 00:28:17,520 Speaker 3: convenient device for you, and a small bunch of bad 504 00:28:17,520 --> 00:28:21,040 Speaker 3: apples are going to use it to stalk people and 505 00:28:21,480 --> 00:28:24,119 Speaker 3: use them in ways that we absolutely didn't intend. But 506 00:28:24,320 --> 00:28:26,600 Speaker 3: in the fine print, we're going to say that that's illegal, 507 00:28:26,880 --> 00:28:30,360 Speaker 3: we don't condone that; legally, we're scot-free.
508 00:28:30,800 --> 00:28:33,480 Speaker 1: So, if you're not familiar with these, Apple's AirTags are 509 00:28:33,560 --> 00:28:36,080 Speaker 1: these small tracking devices that you can attach to your 510 00:28:36,119 --> 00:28:38,680 Speaker 1: keys or your wallet to keep track of them. Products 511 00:28:38,720 --> 00:28:41,760 Speaker 1: like this existed before Apple's version, but Apple just made 512 00:28:41,800 --> 00:28:45,360 Speaker 1: them more convenient, which made them more mainstream. And after 513 00:28:45,360 --> 00:28:48,840 Speaker 1: the product became more mainstream, the obvious bad things you 514 00:28:48,880 --> 00:28:52,280 Speaker 1: can do with this technology also became more mainstream. People 515 00:28:52,360 --> 00:28:55,640 Speaker 1: started sneaking AirTags into people's purses or attaching them 516 00:28:55,640 --> 00:28:58,600 Speaker 1: to their cars so they could track them and stalk them. 517 00:28:59,320 --> 00:29:02,240 Speaker 1: So Apple did make a notification system that would alert 518 00:29:02,280 --> 00:29:05,360 Speaker 1: you if an unknown tracking device is following you, but 519 00:29:05,520 --> 00:29:08,600 Speaker 1: you would only get that notification if you also had 520 00:29:08,680 --> 00:29:12,160 Speaker 1: an Apple iPhone. Months later, they did put something out 521 00:29:12,200 --> 00:29:14,120 Speaker 1: for Android, but even then, you had to know how 522 00:29:14,120 --> 00:29:17,040 Speaker 1: to use it. And as cases of people being stalked 523 00:29:17,120 --> 00:29:20,280 Speaker 1: kept hitting the news, they'd keep making modifications, like making 524 00:29:20,320 --> 00:29:22,400 Speaker 1: the AirTag beep more if it's away from its 525 00:29:22,440 --> 00:29:25,640 Speaker 1: owner for long enough. But of course, people started working 526 00:29:25,680 --> 00:29:29,120 Speaker 1: on how to disable that. There was another solution for this: 527 00:29:29,640 --> 00:29:32,040 Speaker 1: stop with the software updates and just cancel the product, 528 00:29:32,080 --> 00:29:35,200 Speaker 1: pull it off the shelves. But Apple didn't do that, 529 00:29:35,320 --> 00:29:38,959 Speaker 1: because AirTags are really popular. People really like being 530 00:29:38,960 --> 00:29:41,920 Speaker 1: able to find their lost stuff. In the case of 531 00:29:41,960 --> 00:29:44,440 Speaker 1: the Apple AirTags, and now for the Meta Ray-Bans, 532 00:29:44,880 --> 00:29:47,760 Speaker 1: the trick seems to be to find enough consumers for 533 00:29:47,800 --> 00:29:51,440 Speaker 1: whom the product is indispensable, people who think that the 534 00:29:51,480 --> 00:29:53,719 Speaker 1: benefits outweigh the risks. 535 00:29:54,760 --> 00:29:57,560 Speaker 3: I think this is where the crux of 536 00:29:57,600 --> 00:29:59,280 Speaker 3: all of this really lies: AirTags are 537 00:29:59,360 --> 00:30:03,400 Speaker 3: so convenient that most people, the vast majority of people, 538 00:30:03,440 --> 00:30:05,600 Speaker 3: will be like, yeah, I'm good with that, because Air 539 00:30:05,640 --> 00:30:08,960 Speaker 3: Tags are so convenient and I'm not the bad apple 540 00:30:09,040 --> 00:30:11,960 Speaker 3: using it in that way. It's like, what app, 541 00:30:12,160 --> 00:30:14,920 Speaker 3: what use case, will be so convenient that you 542 00:30:14,960 --> 00:30:18,960 Speaker 3: are willing to overlook the dystopian nightmare that comes along 543 00:30:18,960 --> 00:30:19,280 Speaker 3: with it?
544 00:30:19,680 --> 00:30:23,000 Speaker 1: I mean, I'm feeling like, even if the vast majority 545 00:30:23,000 --> 00:30:25,680 Speaker 1: of people with the Meta Ray-Bans or any of 546 00:30:25,680 --> 00:30:29,040 Speaker 1: the smart glasses use them in very responsible ways, just 547 00:30:29,080 --> 00:30:31,720 Speaker 1: the fact that it's out there is going to alter 548 00:30:32,920 --> 00:30:35,880 Speaker 1: how we can walk around in society, period. 549 00:30:36,120 --> 00:30:39,560 Speaker 3: No, you're absolutely right. Like, in testing these devices, I 550 00:30:39,560 --> 00:30:41,200 Speaker 3: don't speak to myself as much as I used to, 551 00:30:41,400 --> 00:30:44,040 Speaker 3: because I wore one of these devices into a bathroom. 552 00:30:44,400 --> 00:30:47,120 Speaker 3: I commented on my bowel movement, and it recorded it, 553 00:30:47,400 --> 00:30:50,280 Speaker 3: and then it generated a to-do for me to 554 00:30:50,320 --> 00:30:52,840 Speaker 3: have Lactaid again. And I was like, this is the 555 00:30:52,960 --> 00:30:57,560 Speaker 3: rudest thing that's ever happened to me. But also, holy crap, 556 00:30:57,640 --> 00:31:00,720 Speaker 3: this is our dystopian future. Because when the AI is 557 00:31:00,840 --> 00:31:03,760 Speaker 3: in your glasses, when the AI is in a pendant that sits 558 00:31:03,760 --> 00:31:06,320 Speaker 3: around your neck, when it's in your smartwatch, when 559 00:31:06,360 --> 00:31:09,200 Speaker 3: it's on you twenty-four-seven, and you just 560 00:31:09,240 --> 00:31:13,200 Speaker 3: have an unfiltered thought that you speak aloud to yourself, well, 561 00:31:13,480 --> 00:31:15,960 Speaker 3: suddenly you're not the only one listening to your own thoughts. 562 00:31:16,000 --> 00:31:18,280 Speaker 3: It's an AI that's listening to your thoughts, and it's 563 00:31:18,280 --> 00:31:20,800 Speaker 3: going, ooh, they mentioned that; maybe this is a thing 564 00:31:20,880 --> 00:31:22,120 Speaker 3: that I will generate a to- 565 00:31:22,080 --> 00:31:26,080 Speaker 1: do list for. And that sounds like a convenient application. 566 00:31:26,280 --> 00:31:29,200 Speaker 1: And there's that word again, right? Convenient. You say something 567 00:31:29,200 --> 00:31:31,680 Speaker 1: out loud and your smart glasses remember it for you. 568 00:31:32,280 --> 00:31:35,760 Speaker 1: But it's not just remembering it for you; it's also remembering 569 00:31:35,800 --> 00:31:39,160 Speaker 1: it for the company and for the advertisers. You blurt 570 00:31:39,200 --> 00:31:41,959 Speaker 1: out something unconsciously about your head hurting, or you're around 571 00:31:41,960 --> 00:31:44,320 Speaker 1: somebody else as they're talking about having a headache, and 572 00:31:44,400 --> 00:31:46,240 Speaker 1: next thing you know, you're getting a bunch of targeted 573 00:31:46,240 --> 00:31:49,280 Speaker 1: ads for a very specific brand of headache medicine that 574 00:31:49,440 --> 00:31:50,960 Speaker 1: has paid to be at the top of the list 575 00:31:51,160 --> 00:31:53,520 Speaker 1: for the demographic of twenty five to thirty five year 576 00:31:53,520 --> 00:31:56,520 Speaker 1: old women who like cold brew coffee, live in urban environments, 577 00:31:56,720 --> 00:31:57,760 Speaker 1: and like techno music. 578 00:31:58,680 --> 00:32:00,880 Speaker 3: If we want to get super dystopian about it: 579 00:32:01,760 --> 00:32:06,600 Speaker 3: we live in the engagement economy, right?
The engagement economy requires 580 00:32:06,640 --> 00:32:10,760 Speaker 3: the constant feed of data and personalization and all that 581 00:32:10,800 --> 00:32:13,840 Speaker 3: sort of stuff. What better AI training tool do you 582 00:32:13,880 --> 00:32:16,280 Speaker 3: have than a wearable? I sit up at night and 583 00:32:16,320 --> 00:32:18,680 Speaker 3: I think about it, and I was like, legitimately, we 584 00:32:18,720 --> 00:32:21,800 Speaker 3: started out just tracking our steps, and now it's your 585 00:32:21,840 --> 00:32:24,479 Speaker 3: heart rate. They're working on finding ways to tell you 586 00:32:24,520 --> 00:32:26,680 Speaker 3: if your blood pressure is high or low, if your 587 00:32:26,680 --> 00:32:29,320 Speaker 3: blood glucose is high or low. So they're looking at 588 00:32:29,600 --> 00:32:32,520 Speaker 3: how they can collect more data so that 589 00:32:32,560 --> 00:32:33,760 Speaker 3: we can 590 00:32:33,840 --> 00:32:36,480 Speaker 4: know more about you, give you more ads, give you 591 00:32:36,440 --> 00:32:40,320 Speaker 3: more ads, personalized experiences. Like, this is just my... I 592 00:32:40,360 --> 00:32:42,440 Speaker 3: don't know if you've seen the meme of Charlie from 593 00:32:42,800 --> 00:32:44,880 Speaker 3: It's Always Sunny with the red string on the board. 594 00:32:44,920 --> 00:32:48,160 Speaker 4: This is my conspiracy theory, yes, of where wearables are going. 595 00:32:48,800 --> 00:32:51,200 Speaker 1: I don't think that that's a conspiracy theory. I think 596 00:32:51,200 --> 00:32:53,840 Speaker 1: it's a very reasonable thing to assume that the 597 00:32:53,840 --> 00:32:56,480 Speaker 1: more data is collected about you, the more it can be used 598 00:32:56,520 --> 00:33:02,720 Speaker 1: to show you ads that you will not scroll past. Yeah, 599 00:33:03,200 --> 00:33:06,880 Speaker 1: you've gotten kind of a preview of this, of what 600 00:33:06,960 --> 00:33:10,120 Speaker 1: society looks like, because you're around the tech-type people 601 00:33:10,160 --> 00:33:14,040 Speaker 1: all the time. What does a society look like when 602 00:33:14,120 --> 00:33:18,200 Speaker 1: we know that there's a good chance that somebody around 603 00:33:18,240 --> 00:33:21,840 Speaker 1: you is recording everything they're seeing and hearing? 604 00:33:22,840 --> 00:33:25,600 Speaker 4: I think it's a much more self-conscious society. 605 00:33:25,800 --> 00:33:28,800 Speaker 3: I have become someone who, when I'm out and about, 606 00:33:28,920 --> 00:33:32,160 Speaker 3: I am scanning the glasses that people wear to see 607 00:33:32,160 --> 00:33:32,640 Speaker 3: if there's 608 00:33:32,480 --> 00:33:33,280 Speaker 4: cameras on them. 609 00:33:34,040 --> 00:33:36,680 Speaker 3: I had kind of a unique upbringing. There's a 610 00:33:36,760 --> 00:33:38,600 Speaker 3: question of whether my dad was a North Korean spy 611 00:33:38,680 --> 00:33:41,720 Speaker 3: or not, and whether we were under surveillance at any 612 00:33:41,840 --> 00:33:44,080 Speaker 3: given point in time, and so I grew up always 613 00:33:44,120 --> 00:33:46,800 Speaker 3: thinking, my life is kind of public. I have to 614 00:33:47,040 --> 00:33:49,960 Speaker 3: perform as if I'm always being watched. So I kind 615 00:33:50,000 --> 00:33:54,440 Speaker 3: of grew up with that my whole life. But it's 616 00:33:54,480 --> 00:33:57,400 Speaker 3: a heavy thing to grow up with, and I think, 617 00:33:58,960 --> 00:34:00,360 Speaker 3: you know, a lot of
618 00:34:00,320 --> 00:34:02,280 Speaker 4: people are privileged and blessed to 619 00:34:02,320 --> 00:34:05,800 Speaker 3: not grow up performing in that way, as if there's 620 00:34:05,840 --> 00:34:08,840 Speaker 3: an invisible movie-set camera on you at all times. 621 00:34:09,440 --> 00:34:10,680 Speaker 3: But I think that is just going to be a 622 00:34:10,680 --> 00:34:13,959 Speaker 3: reality that everyone starts to live with. You start to become 623 00:34:13,960 --> 00:34:17,319 Speaker 3: a lot more conscious of your actions in public. You 624 00:34:17,360 --> 00:34:20,319 Speaker 3: start to become conscious of the spaces in your house 625 00:34:20,360 --> 00:34:21,360 Speaker 3: that are truly private. 626 00:34:22,160 --> 00:34:22,400 Speaker 4: You know. 627 00:34:22,719 --> 00:34:24,560 Speaker 3: I say to people all the time that the only 628 00:34:24,560 --> 00:34:27,239 Speaker 3: truly private place you have in this world is the 629 00:34:27,280 --> 00:34:30,480 Speaker 3: inside of your head. Like, that's kind of dystopian when 630 00:34:30,480 --> 00:34:33,319 Speaker 3: you say that. But living the life that I do, 631 00:34:33,520 --> 00:34:36,920 Speaker 3: testing the products that I do, having the upbringing that 632 00:34:37,000 --> 00:34:41,160 Speaker 3: I had, I unfortunately think I am well equipped to 633 00:34:41,239 --> 00:34:45,000 Speaker 3: tell people that this is what's coming. I think we're 634 00:34:45,040 --> 00:34:46,759 Speaker 3: all going to have to live our lives as if 635 00:34:46,800 --> 00:34:49,600 Speaker 3: we're mini celebrities out in public at any given point 636 00:34:49,640 --> 00:34:51,799 Speaker 3: in time, and that the paparazzi could come for you. 637 00:34:52,080 --> 00:34:54,000 Speaker 3: And there's degrees of that. Not all of us are 638 00:34:54,000 --> 00:34:58,520 Speaker 3: Timothée Chalamet living out here having to wear caps and disguises. 639 00:34:59,280 --> 00:35:02,480 Speaker 3: But to a degree, I do think we're all going 640 00:35:02,520 --> 00:35:03,480 Speaker 3: to be living very 641 00:35:03,320 --> 00:35:05,640 Speaker 1: public lives, whether or not we want to. 642 00:35:06,520 --> 00:35:07,160 Speaker 4: Whether or not we 643 00:35:07,120 --> 00:35:09,279 Speaker 3: want to, I think we are all in some way 644 00:35:09,400 --> 00:35:12,520 Speaker 3: going to be living as public figures. So I think, 645 00:35:12,640 --> 00:35:14,680 Speaker 3: as a society going forward, we really have to think 646 00:35:14,719 --> 00:35:18,600 Speaker 3: about what are truly private spaces and what truly private 647 00:35:18,640 --> 00:35:21,880 Speaker 3: spaces we want to protect, because it's a human need 648 00:35:22,520 --> 00:35:26,560 Speaker 3: to have that privacy, and I don't think that it 649 00:35:26,600 --> 00:35:31,480 Speaker 3: should be given up for whatever convenience. Like, however tempting 650 00:35:31,480 --> 00:35:35,040 Speaker 3: the convenience is, sometimes the inconvenience is necessary. 651 00:35:35,560 --> 00:35:37,880 Speaker 1: It seems like this is another one of those conversations 652 00:35:38,280 --> 00:35:43,360 Speaker 1: that we're having as a society where the outcome seems predetermined, 653 00:35:43,760 --> 00:35:46,759 Speaker 1: which is to say, these are eventually going to be 654 00:35:46,880 --> 00:35:51,520 Speaker 1: adopted by everyone. It's just a matter of time. This 655 00:35:51,640 --> 00:35:57,080 Speaker 1: isn't a decision that you, as an individual, get to make.
Look, 656 00:35:57,120 --> 00:35:59,680 Speaker 1: you don't have to like these glasses, but they're going 657 00:35:59,719 --> 00:36:02,799 Speaker 1: to hit mainstream adoption. Everybody's gonna be wearing them, and 658 00:36:02,840 --> 00:36:04,920 Speaker 1: so you can choose not to buy them if you 659 00:36:04,960 --> 00:36:07,680 Speaker 1: don't want to, but you cannot choose to live in 660 00:36:07,920 --> 00:36:10,600 Speaker 1: a world that doesn't have them. And it feels like 661 00:36:11,560 --> 00:36:13,920 Speaker 1: that is where we're at right now. Do you think 662 00:36:13,920 --> 00:36:14,319 Speaker 1: we're there? 663 00:36:14,560 --> 00:36:16,960 Speaker 4: I think you're spot on, because I think the sales prove it. 664 00:36:17,600 --> 00:36:21,879 Speaker 3: Google believes that the zeitgeist is strong enough for them 665 00:36:21,920 --> 00:36:25,640 Speaker 3: to be like, hey, guys, smart glasses. Let's get back 666 00:36:25,760 --> 00:36:28,880 Speaker 3: in on that and rebrand ourselves as the people with 667 00:36:28,960 --> 00:36:30,720 Speaker 3: the most experience in the space. 668 00:36:31,160 --> 00:36:33,480 Speaker 4: So at the end of last year, they said, 669 00:36:33,600 --> 00:36:34,560 Speaker 4: put me back in, coach. 670 00:36:34,760 --> 00:36:37,560 Speaker 3: Like, at the end of last year, they legitimately 671 00:36:37,600 --> 00:36:40,400 Speaker 3: launched Android XR, which is going to be their platform 672 00:36:40,719 --> 00:36:42,520 Speaker 3: for smart glasses and headsets. 673 00:36:42,880 --> 00:36:44,640 Speaker 1: Why would you say now is the right moment to 674 00:36:44,719 --> 00:36:48,200 Speaker 1: launch XR? I think now is the perfect time to 675 00:36:48,320 --> 00:36:51,160 Speaker 1: work on XR because you have a convergence of all 676 00:36:51,200 --> 00:36:54,320 Speaker 1: these technologies. We've been in this space since Google Glass 677 00:36:54,360 --> 00:36:55,520 Speaker 1: and we have not stopped. 678 00:36:56,360 --> 00:36:59,560 Speaker 6: We can take those experiences, which already work great, and 679 00:36:59,600 --> 00:37:01,759 Speaker 6: find new ways to be helpful for people. 680 00:37:02,480 --> 00:37:05,799 Speaker 3: Once Google is able to overcome their PTSD over 681 00:37:05,880 --> 00:37:08,480 Speaker 3: Google Glass, to be like, we want back in on 682 00:37:08,520 --> 00:37:10,960 Speaker 3: the thing that people made the most fun of us for. 683 00:37:11,640 --> 00:37:13,400 Speaker 4: I think it is inevitable. I think you're spot on 684 00:37:13,440 --> 00:37:13,759 Speaker 4: about that. 685 00:37:18,680 --> 00:37:21,000 Speaker 1: Thank you so much for listening to Kill Switch. I 686 00:37:21,000 --> 00:37:23,040 Speaker 1: hope you dug this one. If you want to connect 687 00:37:23,080 --> 00:37:26,520 Speaker 1: with us, we're on Instagram at kill switch pod, or 688 00:37:26,640 --> 00:37:30,880 Speaker 1: you can email us at kill switch at Kaleidoscope dot NYC. 689 00:37:31,480 --> 00:37:33,000 Speaker 1: And you know, while you've got your phone in your 690 00:37:33,000 --> 00:37:35,440 Speaker 1: hand, whatever, before you put it back in your pocket, 691 00:37:35,800 --> 00:37:38,720 Speaker 1: maybe leave us a review wherever you're listening to this podcast. 692 00:37:39,080 --> 00:37:41,799 Speaker 1: It helps other people find the show, which in turn 693 00:37:41,920 --> 00:37:44,760 Speaker 1: helps us keep doing our thing.
Kill Switch is hosted 694 00:37:44,800 --> 00:37:48,640 Speaker 1: by me, Dexter Thomas. It's produced by Shena Ozaki, Darluk 695 00:37:48,680 --> 00:37:51,600 Speaker 1: Potts and Kate Osborne. Our theme song is by me 696 00:37:51,840 --> 00:37:55,839 Speaker 1: and Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope, 697 00:37:55,920 --> 00:37:59,879 Speaker 1: our executive producers are Oz Woloshyn, Mangesh Hattikudur and Kate 698 00:38:00,040 --> 00:38:04,239 Speaker 1: Osborne. From iHeart, our executive producers are Katrina Norvell and 699 00:38:04,440 --> 00:38:05,120 Speaker 1: Nikki Ettore. 700 00:38:05,640 --> 00:38:05,839 Speaker 6: Oh. 701 00:38:06,080 --> 00:38:08,600 Speaker 1: One more thing. So there's that clip that we played 702 00:38:08,640 --> 00:38:11,400 Speaker 1: of the Dragon Ball scouter exploding, and maybe you were 703 00:38:11,400 --> 00:38:15,040 Speaker 1: wondering why it exploded, or if there's any scientific basis 704 00:38:15,080 --> 00:38:17,279 Speaker 1: for why a scouter would explode when the power levels 705 00:38:17,320 --> 00:38:20,440 Speaker 1: are too high. Okay, maybe you weren't wondering that, but 706 00:38:20,520 --> 00:38:23,280 Speaker 1: I was. And it turns out that the official Dragon 707 00:38:23,320 --> 00:38:26,160 Speaker 1: Ball site published an article that kind of explains it, 708 00:38:26,640 --> 00:38:29,840 Speaker 1: featuring an interview with a professor in the engineering department 709 00:38:29,840 --> 00:38:33,400 Speaker 1: at Miyazaki University who talks about using AI headsets to 710 00:38:33,480 --> 00:38:36,680 Speaker 1: measure the weight of pigs. I promise you I'm not 711 00:38:36,800 --> 00:38:39,240 Speaker 1: making this up. I'll leave that in the show notes. Anyway, 712 00:38:39,760 --> 00:38:40,839 Speaker 1: we'll catch you on the next one. 713 00:38:54,680 --> 00:38:55,160 Speaker 6: Would bind