1 00:00:12,119 --> 00:00:16,120 Speaker 1: From Kaleidoscope and iHeart podcasts. This is tech Stuff. I'm 2 00:00:16,120 --> 00:00:18,880 Speaker 1: mons Valashan, and I'm delighted that Cara Price is back 3 00:00:18,920 --> 00:00:22,040 Speaker 1: with us. Today we'll be bringing you the emotional highs 4 00:00:22,079 --> 00:00:25,680 Speaker 1: and lows of Silicon Valley, from a tech Titan's gilded 5 00:00:25,760 --> 00:00:30,360 Speaker 1: nuctuals to the soulless writing of checchipt. Then, on a 6 00:00:30,360 --> 00:00:32,600 Speaker 1: more serious note, we'll be looking at the conflict between 7 00:00:32,600 --> 00:00:34,839 Speaker 1: Iran and Israel in cyberspace. 8 00:00:35,240 --> 00:00:39,440 Speaker 2: Cyber attacks are often viewed as a much less costly 9 00:00:39,840 --> 00:00:42,960 Speaker 2: in terms of funding, but also in terms of lives, 10 00:00:43,440 --> 00:00:47,080 Speaker 2: way to critically wound an enemy. 11 00:00:48,120 --> 00:00:51,000 Speaker 1: All of that on the weekend tech It's Friday, June 12 00:00:51,000 --> 00:00:54,680 Speaker 1: twenty seventh. Welcome back, Cara. 13 00:00:54,840 --> 00:00:55,760 Speaker 3: Thank you as. 14 00:00:56,000 --> 00:00:58,640 Speaker 1: Back to tech Stuff, just in time to cover Jeff 15 00:00:58,680 --> 00:01:00,160 Speaker 1: and Lauren's wedding out. 16 00:01:00,160 --> 00:01:03,160 Speaker 3: A coincidence at all, I wouldn't have come back otherwise. 17 00:01:03,400 --> 00:01:05,920 Speaker 1: You do love a Selim spectacle, and this one's going 18 00:01:05,959 --> 00:01:11,440 Speaker 1: to be pretty spectacular, perhaps unsurprisingly given Jeff Bezos's fourtune 19 00:01:11,480 --> 00:01:13,880 Speaker 1: or two hundred and thirty billion dollars and seems to 20 00:01:13,920 --> 00:01:17,039 Speaker 1: be thirsty for a big hype bash this weekend. 21 00:01:17,280 --> 00:01:19,840 Speaker 3: And what do you do when you're the world's richest man, 22 00:01:20,480 --> 00:01:22,880 Speaker 3: you rent out Venice. Now, ven I think that's what 23 00:01:22,920 --> 00:01:25,920 Speaker 3: you do. We don't know much more about the ceremony 24 00:01:25,959 --> 00:01:29,560 Speaker 3: or the parties for that matter. I can guess that 25 00:01:29,640 --> 00:01:33,480 Speaker 3: some of the guests will be Kim Kardashian, who was 26 00:01:33,520 --> 00:01:37,479 Speaker 3: on Lauren Sanchez's bachelorette and then some others who were 27 00:01:37,520 --> 00:01:41,280 Speaker 3: on the Blue Origin spaceflight, which was not her bachelorette party. 28 00:01:41,280 --> 00:01:46,240 Speaker 1: By the way, so the Virgin spaceflight was Kim Kardashian. No, No, Katy. 29 00:01:45,959 --> 00:01:48,320 Speaker 3: Perry, Katy Perry, Gail King. 30 00:01:48,720 --> 00:01:51,360 Speaker 1: I remember when Katy Perry came down from space. There 31 00:01:51,560 --> 00:01:55,320 Speaker 1: was a article in any Mail being like, it's confirmed 32 00:01:55,400 --> 00:01:56,800 Speaker 1: Katy Perry's career is over. 33 00:01:57,680 --> 00:01:59,120 Speaker 3: Oh, I thought they were going to say it's confirmed 34 00:01:59,160 --> 00:02:00,520 Speaker 3: Katy Perry is an alien. 35 00:02:01,560 --> 00:02:04,720 Speaker 1: I wonder which of the if the tech Titans will 36 00:02:04,800 --> 00:02:08,359 Speaker 1: be there, zak Elon, Tim Cook from Apple or Tim 37 00:02:08,400 --> 00:02:12,480 Speaker 1: Apples he's otherwise known, Larian Serge from Google. I would 38 00:02:12,520 --> 00:02:14,560 Speaker 1: guess there'll be fewer at the wedding than they were 39 00:02:14,600 --> 00:02:15,960 Speaker 1: at Trump's inauguration. 40 00:02:17,440 --> 00:02:18,400 Speaker 3: Maybe maybe not. 41 00:02:18,639 --> 00:02:21,640 Speaker 1: The details are murky, but there is a big banner 42 00:02:21,720 --> 00:02:24,359 Speaker 1: flying in Venice at the moment saying if you can 43 00:02:24,360 --> 00:02:26,880 Speaker 1: rent Venice for your wedding, you can pay more tax. 44 00:02:27,320 --> 00:02:30,200 Speaker 3: You know, this wedding is so secret. I was making 45 00:02:30,240 --> 00:02:32,280 Speaker 3: a joke about renting out Venice, but it is hard 46 00:02:32,320 --> 00:02:36,040 Speaker 3: to tell. The party organizers of the wedding are denying it. 47 00:02:36,080 --> 00:02:38,760 Speaker 3: They actually released a statement saying it's not a city 48 00:02:38,800 --> 00:02:42,320 Speaker 3: takeover and the couple is committed to this is like, 49 00:02:42,360 --> 00:02:45,880 Speaker 3: this is insane. The couple is committed to minimizing disruption, 50 00:02:46,320 --> 00:02:50,280 Speaker 3: and Bezos donated three million dollars to various local organizations. 51 00:02:50,600 --> 00:02:54,520 Speaker 3: In the face of criticism, local Venice officials chimed in 52 00:02:54,560 --> 00:02:57,720 Speaker 3: and said the couple has only booked about thirty water 53 00:02:57,800 --> 00:03:00,400 Speaker 3: taxis for the event. I just live for this. 54 00:03:00,800 --> 00:03:02,600 Speaker 1: I do too. And you pointed me to a piece 55 00:03:02,600 --> 00:03:04,440 Speaker 1: in the Wall Street Journal that was the best one. 56 00:03:04,639 --> 00:03:07,400 Speaker 1: It had a headline promising to spill the secrets and 57 00:03:07,480 --> 00:03:09,680 Speaker 1: another the fact, when there's a big, big wedding, make 58 00:03:09,720 --> 00:03:12,040 Speaker 1: this even the Wall Street Journal will slip into the Daily. 59 00:03:12,080 --> 00:03:13,840 Speaker 1: Mom My God, I spill the secrets. 60 00:03:13,919 --> 00:03:15,520 Speaker 3: They were an excited to do so. 61 00:03:16,440 --> 00:03:18,959 Speaker 1: I was a bit skeptical about opening the show with 62 00:03:19,080 --> 00:03:23,040 Speaker 1: talking about the Sanchez Bezos wedding, perhaps because I'm a 63 00:03:23,040 --> 00:03:25,160 Speaker 1: brit and we've had quite a lot of royal wedding 64 00:03:25,160 --> 00:03:27,720 Speaker 1: action in the last decade. But you pointed me to 65 00:03:27,720 --> 00:03:30,000 Speaker 1: something in the Wall Street Journal piece that just felt 66 00:03:30,040 --> 00:03:32,320 Speaker 1: so tech stuff. I couldn't wait him more about it. 67 00:03:32,480 --> 00:03:34,720 Speaker 3: And this is so me. This is where the sort 68 00:03:34,760 --> 00:03:38,360 Speaker 3: of intersection between my nerdiness and my interest in celebrity 69 00:03:38,360 --> 00:03:40,640 Speaker 3: gossip intersects. I was reading the piece and I'm like, 70 00:03:40,640 --> 00:03:43,320 Speaker 3: this is a great piece, But there was a certain 71 00:03:44,480 --> 00:03:47,920 Speaker 3: thing that was mentioned. This luxury event designer said he 72 00:03:47,960 --> 00:03:53,280 Speaker 3: had recently commissioned hold your Hat, a five hundred thousand, 73 00:03:53,640 --> 00:03:58,160 Speaker 3: half a million dollar holograph of a bride's deceased grandfather 74 00:03:58,480 --> 00:04:01,640 Speaker 3: to share well wishes to the that's the cost of 75 00:04:01,920 --> 00:04:05,360 Speaker 3: like no percent of weddings is half a million dollars. 76 00:04:05,200 --> 00:04:06,560 Speaker 1: Or no percent of Besos' wedding. 77 00:04:06,640 --> 00:04:07,560 Speaker 3: That's right, that's right. 78 00:04:07,960 --> 00:04:11,280 Speaker 1: You know it's funny because obviously there's this trend where 79 00:04:11,440 --> 00:04:14,400 Speaker 1: rich people want to live forever. I guess the like 80 00:04:14,640 --> 00:04:18,320 Speaker 1: this follow up trend to that is reanimating your your 81 00:04:18,360 --> 00:04:20,719 Speaker 1: dead relatives for the for the cool sum of half 82 00:04:20,720 --> 00:04:23,520 Speaker 1: a million dollars to deliver a toast at the wedding. 83 00:04:23,760 --> 00:04:25,599 Speaker 1: It does remind me. You remember of that movie Mulan the 84 00:04:25,640 --> 00:04:29,640 Speaker 1: Disney Words Drive when the ancestors reanimated to wish her 85 00:04:29,640 --> 00:04:32,000 Speaker 1: well in her quest. Yes, this is like being if 86 00:04:32,000 --> 00:04:34,280 Speaker 1: you're a billionaire, you can you can your wedding can 87 00:04:34,360 --> 00:04:36,520 Speaker 1: feel like the opening sequence of Mulan. 88 00:04:36,640 --> 00:04:38,440 Speaker 3: When you're a billionaire, you can watch a movie like 89 00:04:38,480 --> 00:04:40,000 Speaker 3: Mulan and say I want that at my wedding. 90 00:04:40,080 --> 00:04:42,840 Speaker 1: Please bring out her to us all remember that? Yes, 91 00:04:42,960 --> 00:04:46,200 Speaker 1: of course, so you've now been subjected to my singing, 92 00:04:46,440 --> 00:04:50,440 Speaker 1: which is appropriate. Turn to our next story, which is 93 00:04:50,680 --> 00:04:54,719 Speaker 1: about the music industry and how the industry is building 94 00:04:54,760 --> 00:04:57,240 Speaker 1: tech to make AI music more traceable. 95 00:04:57,839 --> 00:05:00,200 Speaker 3: All right, so this is interesting because I do you 96 00:05:00,279 --> 00:05:04,200 Speaker 3: think music production is one of the most evolved uses 97 00:05:04,200 --> 00:05:08,080 Speaker 3: of AI. It duped me in twenty twenty three with 98 00:05:08,200 --> 00:05:09,040 Speaker 3: fake Drake. 99 00:05:09,160 --> 00:05:12,120 Speaker 1: Fake Drake. I was actually hoping to tell you about 100 00:05:12,160 --> 00:05:15,400 Speaker 1: fake Drake and play that classic twenty twenty three banger 101 00:05:15,440 --> 00:05:17,080 Speaker 1: and test you on whether or not you knew it 102 00:05:17,160 --> 00:05:21,120 Speaker 1: was a fake or not. But let's play it nonetheless. 103 00:05:22,520 --> 00:05:26,520 Speaker 3: Like it's fake question. 104 00:05:27,120 --> 00:05:30,360 Speaker 1: But that was good. I came when my eggs like 105 00:05:30,520 --> 00:05:31,160 Speaker 1: the lean. 106 00:05:31,240 --> 00:05:35,040 Speaker 3: Honestly though it's very good. 107 00:05:34,720 --> 00:05:37,160 Speaker 4: I love she need. 108 00:05:39,520 --> 00:05:41,640 Speaker 3: Given. You know, if I hadn't known it was AI, 109 00:05:41,800 --> 00:05:44,120 Speaker 3: I probably wouldn't have second guessed it at all. 110 00:05:44,360 --> 00:05:47,760 Speaker 1: It really spooked people, especially the music industry, because if 111 00:05:47,800 --> 00:05:50,520 Speaker 1: people didn't know, and worst of all, if people didn't 112 00:05:50,520 --> 00:05:53,680 Speaker 1: care that this was a fake track, what could that 113 00:05:53,720 --> 00:05:54,719 Speaker 1: mean for the whole business? 114 00:05:55,120 --> 00:05:57,840 Speaker 3: Yeah, you know, the music business has already been battered 115 00:05:57,920 --> 00:06:02,200 Speaker 3: so much in reaction first to piracy and illegal downloads, 116 00:06:02,240 --> 00:06:05,599 Speaker 3: and now the popularity of streaming platforms like Spotify. 117 00:06:05,839 --> 00:06:09,000 Speaker 1: Yeah, and this is another disruption, and potentially an unmanageable one, 118 00:06:09,200 --> 00:06:12,760 Speaker 1: because if AI music is uploaded to streaming platforms and 119 00:06:12,800 --> 00:06:15,760 Speaker 1: it's not labeled as being made by AI, the streaming 120 00:06:15,760 --> 00:06:19,240 Speaker 1: platforms themselves and of course listeners might not be any 121 00:06:19,320 --> 00:06:19,799 Speaker 1: the wiser. 122 00:06:20,040 --> 00:06:21,839 Speaker 3: But is this actually like a problem? 123 00:06:22,120 --> 00:06:22,360 Speaker 4: You know? 124 00:06:22,520 --> 00:06:25,320 Speaker 3: I wonder how much AI music is actually being like 125 00:06:25,480 --> 00:06:27,600 Speaker 3: dumped onto streaming platforms. 126 00:06:28,200 --> 00:06:33,680 Speaker 1: You wonder, And I searched the web. According to the Verge, 127 00:06:33,680 --> 00:06:36,960 Speaker 1: who spoke to the French streaming platform Deza Desa, said 128 00:06:37,000 --> 00:06:39,600 Speaker 1: that as of this April, roughly twenty percent of new 129 00:06:39,680 --> 00:06:43,000 Speaker 1: uploads every day were fully AI generated, so about twenty 130 00:06:43,080 --> 00:06:45,600 Speaker 1: thousand tracks a day and I think the other important 131 00:06:45,640 --> 00:06:48,280 Speaker 1: point here is that AI is obviously trained on what 132 00:06:48,320 --> 00:06:51,240 Speaker 1: you feed it, so these tracks probably more often than 133 00:06:51,279 --> 00:06:55,520 Speaker 1: not imitating musical ideas that perhaps should be licensed before 134 00:06:55,600 --> 00:06:57,400 Speaker 1: the song is widely distributed. 135 00:06:57,640 --> 00:06:59,080 Speaker 3: So it's kind of a money problem. 136 00:06:59,160 --> 00:07:02,839 Speaker 1: Well, yes, the money problem. It's also a fundamental human 137 00:07:02,920 --> 00:07:06,360 Speaker 1: creativity in the age of AI problem, but the money 138 00:07:06,360 --> 00:07:09,560 Speaker 1: problem makes it more urgent capitalism to the rescue of 139 00:07:09,560 --> 00:07:12,560 Speaker 1: the music business. Basically, there are multiple services now being 140 00:07:12,600 --> 00:07:15,960 Speaker 1: developed that can be integrated into streaming platform structures to 141 00:07:16,040 --> 00:07:20,480 Speaker 1: analyze one if uploaded tracks are AI, and two if 142 00:07:20,480 --> 00:07:22,960 Speaker 1: they contain so called protected elements. 143 00:07:23,200 --> 00:07:24,520 Speaker 3: So how does that work. 144 00:07:24,720 --> 00:07:26,400 Speaker 1: There's one product in particular I want to tell you about, 145 00:07:26,400 --> 00:07:29,360 Speaker 1: which is called trace ID, and it's marketed as an 146 00:07:29,440 --> 00:07:33,800 Speaker 1: AI rights management platform. Basically, the software breaks songs into 147 00:07:33,840 --> 00:07:37,600 Speaker 1: stems from vocal tone to melodic phrasing in order to 148 00:07:37,600 --> 00:07:40,720 Speaker 1: better detect mimicry, and that means that the rights holders 149 00:07:40,800 --> 00:07:43,400 Speaker 1: or the platforms can then know if a track needs 150 00:07:43,440 --> 00:07:46,200 Speaker 1: to be licensed and paid for before it's released. 151 00:07:46,640 --> 00:07:50,520 Speaker 3: So that is great for the industry, But what about me, Like, 152 00:07:51,160 --> 00:07:53,880 Speaker 3: you know, I care about infringement, but I'm not like 153 00:07:54,000 --> 00:07:57,200 Speaker 3: obsessed with it, and the thing that I'm really concerned 154 00:07:57,240 --> 00:07:58,520 Speaker 3: with is, like, I want to know if the song 155 00:07:58,520 --> 00:07:59,640 Speaker 3: that I'm listening to is real. 156 00:08:00,200 --> 00:08:02,920 Speaker 1: Taking a step back and not wanting to get too 157 00:08:03,040 --> 00:08:06,280 Speaker 1: philosophical here, but it kind of raises these other questions 158 00:08:06,320 --> 00:08:10,720 Speaker 1: about what AI music actually is because obviously you have 159 00:08:10,800 --> 00:08:14,520 Speaker 1: like fully generative AI generated tracks that are basically have 160 00:08:14,640 --> 00:08:17,320 Speaker 1: no human input. But then a lot of like normal 161 00:08:17,400 --> 00:08:21,080 Speaker 1: musicians as part of their production workflow, use AI tools 162 00:08:21,080 --> 00:08:23,080 Speaker 1: in fact, just as we do right like we use 163 00:08:23,120 --> 00:08:25,600 Speaker 1: AII editing software, we sort of use it in research. 164 00:08:25,680 --> 00:08:28,720 Speaker 1: So like there's a kind of philosophical conundrum about what 165 00:08:28,760 --> 00:08:31,840 Speaker 1: AI music actually is. But these tools we're talking about 166 00:08:31,880 --> 00:08:36,720 Speaker 1: today are really about detecting like fully synthesized full AI tracks. 167 00:08:37,160 --> 00:08:41,000 Speaker 1: In addition to these externally developed products like trace id, 168 00:08:41,120 --> 00:08:45,280 Speaker 1: these streaming platforms are also internally developing tools to scan 169 00:08:45,480 --> 00:08:48,680 Speaker 1: uploaded music and then if they detect a concentration of 170 00:08:48,679 --> 00:08:52,920 Speaker 1: synthetic elements, they can reduce the visibility of AI generated 171 00:08:52,960 --> 00:08:57,199 Speaker 1: tracks in both their algorithmic and also their editorial recommendations. 172 00:08:57,480 --> 00:09:00,000 Speaker 3: Yeah, I've actually seen people on the internet can play 173 00:09:00,360 --> 00:09:04,000 Speaker 3: that their release Radar playlist on Spotify is filled with 174 00:09:04,040 --> 00:09:06,600 Speaker 3: what they suspect to be AI music. This has not 175 00:09:06,840 --> 00:09:09,880 Speaker 3: happened to me personally, but like, it does take a 176 00:09:09,880 --> 00:09:12,720 Speaker 3: lot of work to then go to like an artist 177 00:09:12,760 --> 00:09:16,679 Speaker 3: page and see that someone has no listeners or followers 178 00:09:16,760 --> 00:09:18,439 Speaker 3: or only one or two tracks, you know, And I 179 00:09:18,760 --> 00:09:21,880 Speaker 3: guess there's hope that the actual places that are hosting 180 00:09:21,880 --> 00:09:24,880 Speaker 3: this music would would label it, label it. Yeah. 181 00:09:24,960 --> 00:09:27,240 Speaker 1: Well, the other thing is they might have an incentive 182 00:09:27,280 --> 00:09:29,800 Speaker 1: to do so, because there's this emerging body of research 183 00:09:29,880 --> 00:09:34,440 Speaker 1: that suggests when people feel that interacting with AI generated content, 184 00:09:34,880 --> 00:09:38,280 Speaker 1: that should become less engaged. There's this marketing publication called 185 00:09:38,320 --> 00:09:41,120 Speaker 1: The Drum which reported that more than fifty percent of 186 00:09:41,120 --> 00:09:44,319 Speaker 1: people check out if they believe content is AI generated. 187 00:09:44,679 --> 00:09:46,959 Speaker 1: So again there's another business intentive for the platforms to 188 00:09:47,000 --> 00:09:47,800 Speaker 1: solve this problem. 189 00:09:48,280 --> 00:09:52,240 Speaker 3: And speaking of AI generated content and interesting research, you 190 00:09:52,360 --> 00:09:55,359 Speaker 3: must have seen this the recent MIT study on CHATGPT 191 00:09:55,559 --> 00:10:00,120 Speaker 3: and critical thinking. I actually really liked the terminology that 192 00:10:00,160 --> 00:10:03,920 Speaker 3: they used in this research. The paper was titled your 193 00:10:04,000 --> 00:10:07,000 Speaker 3: Brain on Chat GPT accumulation. And this is what I 194 00:10:07,000 --> 00:10:11,360 Speaker 3: love of cognitive debt. When using an AI assistant for 195 00:10:11,520 --> 00:10:12,920 Speaker 3: essay writing task. 196 00:10:12,880 --> 00:10:15,679 Speaker 1: I love the idea of accumulating cognitive debt. That that 197 00:10:15,800 --> 00:10:18,840 Speaker 1: is such a good phrase and so familiar. I mean, 198 00:10:18,880 --> 00:10:21,520 Speaker 1: I feel like scrolling on my phone. I mean most 199 00:10:21,520 --> 00:10:23,360 Speaker 1: of what I do is accumulate cognitive debt. 200 00:10:23,480 --> 00:10:26,600 Speaker 3: We are just sacks of cognitive debt that we really 201 00:10:26,600 --> 00:10:29,040 Speaker 3: are right now. I mean people have to worry about, 202 00:10:29,320 --> 00:10:31,240 Speaker 3: you know, student debt, They have to worry about other 203 00:10:31,280 --> 00:10:34,520 Speaker 3: kinds of debt. Now we have to worry about cognitive debt. 204 00:10:34,720 --> 00:10:37,400 Speaker 1: I saw a lot of action about this research all 205 00:10:37,440 --> 00:10:38,559 Speaker 1: over my LinkedIn. 206 00:10:38,400 --> 00:10:40,800 Speaker 3: The first place I actually encountered it was in like 207 00:10:40,920 --> 00:10:43,720 Speaker 3: meme format, Like it was like a slideshow. 208 00:10:43,880 --> 00:10:45,560 Speaker 1: There are pictures of their brains, right do lighting up 209 00:10:45,600 --> 00:10:46,520 Speaker 1: in des exactly like. 210 00:10:46,559 --> 00:10:49,319 Speaker 3: I certainly did not see this on like a verified 211 00:10:49,880 --> 00:10:52,600 Speaker 3: news platform. I saw it like on a meme account 212 00:10:52,640 --> 00:10:56,040 Speaker 3: that was like, yeah, your brain's getting worse using chat GPT. 213 00:10:56,720 --> 00:10:59,680 Speaker 3: Before I explain the experiment, there are a few caveats 214 00:10:59,720 --> 00:11:04,080 Speaker 3: that the researchers themselves are eager to share. First, the 215 00:11:04,120 --> 00:11:08,040 Speaker 3: study only has fifty four subjects, which is a relatively 216 00:11:08,040 --> 00:11:11,600 Speaker 3: small sample. And second, and I think the least surprising, 217 00:11:11,679 --> 00:11:13,760 Speaker 3: is that this study has yet to be peer reviewed 218 00:11:14,160 --> 00:11:17,760 Speaker 3: by anybody other than our peers on LinkedIn and Instagram 219 00:11:17,760 --> 00:11:18,360 Speaker 3: and every. 220 00:11:18,320 --> 00:11:20,880 Speaker 1: The interinet had. 221 00:11:21,160 --> 00:11:24,280 Speaker 3: They really did. You know, it's a very buzzy concept, 222 00:11:24,559 --> 00:11:28,040 Speaker 3: this tool chat GBT that we all are using, maybe 223 00:11:28,120 --> 00:11:31,920 Speaker 3: eroding our own ability to think critically. And that's especially 224 00:11:31,960 --> 00:11:33,920 Speaker 3: alarming when you think of what this could mean for 225 00:11:34,720 --> 00:11:38,800 Speaker 3: not my brain, but developing brains. Mine has been developed, 226 00:11:38,520 --> 00:11:42,040 Speaker 3: the trade has left the station. But you know, as 227 00:11:42,080 --> 00:11:44,800 Speaker 3: we've talked about many many times on this podcast, AI 228 00:11:44,880 --> 00:11:48,320 Speaker 3: companies are really marketing themselves and I mean chat GBT, 229 00:11:48,480 --> 00:11:53,520 Speaker 3: especially to college students, and students are using chatbots like quite. 230 00:11:53,240 --> 00:11:55,800 Speaker 1: A bit, quite a bit to assist. 231 00:11:55,440 --> 00:11:57,760 Speaker 3: With or even do their homework. You know. The paper's 232 00:11:57,800 --> 00:12:01,000 Speaker 3: main author felt like her findings were alarming enough and 233 00:12:01,080 --> 00:12:03,840 Speaker 3: people are adapting to life with AI so fast. It 234 00:12:04,040 --> 00:12:07,120 Speaker 3: wasn't peer reviewed because waiting six to eight months might 235 00:12:07,160 --> 00:12:07,800 Speaker 3: be too late. 236 00:12:08,280 --> 00:12:10,920 Speaker 1: The paper's outh theory is a bone hype beast, that's right. 237 00:12:11,000 --> 00:12:12,719 Speaker 3: This is so she's like, it's got to get out 238 00:12:12,800 --> 00:12:14,640 Speaker 3: right now or it's going to be too late. 239 00:12:15,160 --> 00:12:17,840 Speaker 1: But I mean, to be fair, it does sound pretty lombing. 240 00:12:18,240 --> 00:12:22,040 Speaker 3: Yes, there are some silver linings. This is what happened 241 00:12:22,360 --> 00:12:26,280 Speaker 3: fifty four people ages eighteen to thirty nine from the 242 00:12:26,320 --> 00:12:30,240 Speaker 3: Boston area were separated into three groups. They were all 243 00:12:30,280 --> 00:12:34,920 Speaker 3: asked to write sat essays while an electro encephalogram or 244 00:12:35,000 --> 00:12:39,720 Speaker 3: EEG measured their brain activity or more specifically, the tiny 245 00:12:39,800 --> 00:12:43,160 Speaker 3: electrical signals produced by brain cells when they communicate. 246 00:12:43,600 --> 00:12:46,400 Speaker 1: And so there were three groups. What was the difference 247 00:12:46,400 --> 00:12:47,120 Speaker 1: between each group? 248 00:12:47,280 --> 00:12:51,280 Speaker 3: So each group was writing these twenty minute satsays using 249 00:12:51,400 --> 00:12:55,400 Speaker 3: slightly different tools. One group used open AI's chat GPT, 250 00:12:56,240 --> 00:13:00,040 Speaker 3: another Google search engine, and the last group didn't use. 251 00:12:59,920 --> 00:13:01,800 Speaker 1: It anything at all, nothing at all, i e. Just 252 00:13:01,880 --> 00:13:04,839 Speaker 1: the human brain, just their brain for them. 253 00:13:04,960 --> 00:13:09,280 Speaker 3: Unsurprisingly, the group that used chat GPT all delivered. This 254 00:13:09,360 --> 00:13:13,319 Speaker 3: is my favorite part similar essays. English teachers were consulted 255 00:13:13,360 --> 00:13:17,679 Speaker 3: and called them largely soulless. That is a sad indictment, 256 00:13:17,960 --> 00:13:19,720 Speaker 3: it is, and that's actually a good word for it. 257 00:13:19,720 --> 00:13:23,000 Speaker 3: Like if you read stuff that's written by chat GPT, 258 00:13:23,880 --> 00:13:28,079 Speaker 3: you're like, there's something, there's someone here, that's what the 259 00:13:28,160 --> 00:13:28,720 Speaker 3: lights are up. 260 00:13:28,840 --> 00:13:32,360 Speaker 1: Fifty two percent of consumers per the drum check out. 261 00:13:32,480 --> 00:13:34,880 Speaker 3: That feels like a low percent. But coming back to 262 00:13:34,880 --> 00:13:38,000 Speaker 3: the group we were using chat GPT, the EEG picked 263 00:13:38,040 --> 00:13:41,440 Speaker 3: up low executive control and attentional engagement in that group. 264 00:13:41,760 --> 00:13:46,600 Speaker 3: That's contrasted with the brain only group, which showed shocker 265 00:13:46,840 --> 00:13:51,640 Speaker 3: highest neural connectivity, especially in regions associated with creativity and memory. 266 00:13:51,920 --> 00:13:54,160 Speaker 1: And what about the Google Search group, of whom I 267 00:13:54,320 --> 00:13:56,200 Speaker 1: count myself a member of that cohort. 268 00:13:56,280 --> 00:14:00,280 Speaker 3: They're what we'd call mid Their brains were definitely more 269 00:14:00,320 --> 00:14:03,280 Speaker 3: active than the chat GPT groups. So they did this 270 00:14:03,360 --> 00:14:06,600 Speaker 3: experiment three times, and then the researcher switched it up 271 00:14:06,600 --> 00:14:09,440 Speaker 3: a bit. Each person was asked to rewrite one of 272 00:14:09,480 --> 00:14:13,440 Speaker 3: their previous essays, but the chat GPT group could only 273 00:14:13,600 --> 00:14:17,880 Speaker 3: use their brains, while the initial brain only group got 274 00:14:17,920 --> 00:14:21,360 Speaker 3: access to chat GPT. The people who had started with 275 00:14:21,440 --> 00:14:25,640 Speaker 3: chat GPT hardly remembered their own essays. 276 00:14:26,640 --> 00:14:27,560 Speaker 1: Remember that correct. 277 00:14:27,800 --> 00:14:30,080 Speaker 3: This reminds me of once I cheated on a physics exam. 278 00:14:30,200 --> 00:14:31,880 Speaker 3: I cheated off a friend and I just wrote the 279 00:14:31,920 --> 00:14:34,040 Speaker 3: answer without showing my work on a test, and my 280 00:14:34,080 --> 00:14:36,040 Speaker 3: teacher was just like, how did you get that answer? 281 00:14:36,080 --> 00:14:37,560 Speaker 3: It's right, but how'd you get it? And I said, 282 00:14:37,680 --> 00:14:43,720 Speaker 3: I don't know it of course, So the people who 283 00:14:43,720 --> 00:14:46,520 Speaker 3: had started with chat GPT hardly remembered their own essays. 284 00:14:47,240 --> 00:14:51,200 Speaker 3: The EEG confirmed that barely any aspects of the writing 285 00:14:51,240 --> 00:14:56,520 Speaker 3: process had integrated into people's memory networks. The brain only group, however, 286 00:14:57,400 --> 00:15:02,160 Speaker 3: exhibited a significant increase in brain connected across all EEG 287 00:15:02,320 --> 00:15:05,680 Speaker 3: frequency bands when they took a second stab at their 288 00:15:05,800 --> 00:15:09,960 Speaker 3: essays using chat GBT to help, meaning people who use 289 00:15:10,040 --> 00:15:13,640 Speaker 3: their brain displayed the most cognitive activity. 290 00:15:13,280 --> 00:15:15,000 Speaker 1: But not just people use their brain, people who use 291 00:15:15,040 --> 00:15:17,840 Speaker 1: their brain. And then a second time round pad it 292 00:15:17,880 --> 00:15:18,320 Speaker 1: with chat. 293 00:15:18,240 --> 00:15:20,040 Speaker 3: GPT people who use their brain first. 294 00:15:20,080 --> 00:15:23,720 Speaker 1: And that's pretty interesting. I mean, it kind of stands 295 00:15:23,760 --> 00:15:26,480 Speaker 1: to reason, which probably is again why it went viral, 296 00:15:26,520 --> 00:15:29,240 Speaker 1: because it kind of confirms what we might think, which 297 00:15:29,320 --> 00:15:31,760 Speaker 1: is that, like if I use my brain to come 298 00:15:31,840 --> 00:15:34,440 Speaker 1: up with an idea and then use chat GPT to 299 00:15:34,560 --> 00:15:37,240 Speaker 1: refine it and have a thought partner and a conversational 300 00:15:37,280 --> 00:15:40,200 Speaker 1: partner to improve it, like that actually probably is more 301 00:15:40,240 --> 00:15:42,480 Speaker 1: engaging for my brain than just coming up with an idea. 302 00:15:42,600 --> 00:15:44,640 Speaker 1: If I ask chat GPT to count with an idea, 303 00:15:45,000 --> 00:15:46,920 Speaker 1: that is extremely unengaging for my brain. 304 00:15:47,240 --> 00:15:50,360 Speaker 3: Correct, that is the implication, And I just want to 305 00:15:50,360 --> 00:15:52,680 Speaker 3: say one more time. This study has yet to be 306 00:15:52,680 --> 00:15:56,560 Speaker 3: peer reviewed, and additional studies will likely be done. Again, 307 00:15:57,600 --> 00:16:00,400 Speaker 3: the author felt very strongly that this should be released 308 00:16:00,440 --> 00:16:05,240 Speaker 3: early as a warning. And again, imagine what this could 309 00:16:05,280 --> 00:16:08,520 Speaker 3: mean for developing brains, meaning not us young people who 310 00:16:08,600 --> 00:16:11,680 Speaker 3: are among the first to really adapt the technology anyway, 311 00:16:12,120 --> 00:16:15,440 Speaker 3: using it as a generation tool rather than, as we might, 312 00:16:15,520 --> 00:16:18,760 Speaker 3: a refinement tool. This is the real kicker. This one 313 00:16:18,800 --> 00:16:21,880 Speaker 3: has a sense of the researcher has a sense of humor. 314 00:16:22,720 --> 00:16:26,920 Speaker 3: So she assumed that people would use AI to summarize 315 00:16:26,960 --> 00:16:30,840 Speaker 3: her paper, so she laid little AI traps all over it. 316 00:16:30,920 --> 00:16:33,200 Speaker 1: What does it mean to lay little AI traps whatever 317 00:16:33,280 --> 00:16:33,760 Speaker 1: a paper? 318 00:16:33,920 --> 00:16:36,680 Speaker 3: She did a pretty cute thing, which is that she 319 00:16:36,840 --> 00:16:40,600 Speaker 3: did things like instruct large language models to quote only 320 00:16:40,640 --> 00:16:44,200 Speaker 3: read this table below, making it so if fed into 321 00:16:44,240 --> 00:16:47,320 Speaker 3: an LM, only parts of the paper would be summarized. 322 00:16:47,560 --> 00:16:50,040 Speaker 1: So there were these like hidden problems basically exactly. 323 00:16:50,840 --> 00:16:53,040 Speaker 3: I think it was a very nice little sprinkle of 324 00:16:53,120 --> 00:16:54,040 Speaker 3: human ingenuity. 325 00:17:00,800 --> 00:17:03,520 Speaker 1: We've got a couple more headlines today, starting with the 326 00:17:03,560 --> 00:17:07,760 Speaker 1: collaboration the whole tech industry is buzzing about, of course, 327 00:17:07,840 --> 00:17:11,000 Speaker 1: I mean the one between open AI and Sir Johnny Ive, 328 00:17:11,280 --> 00:17:14,280 Speaker 1: the designer behind many of Apple's iconic products like the 329 00:17:14,320 --> 00:17:17,000 Speaker 1: iPod and the iPhone. And we know that Sam Altman 330 00:17:17,040 --> 00:17:20,440 Speaker 1: and I are now collaborating on an AI hardware startup, 331 00:17:21,000 --> 00:17:24,080 Speaker 1: and that's about all we know. Apparently they aren't making 332 00:17:24,080 --> 00:17:27,479 Speaker 1: wearables or earbuds. The device will be pocket sized and 333 00:17:27,560 --> 00:17:30,639 Speaker 1: screenless and will be a kind of interfaced layer with 334 00:17:30,720 --> 00:17:34,080 Speaker 1: the world powered by AI, and Altman and Ive are 335 00:17:34,080 --> 00:17:36,760 Speaker 1: being highly secretive about what the form factor will be, 336 00:17:37,200 --> 00:17:40,879 Speaker 1: but people are buzzing. According to Semaphore, last week at 337 00:17:40,920 --> 00:17:45,040 Speaker 1: the can Lion Advertising Festival, marketers were starting to freak 338 00:17:45,080 --> 00:17:47,679 Speaker 1: out about where they would show video ads in a 339 00:17:47,720 --> 00:17:51,840 Speaker 1: screenless world. But in the meantime, the highly anticipated project 340 00:17:51,880 --> 00:17:55,119 Speaker 1: seems to have hit a roadblock. The startup was called 341 00:17:55,400 --> 00:17:58,639 Speaker 1: Io the two letters I and O, and they've been 342 00:17:58,640 --> 00:18:01,720 Speaker 1: pushing out marketing materials over the last few weeks. But 343 00:18:01,880 --> 00:18:04,640 Speaker 1: now all mentions of Io have been scrubbed from Open 344 00:18:04,680 --> 00:18:08,080 Speaker 1: AI's website and social media channels because it turns out 345 00:18:08,080 --> 00:18:12,160 Speaker 1: there's a trademark lawsuit with another company called Io, which 346 00:18:12,200 --> 00:18:17,160 Speaker 1: is spelled IYO, that's working on voice controlled AI devices. 347 00:18:17,640 --> 00:18:20,840 Speaker 1: Sam Altman is called the lawsuit silly, but it's certainly 348 00:18:20,920 --> 00:18:24,960 Speaker 1: drama in the valley, unlikely ultimately to derail him and 349 00:18:25,080 --> 00:18:27,280 Speaker 1: Johnny Ive. Whatever they may end up with. 350 00:18:27,600 --> 00:18:30,120 Speaker 3: I would watch Drama in the Valley on Bravo. It's 351 00:18:30,160 --> 00:18:33,240 Speaker 3: the pitch yep here we go if you are a 352 00:18:33,359 --> 00:18:37,560 Speaker 3: sucker for post apocalyptic content like me. There's now a 353 00:18:37,600 --> 00:18:40,000 Speaker 3: follow up to the two thousand and one film twenty 354 00:18:40,000 --> 00:18:42,399 Speaker 3: eight Days Later, which is out this month called twenty 355 00:18:42,400 --> 00:18:46,920 Speaker 3: eight Years Later. The original film used lightweight, low resolution 356 00:18:47,080 --> 00:18:50,119 Speaker 3: Canon digital cameras, a cutting edge technology back in the 357 00:18:50,160 --> 00:18:53,040 Speaker 3: early arts, and for the follow up, director Danny Boyle, 358 00:18:53,080 --> 00:18:56,399 Speaker 3: another brit Yeah, chose to stay small and nimble with 359 00:18:56,640 --> 00:19:00,240 Speaker 3: the iPhone. He told Wired that the Apple device was 360 00:19:00,280 --> 00:19:03,680 Speaker 3: the principal camera for the film, with some caveats. Boyle 361 00:19:03,720 --> 00:19:07,560 Speaker 3: and his team ended up overriding the user friendly camera software. 362 00:19:08,119 --> 00:19:12,080 Speaker 3: The iPhone's camera automatically focuses on whatever it assumes is 363 00:19:12,119 --> 00:19:14,840 Speaker 3: the focus of your photo or video, but that's not 364 00:19:14,880 --> 00:19:17,560 Speaker 3: always what you want in a movie, so they essentially 365 00:19:17,720 --> 00:19:22,840 Speaker 3: hacked the iPhones to remove the auto focus. Also, most 366 00:19:22,840 --> 00:19:25,960 Speaker 3: of the time it wasn't just a cinematographer holding an iPhone. 367 00:19:26,080 --> 00:19:29,920 Speaker 3: The production used a massive rig that supported twenty iPhone, 368 00:19:29,960 --> 00:19:34,280 Speaker 3: fifteen Promax cameras my dream all with special accessories. So 369 00:19:34,320 --> 00:19:37,719 Speaker 3: that's twenty different angles on the action being filmed. 370 00:19:48,119 --> 00:19:49,560 Speaker 1: We're going to take a quick break, but when we 371 00:19:49,600 --> 00:19:53,560 Speaker 1: come back, there is a tentative ceasefire between Iran and Israel, 372 00:19:54,119 --> 00:20:10,439 Speaker 1: but does that include cyberwarfare? Stay with us, Welcome back 373 00:20:10,440 --> 00:20:12,680 Speaker 1: to tech Stuff. We want to spend some time talking 374 00:20:12,720 --> 00:20:16,119 Speaker 1: about the conflict between Israel and Iran. Much of the 375 00:20:16,119 --> 00:20:19,320 Speaker 1: battle has played out in public. Missile and drone attacks 376 00:20:19,359 --> 00:20:23,440 Speaker 1: have caused mass casualties in major cities and hospitals. Military 377 00:20:23,480 --> 00:20:27,199 Speaker 1: bases and nuclear sites have all been targeted. As of 378 00:20:27,240 --> 00:20:30,840 Speaker 1: this taping on Wednesday morning, a ceasefire seems to be holding, 379 00:20:31,640 --> 00:20:34,000 Speaker 1: but today we want to shed light on a murkier, 380 00:20:34,440 --> 00:20:38,119 Speaker 1: often invisible act of warfare, and one that's likely to 381 00:20:38,160 --> 00:20:42,800 Speaker 1: continue well after the missiles cease Hittelbos understand how the 382 00:20:42,840 --> 00:20:45,760 Speaker 1: conflict between Israel and Iran is playing out in cyberspace 383 00:20:46,119 --> 00:20:49,280 Speaker 1: and how it might ultimately affect the US is Maggie Miller, 384 00:20:49,400 --> 00:20:52,720 Speaker 1: a cyber security reporter for Politico. Maggie, welcome to tech Stuff. 385 00:20:52,720 --> 00:20:54,760 Speaker 4: Thank you so much for having me, either of. 386 00:20:54,680 --> 00:20:56,919 Speaker 1: This taping or Wednesday morning. The ceasefire is an effect, 387 00:20:56,960 --> 00:20:59,200 Speaker 1: but of course by Friday morning things may be different. 388 00:21:00,040 --> 00:21:02,440 Speaker 1: Give US a bit of the background on both Israel 389 00:21:02,600 --> 00:21:05,760 Speaker 1: and Iran, what their cyber capabilities are, and how much 390 00:21:05,800 --> 00:21:07,879 Speaker 1: attention you were paying to those two countries in the 391 00:21:07,880 --> 00:21:10,160 Speaker 1: cyber realm before this conflict started. 392 00:21:10,680 --> 00:21:14,400 Speaker 2: So Israel has seen pretty widely as one of the 393 00:21:14,440 --> 00:21:16,560 Speaker 2: most advanced in the world in terms. 394 00:21:16,359 --> 00:21:18,160 Speaker 4: Of the cyber capabilities as a. 395 00:21:18,119 --> 00:21:21,879 Speaker 2: Government, but also in terms of the industry experts that 396 00:21:21,880 --> 00:21:24,240 Speaker 2: they have in their country. Tel Aviv is a hub 397 00:21:24,280 --> 00:21:27,640 Speaker 2: of a lot of cybersecurity companies, so they have very 398 00:21:27,680 --> 00:21:31,200 Speaker 2: formidable cyber capabilities. We've also seen them brought to bear. 399 00:21:31,320 --> 00:21:34,240 Speaker 2: There was a cyber element involved in the explosion of 400 00:21:34,240 --> 00:21:38,679 Speaker 2: the pagers used by Hesbolah operatives in Lebanon in recent 401 00:21:38,720 --> 00:21:42,320 Speaker 2: months that was tied to the Israeli government. So anytime 402 00:21:42,359 --> 00:21:45,560 Speaker 2: there's a conflict with Israel, especially when it's being supported 403 00:21:45,640 --> 00:21:50,119 Speaker 2: by the US, which also has pretty formidable cyber attack capabilities, 404 00:21:50,320 --> 00:21:52,560 Speaker 2: you're going to keep an eye on it. Iran also 405 00:21:52,720 --> 00:21:56,840 Speaker 2: has had a history of integrating cyber attacks into its efforts, 406 00:21:57,240 --> 00:22:01,679 Speaker 2: sometimes lower level but still quite impactful. For example, in 407 00:22:01,720 --> 00:22:05,360 Speaker 2: the US, we saw in the weeks after the Hamas 408 00:22:05,400 --> 00:22:09,160 Speaker 2: attack on October seventh, twenty twenty three, onto Israel, there 409 00:22:09,240 --> 00:22:13,560 Speaker 2: were pro Iranian hackers. Sometimes Iran can operate also through 410 00:22:13,560 --> 00:22:17,080 Speaker 2: proxy groups as well as from the government, but we 411 00:22:17,160 --> 00:22:20,800 Speaker 2: saw at least one pro Iranian group hack into multiple 412 00:22:20,920 --> 00:22:25,000 Speaker 2: US water treatment facilities. Target has rarely made equipment in 413 00:22:25,119 --> 00:22:29,000 Speaker 2: order to basically deface it with a message against Israel 414 00:22:29,160 --> 00:22:31,840 Speaker 2: and really send a message that hey, you know, this 415 00:22:31,920 --> 00:22:35,159 Speaker 2: might be a small water facility in rural Pennsylvania, but 416 00:22:35,240 --> 00:22:38,680 Speaker 2: we can still cause damage. And so Iran has always 417 00:22:38,720 --> 00:22:40,880 Speaker 2: been one that the US has kept a close eye 418 00:22:40,920 --> 00:22:44,440 Speaker 2: on in cyberspace. So I think, you know, to emphasize, 419 00:22:44,480 --> 00:22:48,600 Speaker 2: all three nations involved here quite formidable and have demonstrated 420 00:22:48,600 --> 00:22:51,600 Speaker 2: these cyber attack capabilities in the past. 421 00:22:51,880 --> 00:22:55,560 Speaker 1: How much did this cyber conflict between Israel and Iran 422 00:22:56,040 --> 00:22:59,880 Speaker 1: sort of take off after the twenty twenty three Hamas attacks. 423 00:23:00,320 --> 00:23:02,960 Speaker 1: Was that a kind of a turning point or where 424 00:23:02,960 --> 00:23:05,200 Speaker 1: do you trace this current phase of escalation too? 425 00:23:05,480 --> 00:23:05,760 Speaker 3: Well? 426 00:23:05,800 --> 00:23:08,560 Speaker 2: In terms of the escalation that we've seen in the 427 00:23:08,560 --> 00:23:11,800 Speaker 2: past week, there of course was a pickup after the 428 00:23:12,040 --> 00:23:15,760 Speaker 2: initial I believe it was June thirteenth strikes by Israel 429 00:23:15,840 --> 00:23:19,480 Speaker 2: against Iran, but there has been a heightened amount of 430 00:23:19,760 --> 00:23:23,879 Speaker 2: cyber threats, cyber attacks between the two nations since October seventh. 431 00:23:24,640 --> 00:23:29,280 Speaker 2: Both Hamas and Hezbula are very much affiliated, supported by 432 00:23:29,400 --> 00:23:33,040 Speaker 2: the Iranian government, often have served as proxies. Both have 433 00:23:33,119 --> 00:23:36,480 Speaker 2: been somewhat knocked offline due to a lot of the 434 00:23:36,520 --> 00:23:40,200 Speaker 2: Israeli attacks against both groups in the last two years, 435 00:23:40,760 --> 00:23:43,760 Speaker 2: but they did both carry out some cyber operations also 436 00:23:44,000 --> 00:23:48,040 Speaker 2: spearhead a lot of disinformation online. That's another big effort 437 00:23:48,200 --> 00:23:51,600 Speaker 2: by Iran. We've seen just in the past week, for example, 438 00:23:51,920 --> 00:23:54,680 Speaker 2: messages either of trace to the Iranian government or to 439 00:23:54,760 --> 00:23:59,480 Speaker 2: proxies being sent to Israeli phone saying, for example, oh 440 00:23:59,520 --> 00:24:02,560 Speaker 2: you don't need to go to the shelters during this bombing, 441 00:24:02,760 --> 00:24:07,200 Speaker 2: you can stay outside, or another case's messages with links 442 00:24:07,280 --> 00:24:11,560 Speaker 2: to try to gain information from Israelis. So it really 443 00:24:11,600 --> 00:24:13,919 Speaker 2: has escalated, i would say, in the past week, but 444 00:24:14,240 --> 00:24:16,520 Speaker 2: has been a steady clip since October seven. 445 00:24:17,040 --> 00:24:21,399 Speaker 1: Do you see misinformation and disinformation as a type of 446 00:24:21,480 --> 00:24:23,600 Speaker 1: cyber attach or type of cyber warfare or as a 447 00:24:23,720 --> 00:24:26,600 Speaker 1: separate category as far as your reporting goes. 448 00:24:26,560 --> 00:24:30,480 Speaker 2: They're often linked in that. Of course, one does not 449 00:24:30,680 --> 00:24:34,399 Speaker 2: involve hacking into any sort of system or operation, but 450 00:24:34,560 --> 00:24:39,159 Speaker 2: it does involve changing a perception and using social media 451 00:24:39,240 --> 00:24:44,560 Speaker 2: often or for example malicious texts or calls, etc. And 452 00:24:44,640 --> 00:24:47,720 Speaker 2: it's an even more I would almost say, at times 453 00:24:47,840 --> 00:24:52,560 Speaker 2: more effective way of changing perceptions and causing chaos and 454 00:24:52,600 --> 00:24:56,280 Speaker 2: causing panic because you don't necessarily know who to trust, 455 00:24:56,680 --> 00:24:59,240 Speaker 2: and especially in this day and age where I think 456 00:24:59,680 --> 00:25:03,159 Speaker 2: less less people may be understanding who to trust, I 457 00:25:03,160 --> 00:25:05,880 Speaker 2: think it's an even more potent avenue. 458 00:25:06,560 --> 00:25:10,800 Speaker 1: How effective have Iran and Israel's cyber attacks on one another, 459 00:25:10,920 --> 00:25:13,520 Speaker 1: bin I mean, how much have they changed the face 460 00:25:13,520 --> 00:25:14,280 Speaker 1: of this conflict? 461 00:25:14,800 --> 00:25:16,639 Speaker 4: There has been certainly some effect. 462 00:25:16,840 --> 00:25:19,879 Speaker 2: So an example, and I've cited this in my reporting, 463 00:25:20,040 --> 00:25:23,320 Speaker 2: is there have been multiple major cyber attacks on Iranian 464 00:25:23,320 --> 00:25:25,119 Speaker 2: banks in the past week and a half that have 465 00:25:25,240 --> 00:25:26,800 Speaker 2: been linked to at. 466 00:25:26,640 --> 00:25:28,480 Speaker 4: The least pro Israeli groups. 467 00:25:28,560 --> 00:25:31,080 Speaker 2: You know, there's often a lot of cyber criminal or 468 00:25:31,119 --> 00:25:34,040 Speaker 2: activist groups in the world that may not be officially 469 00:25:34,200 --> 00:25:37,280 Speaker 2: affiliated with a government, but maybe that government wouldn't mind 470 00:25:37,400 --> 00:25:40,879 Speaker 2: their work. But there have been strikes against specific Iranian 471 00:25:40,880 --> 00:25:44,159 Speaker 2: banks designed to make it more difficult for Iranians to 472 00:25:44,400 --> 00:25:46,960 Speaker 2: access their funds, to access their accounts, and I think 473 00:25:47,000 --> 00:25:49,800 Speaker 2: critically to cause chaos. But on the flip side, there's 474 00:25:49,840 --> 00:25:52,840 Speaker 2: also been a tax linked to Iran and Israel. So 475 00:25:53,440 --> 00:25:57,400 Speaker 2: just a few days ago, Israel's cyber security agency put 476 00:25:57,440 --> 00:26:00,840 Speaker 2: out a warning that a lot of Israeli should disable 477 00:26:00,880 --> 00:26:04,360 Speaker 2: some of their home surveillance cameras because it was actually 478 00:26:04,440 --> 00:26:07,800 Speaker 2: being seen as a target used by Iran or pro 479 00:26:07,840 --> 00:26:11,840 Speaker 2: Iranian hackers trying to gather intelligence and gather real time 480 00:26:11,960 --> 00:26:14,440 Speaker 2: data on what was happening in the country. As I said, 481 00:26:14,480 --> 00:26:17,960 Speaker 2: there's also been a lot a huge ramp up of 482 00:26:18,359 --> 00:26:23,960 Speaker 2: phone messaging, emails, etc. From Iran targeting Israeli's designed to 483 00:26:24,240 --> 00:26:28,480 Speaker 2: either spread disinformation or in some cases trying to collect 484 00:26:28,480 --> 00:26:31,960 Speaker 2: information and data on Israelis or Israelis abroad. 485 00:26:32,480 --> 00:26:33,560 Speaker 4: So it certainly has. 486 00:26:33,440 --> 00:26:37,840 Speaker 2: Been an active campaign in cyberspace. And I think one 487 00:26:37,880 --> 00:26:41,120 Speaker 2: of the main questions I have is given this tenuous 488 00:26:41,119 --> 00:26:45,520 Speaker 2: ceasefire as of Wednesday while we're recording this. Often ceasefires, 489 00:26:45,520 --> 00:26:48,200 Speaker 2: of course mean physical strikes, and a lot of times 490 00:26:48,240 --> 00:26:51,720 Speaker 2: in the world, there isn't really a definition for what 491 00:26:51,760 --> 00:26:55,560 Speaker 2: that means in cyberspace, and we'll see if there really 492 00:26:55,640 --> 00:26:58,879 Speaker 2: is much of a ceasefire in the digital realm. 493 00:26:59,240 --> 00:27:01,920 Speaker 1: And as I like two types of cyber attacks, but 494 00:27:02,080 --> 00:27:05,439 Speaker 1: one is designed to kind of sow chaos but not 495 00:27:05,600 --> 00:27:08,200 Speaker 1: be so destructive as to be an act of war. 496 00:27:08,680 --> 00:27:11,879 Speaker 1: And another is like to knock out some specific radars 497 00:27:11,960 --> 00:27:15,200 Speaker 1: so that like military planes can most successfully bomb without 498 00:27:15,280 --> 00:27:17,879 Speaker 1: any risk of being hit back. Like how much of 499 00:27:17,960 --> 00:27:20,919 Speaker 1: this is in direct coordination with military and like so 500 00:27:21,000 --> 00:27:23,399 Speaker 1: called kinetic strikes, and how much of it is like 501 00:27:23,480 --> 00:27:26,760 Speaker 1: more low grade social erosion chaos causing. 502 00:27:27,000 --> 00:27:29,639 Speaker 4: No I think there's obviously different levels. 503 00:27:29,680 --> 00:27:32,560 Speaker 2: It's not necessarily going to be different vectors of how 504 00:27:32,600 --> 00:27:35,320 Speaker 2: the attack is carried out, but there is. I would think, 505 00:27:35,400 --> 00:27:39,040 Speaker 2: just as with physical strikes, conversations often about Okay, how 506 00:27:39,119 --> 00:27:39,560 Speaker 2: far do. 507 00:27:39,480 --> 00:27:40,239 Speaker 4: We want to take this? 508 00:27:40,440 --> 00:27:44,520 Speaker 2: You know, disabling a major bank is going to cause confusion, 509 00:27:44,680 --> 00:27:46,959 Speaker 2: and I'm sure you're not going to make any friends 510 00:27:46,960 --> 00:27:49,800 Speaker 2: by doing that, but ultimately you're not necessarily going to 511 00:27:49,840 --> 00:27:54,440 Speaker 2: cause death, You're not necessarily going to cause widespread military impact, 512 00:27:54,720 --> 00:27:57,760 Speaker 2: but you know, something like for example, in the beginning 513 00:27:57,840 --> 00:28:00,640 Speaker 2: of twenty twenty two, when we saw the full invasion 514 00:28:00,640 --> 00:28:03,520 Speaker 2: of Ukraine, one of the first things that happened was 515 00:28:03,600 --> 00:28:07,880 Speaker 2: a Russian government linked cyber attack on a major satellite company. 516 00:28:07,560 --> 00:28:11,360 Speaker 4: Called Viasat, which was very a key to. 517 00:28:11,880 --> 00:28:16,240 Speaker 2: Ukrainian military communications, and that was very largely disabled and 518 00:28:16,280 --> 00:28:19,440 Speaker 2: taken out just before Russian troops pored over the border 519 00:28:19,480 --> 00:28:22,560 Speaker 2: into Ukraine, and that was I would think very much 520 00:28:22,600 --> 00:28:26,399 Speaker 2: coordinated with the government, definitely caused a lot of problems 521 00:28:26,440 --> 00:28:30,240 Speaker 2: with Ukrainians being able to communicate briefly in cyber space. 522 00:28:30,280 --> 00:28:33,600 Speaker 2: There isn't really an internationally regarded red line in the 523 00:28:33,640 --> 00:28:37,200 Speaker 2: sand of this to go to war. However, I always 524 00:28:37,240 --> 00:28:41,080 Speaker 2: point out, of course Israel and Iran are not in NATO, 525 00:28:41,240 --> 00:28:45,080 Speaker 2: but within the NATO Block, Article five is an effect 526 00:28:45,240 --> 00:28:48,800 Speaker 2: and that ensures that if there is a specific act 527 00:28:48,800 --> 00:28:51,280 Speaker 2: of war against a member, that I'll go to war. 528 00:28:51,720 --> 00:28:54,760 Speaker 2: And actually it has been revamped to include cyber attacks. 529 00:28:55,200 --> 00:28:57,920 Speaker 2: Now there isn't really a definition though of what that means, 530 00:28:57,960 --> 00:29:01,440 Speaker 2: and what's been explained to me is kind of of anything. 531 00:29:01,120 --> 00:29:02,719 Speaker 4: That causes widespread deaths. 532 00:29:02,760 --> 00:29:06,760 Speaker 2: So for example, if you saw power offline and the 533 00:29:06,760 --> 00:29:09,120 Speaker 2: dead of winter for a couple of weeks and you 534 00:29:09,160 --> 00:29:11,120 Speaker 2: saw people dying because they didn't have heat. 535 00:29:12,000 --> 00:29:15,680 Speaker 1: We've talked a lot about cyber offense, what about defense? 536 00:29:15,760 --> 00:29:18,360 Speaker 1: I mean I read that Iran actually took most of 537 00:29:18,360 --> 00:29:21,680 Speaker 1: the Internet offline partly in response to cyber attacks, But 538 00:29:22,200 --> 00:29:24,240 Speaker 1: you tell us a bit about that, and what are 539 00:29:24,280 --> 00:29:26,960 Speaker 1: some of the other, maybe less dramatic defense tactics that 540 00:29:27,240 --> 00:29:28,720 Speaker 1: Israel and Iran a both employing. 541 00:29:28,960 --> 00:29:32,440 Speaker 2: Yes, So with Iran this is several days ago. I 542 00:29:32,480 --> 00:29:34,960 Speaker 2: think it may be around a week as of this recording. 543 00:29:35,360 --> 00:29:38,840 Speaker 2: Most of the country was taken offline into a semi 544 00:29:38,880 --> 00:29:41,800 Speaker 2: blackout by the government and one of the reasons given 545 00:29:42,000 --> 00:29:46,040 Speaker 2: was the need to defend against Israeli linked cyber attacks. 546 00:29:46,120 --> 00:29:49,520 Speaker 2: Now I have spoken to experts who are a little 547 00:29:49,520 --> 00:29:53,000 Speaker 2: bit skeptical of that claim. It's also a moment of 548 00:29:53,280 --> 00:29:55,800 Speaker 2: a lot of fear by the administration in Iran about 549 00:29:55,840 --> 00:29:58,400 Speaker 2: their future, So it also may have to do with 550 00:29:58,520 --> 00:30:01,480 Speaker 2: simply controlling the messaging. But it is true that they 551 00:30:01,760 --> 00:30:04,960 Speaker 2: certainly are worried about cyber attacks. As I mentioned earlier, 552 00:30:05,120 --> 00:30:08,600 Speaker 2: the pager incident in Lebanon, I think has really changed 553 00:30:08,640 --> 00:30:12,560 Speaker 2: the game in terms of concerns. Of course, it involved explosives, 554 00:30:12,640 --> 00:30:16,640 Speaker 2: but it was triggered remotely, very kind of sophisticated attack, 555 00:30:16,880 --> 00:30:20,040 Speaker 2: and as a result, another effort that Aroan is put 556 00:30:20,120 --> 00:30:25,000 Speaker 2: in places. They've told basically all government cybersecurity officials any 557 00:30:25,040 --> 00:30:29,120 Speaker 2: of their staff to pretty much stop using Internet connected 558 00:30:29,120 --> 00:30:33,160 Speaker 2: devices as much as possible. Obviously, if you're working in 559 00:30:33,240 --> 00:30:36,200 Speaker 2: hacking capabilities, there's going to be certain devices you have 560 00:30:36,280 --> 00:30:36,680 Speaker 2: to use. 561 00:30:37,000 --> 00:30:38,000 Speaker 4: But as I said, what. 562 00:30:38,000 --> 00:30:41,800 Speaker 2: Happened in Lebanon really changed the game in terms of concerns. 563 00:30:42,000 --> 00:30:44,600 Speaker 2: Of course, Israel, as I mentioned, they're very vocal. They're 564 00:30:44,640 --> 00:30:47,720 Speaker 2: cyber agency warning civilians about a lot of these waves 565 00:30:47,760 --> 00:30:50,520 Speaker 2: of messages about not clicking on certain links sent to them, 566 00:30:50,800 --> 00:30:53,040 Speaker 2: about disabling home surveillance cameras. 567 00:30:53,480 --> 00:30:55,920 Speaker 4: I do think though that Israelis. 568 00:30:55,440 --> 00:30:58,440 Speaker 2: Are, in terms of digital threats, always on a bit 569 00:30:58,480 --> 00:31:01,680 Speaker 2: of a higher putting hire alert. It is a very 570 00:31:01,680 --> 00:31:04,600 Speaker 2: digitally interconnected society, as I mentioned, one of the most 571 00:31:04,640 --> 00:31:08,960 Speaker 2: advanced in cyberspace, and there are certainly no strangers to 572 00:31:09,200 --> 00:31:09,920 Speaker 2: threats like this. 573 00:31:10,560 --> 00:31:12,320 Speaker 1: Let's talk about the US for remote, because you wrote 574 00:31:12,360 --> 00:31:15,640 Speaker 1: this story recently with the headline US critical networks are 575 00:31:15,640 --> 00:31:16,880 Speaker 1: prime targets. 576 00:31:16,440 --> 00:31:17,400 Speaker 4: For cyber attacks. 577 00:31:17,680 --> 00:31:21,440 Speaker 1: They're preparing for Iran to strike. So as of Wednesday, again, 578 00:31:21,960 --> 00:31:23,560 Speaker 1: did that happen in terms of. 579 00:31:23,520 --> 00:31:27,000 Speaker 2: A major attack, No, But at moments of any sort 580 00:31:27,040 --> 00:31:30,400 Speaker 2: of geopolitical tension like this, especially when the US in 581 00:31:30,440 --> 00:31:32,720 Speaker 2: this case has directly. 582 00:31:32,280 --> 00:31:34,720 Speaker 4: Weighted in by hitting Iranian. 583 00:31:34,280 --> 00:31:38,760 Speaker 2: Nuclear sites, a lot of US critical infrastructure owners operators, 584 00:31:38,800 --> 00:31:41,360 Speaker 2: and when I say that, I mean everything from those 585 00:31:41,440 --> 00:31:44,959 Speaker 2: that operate the electric grid across the country, that operate 586 00:31:45,040 --> 00:31:51,440 Speaker 2: water treatment facilities, hospitals, educational facilities, a lot of different sectors. 587 00:31:51,520 --> 00:31:53,240 Speaker 2: Of course, they're going to be on a bit of 588 00:31:53,280 --> 00:31:57,000 Speaker 2: higher alert because, as I said, you know, cyber attacks 589 00:31:57,080 --> 00:32:01,120 Speaker 2: are not really clearly defined as an active war, or 590 00:32:01,160 --> 00:32:03,719 Speaker 2: if they are, how far you have to go. And 591 00:32:03,760 --> 00:32:06,520 Speaker 2: it's a very cheap but very effective way of getting 592 00:32:06,520 --> 00:32:09,680 Speaker 2: a message across. Say, all of a sudden, the water 593 00:32:09,840 --> 00:32:12,320 Speaker 2: supply is compromised in a city in America. I mean 594 00:32:12,400 --> 00:32:15,680 Speaker 2: that's pretty effective in terms of messaging. Same with oh 595 00:32:15,720 --> 00:32:19,160 Speaker 2: the lights went out in this major city for who 596 00:32:19,160 --> 00:32:21,640 Speaker 2: knows how long. So you know, I think a lot 597 00:32:21,680 --> 00:32:25,200 Speaker 2: of these organizations are simply on a higher setting in 598 00:32:25,280 --> 00:32:27,760 Speaker 2: terms of what they're watching for. And a lot of 599 00:32:27,760 --> 00:32:30,600 Speaker 2: times I like to emphasize whenever I talk about these 600 00:32:30,640 --> 00:32:33,760 Speaker 2: types of threats, probably more than about ninety five percent 601 00:32:33,800 --> 00:32:38,320 Speaker 2: of all successful cyber attacks, it's not something very sophisticated. 602 00:32:38,480 --> 00:32:40,800 Speaker 2: It's not, you know, something that a government was planning 603 00:32:40,840 --> 00:32:44,000 Speaker 2: for years. It can be something as small as oh, 604 00:32:44,000 --> 00:32:45,040 Speaker 2: this email came through. 605 00:32:45,120 --> 00:32:46,040 Speaker 4: It looked legitimate. 606 00:32:46,120 --> 00:32:48,640 Speaker 2: I clicked this link all of a sudden, you know, 607 00:32:48,720 --> 00:32:51,360 Speaker 2: for example, you click that on your work email, your 608 00:32:51,400 --> 00:32:54,960 Speaker 2: email is compromised. Through that, they're able to compromise other 609 00:32:55,000 --> 00:32:57,400 Speaker 2: accounts and kind of move through the IT network. 610 00:32:57,840 --> 00:33:00,000 Speaker 1: But I guess my final question to you, Migi is, 611 00:33:00,120 --> 00:33:02,680 Speaker 1: as you look to the days ahead, what are you 612 00:33:02,800 --> 00:33:05,000 Speaker 1: expecting to see, What are you watching out for. What 613 00:33:05,040 --> 00:33:08,160 Speaker 1: are your sources telling you might be most interesting in 614 00:33:08,200 --> 00:33:11,560 Speaker 1: the Iran Israel conflict in the realm of cyber. 615 00:33:11,480 --> 00:33:14,120 Speaker 2: I think again to emphasize that while there may be 616 00:33:14,120 --> 00:33:18,320 Speaker 2: a ceasefire in terms of physical missile strikes, it really 617 00:33:18,400 --> 00:33:21,160 Speaker 2: doesn't say anything in the ceasefire about what. 618 00:33:20,960 --> 00:33:22,360 Speaker 4: The digital space will look like. 619 00:33:22,560 --> 00:33:25,720 Speaker 2: So it will be very interesting to see if we 620 00:33:25,840 --> 00:33:29,200 Speaker 2: continue to see more low level I would say threats 621 00:33:29,200 --> 00:33:32,520 Speaker 2: in terms of disinformation, in terms of you know, maybe 622 00:33:32,560 --> 00:33:36,400 Speaker 2: targeting of Israeli or Iranian organizations that are critical to 623 00:33:36,480 --> 00:33:39,719 Speaker 2: day to day life but aren't necessarily going to cause deaths, 624 00:33:39,760 --> 00:33:42,080 Speaker 2: such as, you know, disabling a bank for another day, 625 00:33:42,560 --> 00:33:44,840 Speaker 2: or if we see that go up as a result 626 00:33:44,920 --> 00:33:48,120 Speaker 2: of kind of having their hands tied on either side 627 00:33:48,200 --> 00:33:52,320 Speaker 2: being able to drop missiles, you know, kinetic attacks, and 628 00:33:52,360 --> 00:33:54,840 Speaker 2: the fact that the international community seems to be a 629 00:33:54,880 --> 00:33:56,840 Speaker 2: bit frozen when it comes to cyber attacks. 630 00:33:56,880 --> 00:33:58,000 Speaker 4: So, you know, if we. 631 00:33:58,000 --> 00:34:01,400 Speaker 2: See major threats to hospital for example in either nation, 632 00:34:01,680 --> 00:34:05,360 Speaker 2: threats to the grid, threats to any of the really 633 00:34:05,400 --> 00:34:07,400 Speaker 2: really critical groups. 634 00:34:07,000 --> 00:34:09,800 Speaker 4: So that would certainly be extremely interesting. 635 00:34:09,840 --> 00:34:12,360 Speaker 2: If I saw that, I would hope as a citizen 636 00:34:12,400 --> 00:34:15,000 Speaker 2: of the world, that we would see more restraint. 637 00:34:15,120 --> 00:34:16,000 Speaker 4: But we will see. 638 00:34:16,239 --> 00:34:18,279 Speaker 2: And also, I think, you know it's been emphasized to me, 639 00:34:18,560 --> 00:34:22,239 Speaker 2: is that when it comes to retaliation, Iran especially likes 640 00:34:22,239 --> 00:34:25,120 Speaker 2: to play the longer game. So even if we don't 641 00:34:25,160 --> 00:34:28,319 Speaker 2: necessarily see major cyber strikes on the US this week, 642 00:34:28,360 --> 00:34:30,640 Speaker 2: that doesn't mean we might not see one a year 643 00:34:30,680 --> 00:34:33,880 Speaker 2: from now that has been extensively planned out. It really 644 00:34:33,920 --> 00:34:35,799 Speaker 2: is something that you can never really take your eye 645 00:34:35,840 --> 00:34:38,799 Speaker 2: off the ball, especially with Iran. So we will have 646 00:34:38,840 --> 00:34:41,200 Speaker 2: to see what happens in the years and weeks to come. 647 00:34:41,239 --> 00:34:43,759 Speaker 2: But I would be very surprised if in the coming 648 00:34:43,840 --> 00:34:46,279 Speaker 2: days it was completely dead and nothing happening in the 649 00:34:46,280 --> 00:34:48,759 Speaker 2: digital realm. 650 00:34:48,920 --> 00:34:50,839 Speaker 1: Mate, thank you so much for joining us today, Thank 651 00:34:50,920 --> 00:34:51,600 Speaker 1: you for having me. 652 00:35:03,800 --> 00:35:06,040 Speaker 3: That's it for this week for tech stuff. I'm Kara 653 00:35:06,080 --> 00:35:07,440 Speaker 3: Price and I'm mos Valoshin. 654 00:35:07,680 --> 00:35:11,120 Speaker 1: This episode was produced by Eliza Dennis and Adriana Topia. 655 00:35:11,320 --> 00:35:14,200 Speaker 1: It was executive produced by me Kara Price and Kate 656 00:35:14,239 --> 00:35:18,360 Speaker 1: Osborne for Kaleidoscope and Katrina Novel for iHeart Podcasts. The 657 00:35:18,440 --> 00:35:22,879 Speaker 1: engineer is Elvira Gutierrez at CDM Studios. Jack Insley makes 658 00:35:22,920 --> 00:35:25,520 Speaker 1: this episode and Kyle Murdoch wrote our theme song. 659 00:35:25,800 --> 00:35:28,480 Speaker 3: Join us next Wednesday for tex Stuff The Story when 660 00:35:28,480 --> 00:35:31,160 Speaker 3: we will share an in depth conversation with author Jahini 661 00:35:31,239 --> 00:35:34,279 Speaker 3: Vara about how the Internet has shaped us as individuals. 662 00:35:34,520 --> 00:35:37,040 Speaker 1: Please rate, review, and reach out to us at tech 663 00:35:37,080 --> 00:35:40,040 Speaker 1: Stuff podcast at gmail dot com. We love hearing from 664 00:35:40,080 --> 00:35:40,160 Speaker 1: you