Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday, May thirty-first, twenty twenty-four, and let's start off with some news about TikTok.

The Independent reports that part of TikTok's Project Texas plan (Project Texas was the company's attempt to reassure American politicians that TikTok isn't a data funnel that leads directly to China) was to give the US government a remarkable amount of control and oversight. So, for one thing, it would have let federal officials select some of the board members for TikTok. The government would be given access to TikTok's source code to look for evidence of backdoor access and that sort of thing. And apparently there even would have been a kill switch feature built into TikTok should someone determine that it was serving as some sort of insidious tool belonging to a foreign adversary. But the White House rejected this plan and said it would not be sufficient to address national security concerns. Now, I have only read the reporting around this plan. I have not actually read the full details of the plan itself, so I don't really have any more insight into this. But we all know what actually happened instead, right? Congress passed a law that, if it holds up to TikTok's legal challenges, will ultimately force TikTok to either separate entirely from its Chinese parent company, ByteDance, or face a nationwide ban in the United States.

This brings me to the next TikTok story. Reuters reports that TikTok has secretly been working on creating a recommendation algorithm that would be completely independent from the one that ByteDance uses for the sister app Douyin, which is, you know, the Chinese variant of TikTok, or you could argue TikTok's the American variant of Douyin.
The implication is that this is a potential preparation in the event that TikTok is forced to separate from the mothership. Reuters cites unnamed sources who say the project is massive and could take a year or so to complete. These sources claim that TikTok executives have talked about the project in all-hands meetings and such, but TikTok representatives have disputed Reuters' report and said it was, quote, misleading and factually inaccurate, end quote, and that divestiture from ByteDance is, quote, simply not possible, end quote. Where the truth lies, I don't know. I have no doubt that divesting TikTok would be really challenging. I do not think it would be impossible. It might be very difficult, and it might also mean that the effort to do so would cost so much in resources that economically it's not viable. So maybe, you know, from an economic standpoint, you could say, yeah, it's not possible, but I don't think that it's technologically impossible. And it is also within the realm of possibility that TikTok representatives are just saying this, because if they were to admit otherwise, that the company is working on this independent algorithm, that could potentially give the US government a bit more leverage to say, well, you're already making preparations, so there's no problem here. But as I mentioned earlier, TikTok is already suing to challenge this law, and this matter is far from settled. So my guess is the US recommendation algorithm, if in fact it really does exist and Reuters' sources are truthful, is the contingency plan. I think that's TikTok preparing for the worst-case scenario for the company, the worst case being that it actually is forced to divest itself, or for ByteDance, rather, to divest itself of TikTok.

A security firm called Black Lotus Labs has a report that might explain a massive technological failing that happened last year. Back in October of twenty twenty-three, more than half a million customers of the internet service provider Windstream lost service. So what was the problem?
Well, the customers found that their routers had become bricked, though they didn't necessarily know that's what had happened. Some of them did, but most people were probably just thinking, my Internet don't work no more. But it meant that like six hundred thousand customers or so were without Internet service, and that's not good. Eventually Windstream would send out replacement routers, once it finally came to the conclusion that, yeah, it was the routers that failed and it was not the fault of the customers. As far as anyone can tell, these failures happened over the course of three days. So what can make so many routers fail in such a short time? Well, according to Black Lotus Labs, it was malware.

Now, I should add that, as Dan Goodin of Ars Technica reports, Black Lotus Labs did not specifically name Windstream in its report. Instead, the firm spelled out the parameters of this malware attack and the effects it had, and Goodin makes the case that it's a pretty darn good match for what seems to have happened over at Windstream, and I would agree with that. Now, I would argue that the most concerning elements of this report are the things we still do not know. We do not know who carried out the attack. We do not know why they did this attack. We also don't know how the attacker was able to get initial access to these routers in the first place. They used a specific kind of malware in order to overwrite the firmware on the routers; we know that, like we know what kind of malware they used, but how they got that entry point into the routers in the first place, that is still an unknown variable. It could be that there's a vulnerability in these routers that the cybersecurity community doesn't know about yet but the attacker does, or it could be something else. We also don't know if the attack was, you know, backed by a nation state. Was this a state-backed hacking attack?
No clue. And because we know so little about the actual attack, like who carried it out and how it was done, we don't have a lot of good advice on how to avoid this kind of thing in the future and how to protect ourselves against future attacks, apart from, you know, the general words of wisdom: when you get a router or a modem or whatever, change the default password to something that only you know and that's a strong password, or reboot your router on occasion to try and protect yourself against attacks, that kind of thing. Which, I mean, those are good rules to follow, but it's not very reassuring, like it's not specific to this particular case.

Blake Montgomery reports in The Guardian that US authorities shut down a botnet, and not just any botnet, but the world's largest botnet ever. So a botnet, for those of you unfamiliar with the term, is a network of compromised computers. I mean, the name kind of gives it away, a network of bots. So typically a hacker uses malware or phishing attacks in order to establish some kind of backdoor access to a network of computers. These are computers belonging to people and companies and organizations that the hacker is then able to take at least some partial control over. And then typically the hacker puts these computers to work to do something. This can include anything from using this network of zombie computers (zombie computers, that's another term for a botnet, a zombie army) to blast some web server with Internet traffic in an attempt to overwhelm it, which is a distributed denial of service attack, to putting this network of computers to work in the cryptocurrency mines. But in this particular case, this compromised network was used to do several different things, and the big one was an alleged COVID insurance fraud scam that amounted to around six billion dollars in fraud.
The takedown operation, which was code-named Endgame because I guess cybersecurity folks like to feel cool and presumably really enjoy the Avengers movies, relied upon the joint cooperation of authorities in the United States, the United Kingdom, Ukraine, the Netherlands, Denmark, Germany, and France. The US Department of Justice arrested YunHe Wang, who is a Chinese national, and accused Wang of essentially spearheading the botnet operations. Wang did not do it on his own, but was allegedly a large part of this, and if Wang is in fact found guilty, he could face up to sixty-five years in prison. He's thirty-five years old now, so that's a big old oof.

The United States National Security Agency, or NSA, says it's a good idea for smartphone owners to completely power down their devices at least once a week. The agency says that doing this can help mitigate issues like spear phishing, but it's not a guarantee that you'll be free and clear of all risks. It just helps. So essentially, just turning your device off and on again on a regular basis should be considered a best practice, and the NSA should know, because they are experts at spying on people. If you'd like more information on that, look up stories about PRISM or MAINWAY, that kind of thing. But to be less cheeky, I agree that regularly doing a full power down and then power up of your device is probably a good idea for lots of different reasons, not least of which is that it could provide an extra bit of security against threats.

All right, we've got a lot more tech news stories to cover, but before we get to that, let's take a quick break to thank our sponsors.

We're back, and now for a couple of stories about using technology to spread propaganda and misinformation. First up, Meta says that it identified and subsequently removed six influence campaigns. This is coming from an article I saw on The Verge by Nick Barclay.
Meta says that a couple of these campaigns were using AI in an attempt to push certain political viewpoints and to make it seem as if that particular point of view had a larger amount of support than it really did. Meta disclosed that the campaigns originated out of places such as Croatia, China, Bangladesh, Israel, and Iran. Apparently, the Israeli campaign made use of AI to create comments to try and boost engagement and the spread of messaging, and the Chinese campaign allegedly used AI to generate images. Now, according to Meta, these attempts weren't particularly sophisticated or hard to identify. But obviously folks expect that AI will get better at creating this kind of stuff, that people will not be able to detect it as readily, and that it'll be easier for the stuff to pass a casual glance. And considering the discourse on some social platforms, I expect it is not going to take a whole lot of work to craft something that fits right in, because goodness knows, I've seen some garbage on social networks.

On a similar note, The New York Times reports that OpenAI has said it identified five online campaigns that were making use of AI to boost messaging. These campaigns originated in places like Russia, China, Iran, and Israel. There's no word on whether or not any of these are the same ones that Meta mentioned this week. The Register had a rather snarky article about this that talks about how these campaigns were relatively low stakes because they hadn't seen much penetration. They were largely, you know, unseen by actual human people. Instead, they mainly consisted of bots posting stuff that other bots had created, or maybe the same stuff that those same bots had created. At any rate, it sounds like the actual impact of these campaigns was minimal. And again, some of that has to do with the fact that the efforts at using AI are not terribly sophisticated yet.
But I do stress the word yet, because there's every reason to expect these attacks will get more sophisticated over time, and the real concern is whether OpenAI will be as effective at detecting and disrupting such campaigns when they inevitably surface.

And now, in the AI Is Coming for Creatives category, I submit for your approval a story written by Winston Cho for The Hollywood Reporter about a company called Fable Studio. This company is launching an AI-powered platform called Showrunner, which the studio claims will be able to create AI-generated television series. So it sounds to me like the idea is you give the AI some guidelines on what you want, and then the AI creates an animated, voice-acted episode that consists of scenes based on your prompts. So imagine that you're sitting there and you're thinking, man, I really wish they hadn't canceled Firefly. And then imagine you're thinking, hey, wait a minute, I can create new episodes of Firefly using this tool, and Wash lives in my version. Spoiler alert if you haven't seen Serenity. So the actors might not look quite right, because again, the tool can only make animated characters at the moment, it can't do AI video generation, and they might not sound right either. And sure, it probably won't come across like a real Firefly episode and sound like something that Joss Whedon wrote, but you could technically do it. If you're also thinking, hey, this kind of sounds like the sort of stuff that the Writers Guild of America and the Screen Actors Guild were really worried about, you would be right on the money. Fable Studio is launching a closed beta test of the platform in the near future that will likely last the rest of this year before it is able to launch the service for reals. I will not be joining the waitlist for this test. I have serious ethical objections to AI-generated entertainment, and they are far too numerous to get into here.
Now, I say that, but our next story actually goes into one of the big reasons why. Jesus Diaz of Fast Company reports that Instagram is training AI models using user data on the platform, and worse, most users have no way to opt out of it. So if you're an artist of any type and you use Instagram to showcase your work, whether that's dance or visual arts or photography, whatever it might be, your work is being used to train up Meta's generative AI models. The only people who even have the option to opt out of this are citizens of the European Union, where the rules of the General Data Protection Regulation, or GDPR, provide some protection. But as Diaz reports, Meta has taken some rather extraordinary steps to obfuscate the option to opt out.

First up is the initial message alerting users in the EU to the practice in the first place. There's this big old blue Close button, and if you hit Close, essentially that serves as an "I'm cool with this." You know, it's essentially sending the opt-in message, so your opt-out option is gone. Within the message itself is a phrase that says, quote, this means you have the right to object to how your information is used for these purposes, end quote, and the "right to object" phrase within that is a link to the actual opt-out feature. Now, as you might imagine, this is much smaller than the blue Close button. And if you do click the "right to object" phrase, it takes you to a rather intimidating-looking form that I would argue appears to be designed to discourage users from taking the time to opt out. That is my opinion. I am just saying my opinion is this was a calculated move to discourage people from opting out. And what's more, Diaz rightly points out that GDPR makes it illegal for Meta to deny anyone their request to opt out of data capture and usage practices. You don't have to give a reason, you don't have to justify it. You just have to say I opt out, and that's it.
So Meta has made this more complicated than it needs to be. The form makes it seem like you have to make a case to opt out and then Meta has the right to deny your request. They do not have that right. So if you do live in the EU and you want to opt out of this and you see that message pop up, I suggest that when Meta asks you to explain why you want to opt out, you write in something like, GDPR says I don't have to give you a reason, you jerk face, or something to that effect. I feel like Meta is really playing it fast and loose in the EU with this approach, as I believe certain regulators, and I'm thinking specifically of ones who happen to live in Ireland, might argue that the UX design Meta has employed is purposefully attempting to trick users into opting in without necessarily wanting to. And if I had to lay money on it, I would say that they're going to face some lawsuits about this in the future.

The US Federal Aviation Administration, or FAA, has given Amazon clearance to operate delivery drones outside of the direct view of a ground spotter. Previously, the FAA required Amazon to employ ground spotters to make sure that drones weren't putting people and property at risk while zooming around delivering, you know, socks and Taylor Swift albums and that kind of thing. Without that requirement, Amazon will now have the chance to expand operations beyond a few test markets and potentially make drone delivery a viable means of getting packages to more customers. Now, this doesn't necessarily mean the air is going to be buzzing with drones in the near future, because the company has made some staffing cuts to the Prime Air division in the recent past, and Amazon announced just a few weeks ago that it would be ending drone operations in California entirely.
So it might be a while before you start seeing these suckers dropping off impulse purchases in your neck of the woods, but a major regulatory hurdle is now out of the way.

Today marks the last day of employment for Twitch's current Safety Advisory Council. This group of nine folks, which included industry experts and streamers, was responsible for advising Twitch on how to improve safety measures on the platform and to build trust among the community of creators and users alike. They were alerted at the beginning of this month that their services would no longer be required at the end of May. Instead, Twitch plans to create a new group consisting solely of Twitch ambassadors. As Hayden Field of CNBC puts it, the language around this decision is aligned with the general corporate speak that typically boils down to we're cutting costs and safety is a real hassle. That's me paraphrasing, by the way. Field is far more professional and responsible than I am. Field also points out that in twenty twenty-three, Twitch sacked around fifty folks on its trust and safety team, so this move seems to be in alignment with that one from last year. Considering the numerous stories that have come out about how important security is and how risks and threats are growing each year, partly due to the use of AI, this to me seems like a short-term decision that could potentially have disastrous long-term consequences. But then, Twitch has also made several policy changes over the last couple of years that have really blown up in the company's proverbial face, so maybe this is just the platform saying the council hasn't been a good fit.

In Engadget, Will Shanklin reports that Spotify, after some resistance, has agreed to issue refunds to folks who purchased a Car Thing. That's the actual name for the product, the Car Thing. Spotify launched this a couple of years ago. It's a device that attaches to your car's entertainment system, and it provides streaming media from Spotify to your vehicle.
But the company announced last week that it was going to end support for the devices on December ninth of this year, at which point all those Car Things will become useless things, because they'll be bricked. Those puppies cost ninety bucks a pop, and since the service is only a couple of years old, that cheesed a lot of people off. Reportedly, Spotify wasn't going to offer refunds at first, but the company subsequently did an about-face, and did so just before a class action lawsuit rolled in accusing the company of unfair business practices. So whether the change of heart was in anticipation of that lawsuit, or Spotify independently arrived at the conclusion that maybe it was a bad idea to ignore customer complaints, I don't know. But if you bought one, you can reach out to customer service for a refund. You do have to provide proof of purchase, however.

And finally, it's the end of an era. ICQ, the venerable instant messenger service, will shuffle off this mortal coil on my birthday. That doesn't mean anything to you, but on June twenty-sixth, the Russian company VK, which is where ICQ ultimately landed, is going to shut down the service, so anyone still using ICQ will have to shift to some other instant messenger client. Now, if you've never used the service, it was a bit peculiar. You didn't get to choose a handle or use your name or anything like that. Instead, the service would assign you a number, you know, nice and personal-like, and it worked kind of like a phone number, and you could initiate chat sessions with other users. I used it a lot back in my younger days, though honestly, I can't remember the last time I popped on. It's likely been at least two decades at this point. Honestly, if you had told me that ICQ would outlive AOL Instant Messenger, which shut down at the end of twenty seventeen, I would have thought you were bonkers. But that's how it turned out. Well, you had a good run, ICQ.
I'll give you one final "uh oh." That's the sound that would play when you got a message on ICQ.

That's it for this episode and the news for the week ending on May thirty-first, twenty twenty-four. I hope that you are all well, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.