Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? It's time for the tech news for Thursday, July thirteenth, twenty twenty three. A lot of bummer news today, which I wish weren't the case, but that's kind of how the AI cookie crumbles.

Before we get into all that, one story that I missed on Tuesday was an update to Microsoft's quest to acquire video game giant Activision Blizzard. So, quick recap: Microsoft announced its plan to acquire Activision Blizzard, which was going through a really rough PR patch, to put it lightly, way back in early twenty twenty two. Since then, these companies have encountered resistance among various regulators around the world, primarily in the United States and the United Kingdom. While some of those knots have been loosened, at least one that was really looming was the US Federal Trade Commission, or FTC, and its request for an injunction to block the acquisition. Now, to be clear, this move would have been like a temporary hold on the deal while the FTC mounted a more substantial legal challenge. But on Tuesday, a federal judge decided against that request for an injunction, and the deal would be allowed to move forward. Microsoft wants to close this deal by July eighteenth, so it's coming up right soon. The FTC is now appealing to the Ninth Circuit Court of Appeals. Unsurprisingly, reps for Microsoft and Activision Blizzard say the deal will actually be really good for consumers as well as for competition. I would humbly suggest that consolidation is rarely good for competition. Still, the federal judge says that the FTC failed to show it has adequate justification to object to the deal in the first place, and that the FTC's arguments aren't really legally convincing.
Now, I've not read all these arguments, but I can imagine that the FTC is having a tough time arguing its point, because it's pretty difficult to show how a deal is going to have a negative impact on competition before the deal happens. We usually only recognize these things in hindsight. Either way, the announcement means we are a little bit closer to Activision Blizzard joining the Microsoft family.

Now, we've got a lot to talk about with AI today, and first up, I figure I should mention the ongoing labor strikes in Hollywood. For a few months now, the Writers Guild of America has been on strike, and today SAG-AFTRA, which is an actors' union, is poised to join them after its negotiating committee failed to reach an agreement with movie studio producers and came to a unanimous vote to support a strike. There are a lot of different issues at play here, but one of them is the role of AI in Hollywood. Understandably, there are concerns that studios could choose to lean on AI rather than, say, a human writing staff to create entertainment. Writers are worried that they're going to be called in to punch up an AI-generated script and then receive a lower payout because they didn't actually write the first draft; they were just hired to punch up an already existing script. That could end up being a lot of work, because AI is not necessarily great at writing such scripts. Meanwhile, actors are worried about their likenesses and their voices as AI gets increasingly better at creating digital duplicates, and how that impacts their ability to make a living. Keep in mind that most of the people in Hollywood aren't your mega celebrities with multi-million-dollar houses in Beverly Hills. Most of them are working professionals. You know, they're going gig to gig, making what they can and, you know, saving and all that kind of stuff. They're not all millionaires.
Based on an article in Deadline that seems to confirm every stereotype of producers being greedy and manipulative jerkfaces, the studios are eager for any option that maximizes profit and shareholder value. That article in Deadline essentially said that the producers' strategy is to wait until October before coming back to the negotiating table with the WGA, the writers, because by that point the writers will be desperate, because they'll be, you know, having trouble paying rent and mortgages. So it's pretty much pure evil. When you read that article, it's kind of hard to come to any other conclusion, and it has really galvanized the labor movement in Hollywood right now. It's quite possible that this means we're going to be in a real dry spell as far as movies, TV, and streaming content go, at least any content that would be coming out of Hollywood, and that could last for a little while. And I think this battle is one that's needed. Of course, I am biased, because I'm still fuming that my old place of employment decided to eschew human writers in favor of AI. But y'all don't need to hear me go off on that again. If you did miss it, just listen to a couple of recent episodes where I talk about what happened at HowStuffWorks dot com.

Now, it turns out that sacking employees in favor of AI doesn't play well with the general public. CEO Suumit Shah found that out on Twitter. He posted that his company quote had to lay off ninety percent of our support team because of this AI chatbot. Tough? Yes. Necessary? Absolutely, end quote. He claimed that this AI chatbot outperformed his customer support staff, so he had no choice but to replace human beings with artificial intelligence. He said that the chatbot could resolve problems in a fraction of the time it took human workers, saying that it was a difference of minutes with AI versus hours with humans, and that this cut customer support costs by up to eighty-five percent.
Now, folks on Twitter descended like piranhas on Shah's message. Some argued that the layoffs really had more to do with a failing business model than any indication that AI is actually superior to human staff, essentially saying Shah was using AI as a shield to guard against perceptions that his company was doing poorly. Shah has not backed down from his claims. He says it might be that he's on the more aggressive side, that he's talking about and taking action on something other business leaders are perhaps reluctant to do, but that this is where we're all headed. Fun times, because I don't see how that ends up being a win in the long run.

The Clarkson Law Firm has proposed a class action lawsuit against Alphabet, Google, and DeepMind. Alphabet is the parent company of those other two. The law firm accuses the companies of rifling through, well, pretty much everything that's ever been online, all for the purposes of training AI language models. The lawsuit argues that Google, without permission, used personal, professional, copyrighted, and private works, including stuff like private email, all to train AI; that this represents a huge breach of trust and privacy; and that those affected, which is pretty much anyone who has ever had any of their information posted online, ever, deserve justice. Google's own privacy policy, which was updated on July first, says it may collect data that is quote publicly available online end quote. But the law firm argues this is drastic overreach, that just because something is publicly available doesn't mean it's free to use for whatever purpose. And that seems reasonable to me. According to one of the plaintiffs in the lawsuit, there is evidence that Google Bard was trained on copyrighted works, including this person's works, and if you use the right prompts with Google Bard, you can essentially get those works presented back to you.
So, like, you could ask, what's the beginning of chapter three of such and such, and it would recite it to you. And that means there could be a workaround for people who want to access a writer's work but, you know, don't want to have to pay for it. And I get it. The Internet has trained us to expect stuff for free. We've been kind of conditioned to expect that. However, like I said at the beginning of this episode, if writers can't make a living writing, then writing kind of stops. So the approach ultimately becomes self-destructive.

And we're not done with AI yet. Business Insider has an article about how folks who created applications and processes built on top of GPT-4, the most recent build of OpenAI's large language model, have started to run into some issues recently. The article references users on Twitter who have said that the model has become less useful, describing it as dumber or lazier than it had been previously. Problems include making careless mistakes, which makes me think back to the days when I was taking math courses. Sometimes I would forget something simple like a negative sign, and so I would get an entire problem wrong, even though, you know, step by step I was doing things correctly. Because I was careless or lacked attention to detail, I got the wrong answer, and, you know, that was a valuable lesson in paying more attention. Well, apparently GPT-4 has some of those same issues, where it will make careless mistakes like dropping a bracket, which in code can lead to a massive bug or failure in the program. Some have also said the tool is less likely to quote unquote remember between prompts. That means it gets harder to use GPT-4 to build stuff, because at some point or another in the building process it forgets what it was doing.
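A quick aside on that dropped-bracket point: a single missing character is enough to make an entire file unusable, so one cheap guard, if you're building on top of a model like this, is to check that generated code at least parses before you run it. Here's a minimal Python sketch of that idea; the example snippets and the syntax_ok helper are my own illustration, not anything from the Business Insider piece.

```python
import ast

# Two versions of the same tiny function. The second is what you get when a
# single closing bracket is dropped, the kind of careless slip described above.
good_code = "def total(prices):\n    return sum([p * 1.07 for p in prices])\n"
bad_code = "def total(prices):\n    return sum([p * 1.07 for p in prices)\n"

def syntax_ok(source: str) -> bool:
    """Return True if the source at least parses as valid Python."""
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False

print(syntax_ok(good_code))  # True
print(syntax_ok(bad_code))   # False: one missing ']' makes the whole file unusable
```

A parse check like this obviously won't catch logic errors, but it does catch exactly the "dropped a bracket" class of mistake before it turns into a confusing failure downstream.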
The trade-off on all this is that GPT-4 is now faster than it had been at launch. So the developer community is growing to suspect that OpenAI made some changes that speed things up, but at the cost of accuracy and performance. OpenAI so far has declined to speak about the perceived problems.

All right, we've got a lot more stories to cover. Before we get to that, let's take a quick break.

We're back, and I think I can wrap this up in one more segment of stories. So, starting off: yesterday, US congressional representatives released a report accusing three major tax preparation companies, those being H&R Block, TaxAct, and TaxSlayer, of sharing sensitive private customer data with Meta, you know, Facebook's owner, and said that Meta was using this information to, what else, improve its targeted advertising strategy, and that these actions weren't taken with the knowledge or consent of customers. And here's where I remind folks outside the United States that over here in the US, tax preparation can get really complicated, and for a lot of people it falls way outside their comfort zone, so tons of folks depend upon third parties to prepare their taxes. I understand this concept is strange in some other countries, where a lot of that work is already done by the government on behalf of its citizens. I'm pretty sure that wouldn't fly here in the States, because too many people here have a deep distrust of the government, particularly when it comes to handling something like your personal finances. I can't entirely blame them for that, by the way; our government has done a lot of boneheaded stuff. Anyway, according to the report, these companies shared information like income amounts, how much people were expecting in refunds, how much tax they owed the government, the names of their dependents, and more. In addition, one of those companies, TaxAct, was also accused of having shared similar information with Google.
The congressional representatives are calling for a full investigation into the matter. There's been some talk that this information was packaged in such a way that it wasn't quite as personally identifiable, but I think that's all rubbish, because we have seen time and again that it takes very few points of data to narrow in and figure out who that data belongs to. You only need a few data points to weed out the vast majority of folks and zoom in on the people it could possibly be, which means you could quote unquote anonymize data all you like and still make it possible to figure out who that data belongs to. Anyway, once again I feel like I need to go the broken record route and call for comprehensive data security and privacy laws here in the US, because of course companies are going to engage in this type of behavior if there's even just a little ambiguity over whether or not it's legal. It needs to be spelled out in black and white what is and is not legal, and what the consequences are for breaking the law. Before that happens, we're going to see stuff like this over and over again.
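To put some rough numbers behind that re-identification point, here is a minimal back-of-the-envelope sketch in Python. The population figure and the identifier counts are my own illustrative assumptions, not figures from the congressional report or the episode; the point is simply that a handful of ordinary fields already defines far more combinations than there are people.

```python
# Rough sketch of why "anonymized" records with a few quasi-identifiers
# are usually re-identifiable. All numbers here are illustrative assumptions.

us_population = 330_000_000

zip_codes = 40_000      # roughly how many ZIP codes exist in the US
birth_dates = 365 * 80  # birth day plus year over an ~80-year range
sexes = 2

combinations = zip_codes * birth_dates * sexes
people_per_combination = us_population / combinations

print(f"distinct (ZIP, birth date, sex) combinations: {combinations:,}")
print(f"average people sharing any one combination: {people_per_combination:.3f}")

# With roughly 2.3 billion possible combinations and only 330 million people,
# the average "bucket" holds far fewer than one person, so a record carrying
# just these three fields usually points to a single individual.
```

That is the same intuition behind Latanya Sweeney's well-known finding that ZIP code, birth date, and sex alone are enough to uniquely identify most Americans, and a tax record with income, refund amount, and dependents attached narrows things down even further.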
Microsoft revealed that Chinese hackers were able to access Microsoft email accounts belonging to two dozen government agencies in a massive data breach. The agencies include both US and European departments. The attackers managed to gain access earlier this year, in May at the latest, but maybe even earlier than that, and Microsoft was unaware of the intrusion until mid-June. According to security experts, the attacks were extremely sophisticated and obviously precisely targeted. The vulnerability appears to have been in Microsoft's cloud services, which highlights one of the big challenges with cloud computing. Organizations end up having to hand over at least partial control to a third party and trust that the third party is going to protect the systems and information nested in the cloud on behalf of the customer. The alternative is to keep all your systems on premises, but as systems grow more sophisticated and complex, and as organizations become more distributed, the physical requirements of sticking strictly with on-prem become untenable. So there is a real need for things like cloud computing. But it also shows that cloud computing requires massive investments in security, which Microsoft has made. It's just that this was a case where bad actors were able to exploit a vulnerability, and it went unnoticed for at least two months.

Amazon has filed a challenge to the EU's Digital Services Act, or DSA, which is scheduled to become a fully operational act, Death Star style, by August twenty fifth. This Act is part of the EU's efforts to create rules, restrictions, and regulations that apply to big tech companies. If the EU determines a tech company counts as a quote unquote very large online platform, or VLOP, then that company has to meet certain obligations in order to comply with the new regulations. Among those criteria is the requirement to police disinformation and hate speech on the platform. Amazon's not super keen on being held accountable for doing that and wants to dispute its classification as a VLOP. Now, considering Amazon is a leading cloud computing company, it seems like a pretty steep hill to climb to say we're not a VLOP. But then Amazon's reps have a few salient points to make. Amazon is primarily known as an online retail company, they say; we're not a search engine, at least not in the traditional sense, and we're not a social network, even though there are some social network functionalities built into Amazon, such as user reviews.
And further, Amazon argues that large retailers in the EU are not being classified as VLOPs, so why is Amazon being held to a standard when other large retailers in the EU are not? The reps say that while Amazon agrees the company bears responsibility for making sure the products it offers on its marketplace are legal, it should not fall into the classification of a very large online platform, because it is fundamentally different from platforms like Facebook or Google. EU reps haven't directly addressed Amazon's statements, but they have indicated that all VLOPs will need to take steps to comply with the law. However, the process for doing that will be different depending upon the nature of the platform, so a marketplace will do this in a very different way than a social network, for example.

And finally, I have a recommended article for y'all today. It's from Lucas Ropek of Gizmodo. The article is titled "A Proposed Law Would Force Internet Companies to Spy on Their Users for the DEA." This really applies to people living in the United States. It all stems from a very real problem: how do you crack down on the illegal and deadly drug trade online without also creating a surveillance state? So I recommend you check out that article. It's really well written, it's extensive, and it goes into detail about the actual nature of the problem, why this is something we do need to address, and why the proposed solution could potentially be a really bad one. Again, the title of that article is "A Proposed Law Would Force Internet Companies to Spy on Their Users for the DEA," over at Gizmodo. And just full disclosure, I have no connection to Gizmodo and I do not know Lucas Ropek. I just thought this was a very good article that folks should read.

And that is it for the news for Thursday, July thirteenth, twenty twenty three. I hope you are all well, stay safe out there, and I'll talk to you again really soon.

Tech Stuff is an iHeartRadio production.
For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.