Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on February ninth, twenty twenty four.

Speaker 1: The Guardian reports that a Chinese-backed hacker group going by the name Volt Typhoon has spent at least the last five years infiltrating various critical computer systems here in the United States, in our infrastructure, and they have compromised systems connected to transportation, shipping, and utilities like water and sewage. The US National Security Agency says that this hacker group has created footholds within these systems, so they didn't just infiltrate and spy on stuff and get out. They made sort of a cozy little nest for themselves that they could easily return to. Now, to me, this sounds awfully similar to stories that I've been hearing for many years now. In fact, I even did a quick search on Google, and I just picked the year twenty fourteen at random. It was actually my first choice.
Speaker 1: I mean, it's ten years ago. I thought, let's see if there have been any stories in twenty fourteen about Chinese hackers compromising critical systems within US infrastructure. And sure enough, the top result was from CNN, an article with the headline "The US government thinks China could take down the power grid," because, just as we're hearing now, Chinese hackers had managed to infiltrate these various computer systems and to kind of leave for themselves the capacity to return to them, so that maybe in the future, in an act of cyber warfare, they could sabotage the system. So on one hand, this news story is alarming, but it's not actually new, right? Because we've had issues with Chinese-backed hackers compromising systems here in the United States for ages. But on the other hand, it's still a very alarming story to read about. I would argue it might be more alarming to me that we freaking knew about this problem a decade ago and seem to have learned nothing since then. That, to me, is really alarming.
Speaker 1: It's not just that Chinese hackers did this, it's that we saw essentially the same story play out a decade ago and yet haven't taken the proper steps to protect ourselves against future attacks. So The Guardian points out that the big concerning element here, besides the fact that Chinese hackers have infiltrated all these different computer systems, is that it does not look like an act of espionage at all. This isn't a case where Chinese hackers are trying to get secrets and then use those back in China. This appears to be preparation for sabotage and cyber warfare. So really, this is preparation for an attack, should that ever come to pass. Now, it may be that an attack never would come to pass, but it's more that the Chinese hackers were setting things up so that, in the event there were a cyber attack to be conducted by China against the United States, they would have already laid the groundwork. So yeah, it's one thing to steal secrets, but it's another thing to shut stuff off on a massive scale. I mean, you can think of how disruptive that would be.
Speaker 1: So presumably the affected targets are going to seek ways to expel the hackers and harden security against future intrusions. But then, you would have figured that would have been the case a decade ago, and yet it's still happening. I guess you can argue that no security system is perfect, that there are always going to be vulnerabilities, whether that's in the code or, more likely, due to someone failing to practice good security measures, and that you'll never have a perfect system. So you could argue that, but at the same time, when you find out that it's this apparently widespread, it raises some very serious questions.

Speaker 1: Anyway, now let's move on to our traditional glut of AI-related news. First up, The Verge reports that Google has dropped the names Bard and Duet from its various AI products. So the new Google AI strategy is called Gemini, or, if you're an astronaut from the nineteen sixties, maybe "Geminy." But Google is streamlining its approach.
Speaker 1: So previously, its AI-powered chatbot had the name Bard, which was a nod to William Shakespeare, a nod that I appreciated, and it had some AI-enhanced components in the Google Workspace product that had the name Duet. But now all of that is under the name Gemini. On top of that, Google is going to release a new, larger-than-ever large language model to the public soon, the Gemini language model. Android users will get a chance to test out Gemini in all its glory. It looks like a lot of different apps are going to get a Gemini upgrade. iOS will be a little bit different, but if you're an Android owner, you'll have access to this pretty soon. And it looks like Gemini may quietly take the place of the humble Google Assistant over time. It might be a long, phased kind of process, but that's what it looks like. So will Gemini be able to challenge OpenAI's domination with ChatGPT and the GPT language model? Will I even notice that it's happened, or by that time will I be hiding out in an off-the-grid shack in the woods?
Speaker 1: Find out next week, same bat-time, same bat-channel.

Speaker 1: Al Jazeera reports that Sam Altman, the CEO of OpenAI, is looking for cash, like a whole buttload of it. So, however much you're thinking, it's more than that. According to the news site, Altman's goal is to raise quote "trillions of dollars from investors, including the United Arab Emirates government, to boost the world's capacity to produce advanced chips and power artificial intelligence" end quote. And further in the article, they specify that they're talking around seven trillion dollars. Good gravy, that's a lot of money. That goes beyond a princely sum. Princes don't have access to seven trillion dollars. Now, we know that AI requires a lot of computational power to work, and OpenAI's business is completely dependent upon customers demanding more AI capabilities. So in order to deliver those capabilities, you've got to have the hardware that can run the AI software on top of it. So it makes sense that OpenAI needs all these investments to go into making the hardware that powers everything.
Speaker 1: So this money would ultimately go toward building more fabrication facilities, and those facilities would be run by established semiconductor fabrication companies, not OpenAI. So it's not that OpenAI would get into the semiconductor business. They're saying, we need this money raised so that we can build these sorts of fabrication plants all around the world and have these established companies handle them, so that we have the equipment we need in order to deliver the AI experiences that we want to. So the argument is that we're not funneling the money to OpenAI itself. I'm not exactly surprised by all this news, but it does make me wonder if we're going to actually see AI replace cryptocurrency mining as the computational application that just demands an insane amount of power. Right? Is it possible that in a year or two we'll be talking about how AI as an industry requires more power than most countries do for a full year, the same way we talk about cryptocurrency? We'll actually circle back to talk about cryptocurrency and power demands a little bit later in this episode. But yeah, I just wonder about that.
Speaker 1: This week, Clint Watts, the general manager of Microsoft's Threat Analysis Center, published a blog post that made me take notice. He was focusing on how Iran's strategy regarding Israel is leaning more and more on cyber operations, and one of those cyber ops apparently involved using an AI-generated news anchor to deliver fake news to audiences in different parts of the world. So apparently this happened late last year, in December. An Iran-aligned hacker group called Cotton Sandstorm interrupted some streaming TV services in different countries and inserted this AI-generated news program with an AI-generated news anchor, and quote "the disruption reached audiences in the UAE, UK and Canada" end quote. The Guardian further reported that the AI anchor presented quote "unverified images that claimed to show Palestinians injured and killed from Israeli military operations in Gaza" end quote. Watts expressed concern about how this instance is an indicator of what we should expect moving forward, particularly in really eventful years like an election year, which, hey, here in the United States we happen to be in an election year.
Speaker 1: So keep an eye out for those robots on your screens and on your phones and really everywhere. Why, that shack in the middle of the woods is looking better and better. All right, you know what, we're going to take a quick break to thank our sponsors. I'm going to gather myself and try to get rid of this sort of Luddite tendency that's overtaking me, and we'll be back right after this.

Speaker 1: We're back. So, the US Centers for Medicare and Medicaid Services sent out a memo to all insurers that offer Medicare Advantage with a very clear message: those insurers are not allowed to use AI to determine if someone on a Medicare Advantage plan merits coverage or should be denied coverage. This is really good news, because we've already seen some insurance companies do that very thing, use AI to determine if they should grant or deny coverage to a patient. In fact, right now there are a couple of massive lawsuits against Humana and UnitedHealth about that very thing.
Speaker 1: Patients claim that these companies used an AI tool to decide whether or not those patients should receive coverage for various medical processes, procedures, and prescriptions, typically things like whether or not they should be allowed to stay in a medical facility or if they should be told to leave, even if it's before the doctors said that they should leave, because otherwise insurance was going to cut off the support and they'd have to cover everything out of pocket. So, according to the claim, many of those decisions that were made for these patients via AI were wrong. The AI reportedly had a dismal accuracy rating, but the insurance companies were still depending upon it, and the argument is that the companies were following an incentive to use this AI tool because, by denying claims, they didn't have to pay out to the insured. So yeah, the tool's broken, but the tool also says I get to keep more of my money if I listen to it, so I'm listening to the tool, in other words.

Speaker 1: Anyway, now the CMS is telling insurers that while they can use AI-assisted tools to do something like predict a person's length of stay in a medical facility, any decisions on actual coverage have to be made solely by taking that individual patient's circumstances into account, which means you can't just use aggregated data from a larger population and then make a determination for that patient. That's good news for people who are part of these programs. To learn more about this, I recommend Beth Mole's article in Ars Technica. It's titled "AI cannot be used to deny health coverage, feds clarify to insurers." It's a good read.

Speaker 1: So a bit earlier, I mentioned that AI is placing an increasingly large demand on resources, similar to what we're seeing in crypto mining.
Speaker 1: Well, according to Tom's Hardware, which is a great website by the way, the US Energy Information Administration released an analysis that says one hundred and thirty seven crypto mining operations in the US account for two point three percent of the country's overall power demand, which is astounding, right? You're talking about fewer than one hundred and fifty organizations that require that much of the power the nation generates. We've heard in the past about how crypto mining, specifically proof-of-work systems like Bitcoin, has created this sort of runaway demand on power, but boy howdy, I mean, this is just a lot. Anyway, the implication here is that the US government is starting to get a little concerned about how much power these organizations demand, and that maybe we should expect US agencies to take a bit of a closer look at crypto mining, perhaps with regard to stuff like, I don't know, environmental impact, and this in turn could eventually have a big effect on the crypto space, at least here in the United States.
Speaker 1: Like, it's possible that we could see some regulations and restrictions that really hamper crypto mining efforts in the US, which would just open them up in other nations, obviously. The article in Tom's Hardware includes a map that shows where the administration says these organizations are located. I was surprised to see that a whole bunch of them are in my home state of Georgia. Fun times.

Speaker 1: Microsoft and Activision Blizzard completed their acquisition deal last year, where the two companies merged, but that doesn't mean the US Federal Trade Commission is real happy about it. You might remember the FTC opposed that merger, and their concerns were largely dismissed in various courts.
Speaker 1: But the FTC is still interested in perhaps reversing that deal, and Microsoft is not actually making it easy on themselves to avoid that, because one thing Microsoft agreed to when it was trying to convince the US government that it should be allowed to acquire Activision Blizzard was that the company wasn't going to hold massive layoffs in the wake of these two companies coming together, because the intention was Activision Blizzard would still operate kind of as an independent entity; it would not be fully integrated into Microsoft. But after the merger happened, Microsoft laid off nearly two thousand employees, I think nineteen hundred or so, and we don't know exactly how many of those were Activision Blizzard employees, but it could have been as many as half of them, and the FTC says that this could be a violation of the agreement Microsoft presented when they were first trying to acquire Activision Blizzard.
Speaker 1: Furthermore, Phil Spencer, the CEO of the gaming division at Microsoft, said that the companies have quote "set priorities, identified areas of overlap, and ensured that we're all aligned on the best opportunities for growth" end quote. But then the FTC says, well, why are you worried about overlap? Because, again, you had said that the intention was Activision Blizzard and Microsoft Gaming would more or less operate independently, that they would be their own things. So overlap shouldn't be an issue, because you said you weren't going to combine these two, and yet now it sounds like you are combining them and creating a more integrated division between the two. In fact, you had said that you were going to keep them separate so that you could potentially divest some or all of Activision Blizzard if you wanted to. But now it sounds like it's all getting mixed up together. So what's the deal here?
Speaker 1: So the FTC is now asking courts to force Microsoft and Activision Blizzard to temporarily pause the merger process, like put a pause on any more restructuring or anything like that, so that the FTC can actually litigate the merger, with the potential ultimate goal of undoing the deal. So could we see one of the biggest mergers in gaming hit the rewind button? It's possible. It is not likely, but it is possible.

Speaker 1: Sony has decided to sunset the anime streaming service Funimation, merging it with Crunchyroll, which Sony acquired in twenty twenty one. However, this also means that some properties aren't necessarily going to make the jump, so Sony's going to wipe those films and series off the digital libraries. But worse than that, and it's already bad, but worse than that, Funimation offered customers the chance to buy physical media versions of some of the anime properties, and they would include a digital code so that the customer could access a streaming version of that through their Funimation account. So they had the physical media, but they also had a code for a digital streaming version.
Speaker 1: But this part of their service is not going to make the transition to Crunchyroll. So those digital codes aren't going to work anymore for folks who purchased the physical media but who subsequently lost access to that physical media for whatever reason. Like, maybe they bought it but then realized they didn't have the space to hold this kind of stuff, so maybe they traded in the physical versions but kept the digital code. Well, now their digital copy is going to be inaccessible, and this stings quite a bit. Funimation at one point claimed that customers would have access to these digital streaming versions quote unquote "forever." So I guess forever ain't what it used to be. But it turns out that the terms of service indicated that customers never actually owned the digital streaming copy. This is pretty typical for streaming services. You don't own something; even if you buy the title, like on Amazon Prime or something, you don't actually own that copy. You actually own access to that copy, and that access can at some time be revoked. It could go bye-bye.
Speaker 1: This is one of the many reasons I have found myself going back to buying physical media for some shows and films, because at least then I will still have access to that media, assuming I still have a machine that can play whatever the format is.

Speaker 1: Finally, the last story I want to cover is one that has gone back and forth and up and down, kind of like brushing your teeth. And hey, it involves electric toothbrushes. All right, so this story has some weird elements to it. A Swiss German newspaper called Aargauer Zeitung, and I know I butchered the pronunciation, published an article that apparently claimed that hackers had managed to compromise around three million smart toothbrushes. And you might be thinking, what the heck do you do with a compromised smart toothbrush? Make it brush someone's teeth badly? But no, you can use those compromised toothbrushes as a kind of botnet to send a distributed denial of service attack at a target. So each of those toothbrushes can start pinging a target server in an effort to overwhelm it and shut it down.
Speaker 1: And because the security on these Internet-connected toothbrushes was really poor, it was not hard to accumulate a big old army of them. This is one of the major concerns about the Internet of Things. You know, sometimes those Internet-connected sensors can really be a threat if the company that makes them doesn't implement good security measures. Anyway, some outlets reported on this, and then the American branch of the cybersecurity company called Fortinet ended up saying, oh, the original article was quoting a Swiss branch of Fortinet, but there was a problem with translation, and in actuality, what the Swiss cybersecurity experts were saying was, this is a hypothetical example, not something that actually happened, so you should really correct the article. So then several outlets reported, oh, wasn't it funny, this mistranslation made people think that these electric toothbrushes were being used in attacks. But then the newspaper Aargauer Zeitung said no, no, no, no, no, this really did happen, because we asked about it.
Speaker 1: The Fortinet branch in Switzerland indicated this was an actual case, not just a hypothetical example, and when we submitted our article to them for their review, they approved it. So clearly this isn't just a mistranslation. So is your toothbrush out to get you? I don't know, but you should still brush your teeth in any case. Just, you know, maybe go with a toothbrush that doesn't connect to your home network, just to be on the safe side.

Speaker 1: All right, that's it for the news for the week ending on February ninth, twenty twenty four. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.