Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending August second, twenty twenty four. Now, I'm sure you all remember the CrowdStrike outage that happened back on July nineteenth and subsequently caused chaos around the world as various Windows-based machines got caught in a reboot loop. Well, we're still seeing the fallout of this classic goof-em-up, because it turns out the problem was that CrowdStrike had pushed out an update that ultimately included a relatively simple but catastrophic error: a bug. Now, CrowdStrike shareholders are angrily filing a massive class action lawsuit against the company, claiming that it failed to disclose its testing process, and the shareholders say that process was, by all measures, inadequate.
Speaker 1: The lawsuit says that CrowdStrike purposefully concealed the fact that it had not put proper safety measures in place, and as a result, this outage followed, and then the company's stock price tumbled by nearly a third over the course of two weeks. That represented around twenty five billion, with a B, dollars of market value. Yowza. And shareholders are not the only ones considering a lawsuit against the company. Delta Air Lines is also apparently doing that, claiming that the CrowdStrike outage led to Delta incurring an expense of around half a billion dollars. Another yowza. Delta, like some other airlines, had to postpone and cancel thousands of flights and put travelers up in hotels and such in the wake of this tech disaster. And in the interest of full disclosure, my partner works for Delta. She was asked to volunteer to work at the airport, a position she had not held for more than twenty years, and she did it because she's a rock star.
Speaker 1: But according to Delta CEO Ed Bastian, IT workers had to manually reset around forty thousand servers to get things back to normal for the company, and they expect to receive some form of compensation for the damages that they incurred as a result of this outage. Meta has some good news for its shareholders. The company posted impressive revenue growth thanks to its advertising business. That revenue growth was enough to mollify shareholders who continue to be skeptical of Meta's spending in the metaverse and AI spaces. Zuckerberg told shareholders that the company's spending in those areas had increased by seven percent in the second quarter of twenty twenty four, but that this was more than covered by the rise in the company's operating margin, which hit thirty eight percent. Now, I have no magic ball where I can see what shareholders are actually thinking, but I imagine more than a few of them are still in the "golly, I wish they'd stop throwing money away, but I am thankful that the digital ad business is going well" camp.
Speaker 1: I'm sure some shareholders remain excited about this metaverse concept and where AI may be going. But more and more, I'm hearing folks get a little uncertain about the viability of AI, or at least how far off we are from AI making a real difference from a revenue perspective. I think a lot of people are starting to wake up to the fact that AI does have enormous potential, but that potential is going to take time to be realized, and it isn't just a magic button where you press it and money comes out. Speaking of AI disillusionment, Bloomberg's Julia Love has an article titled "Big Tech Fails to Convince Wall Street That AI Is Paying Off," and I think that illustrates my point pretty darn well. She argues that big companies like Amazon, Alphabet, and Microsoft have all failed to prove that the massive amount of money they have collectively poured into AI research and development has any real chance of paying off in the near term. At least, they failed in the eyes of shareholders. Like, it might turn out that this isn't true, that it is going to pay off, but shareholders are skeptical.
Speaker 1: These companies saw their stock prices decline recently. So essentially, the upfront costs of AI R&D are enormous. The ongoing maintenance costs to operate AI are also skyrocketing, particularly if you're doing so at scale the way these big companies are, and the opportunities to actually generate revenue off of that work are somewhat limited right now. But again, that's not to say that's always going to be the case. It may be that down the road this will look absolutely critical. People will say, "Wow, if you had gotten in at that time, think of how much money you would make." That's a possibility. However, for now, the attitude appears to be more like, "Ah, this is taking longer than I thought it would, so I'm less excited now." It's the good old hype cycle trend. Folks get super excited about an emerging technology, but then that excitement starts to drain away as they realize that the actual capabilities of the tech aren't nearly as wild and impressive as they had imagined. This does not mean that AI is just going to fizzle out like some other stuff kind of has.
Speaker 1: Arguably, NFTs have largely fizzled out. And yes, I know NFTs technically are still a thing, but they are a shadow of what they once were, thanks to the enormous collapse NFTs had a couple of years ago. This might mean that companies will need to find some creative ways to talk about AI investments in the future if they do not want their shareholders getting antsy about the whole thing. So, yeah, it's not just the tech, it's always the money. Microsoft filed an annual report with the SEC; all publicly traded companies in the United States have to do this. And in that report, Microsoft named OpenAI as a competitor. Now that's got to sting for OpenAI. It's kind of like hearing an ex-partner describe you as being "okay." After all, Microsoft famously pumped a thirteen billion dollar investment into OpenAI. That's a heck of a present, right? I mean, technically it's not a present, it's a business investment.
Speaker 1: But you know, in one section of Microsoft's report, it reads, quote, "Our AI offerings compete with AI products from hyperscalers such as Amazon and Google, as well as products from other emerging competitors, including Anthropic, OpenAI, Meta, and other open source offerings, many of which are also current or potential partners," end quote. I love that they put OpenAI second in that list. I guess that's, you know, trying to couch things a little bit. But obviously Microsoft is not saying that it has declared war on OpenAI. It's not that kind of competitor. In fact, some suspect that this might be the people at Microsoft trying to get ahead of any regulatory issues that could follow in the future. Microsoft wouldn't be considered anti-competitive in the AI space, because, look, OpenAI is a competitor for Microsoft, not like a best friend forever. Sure, for a while there, Microsoft had secured a seat on OpenAI's board of directors. But what does that mean? They don't have it anymore. They dropped that seat. It's nothing you need to worry about, regulators. We're all just, you know, we're just, er...
Speaker 1: We're competing like crazy down here. Hey, look, there's a three-headed monkey! On a semi-related note, Windows Central reported late last week that OpenAI could be facing a pretty grim future unless it holds another round of massive funding, as the company has a projected five billion dollar loss possibly in the books this year. The company reportedly spends about eight point five billion dollars a year on operations and staffing costs, but it only posts revenue of three point five billion. So yeah, if you do the math, that means the company is spending five billion dollars more in twenty twenty four than it's bringing in, and without an influx of more cash, the company could go bankrupt. OpenAI has released updated large language models and tools this year, and that in turn has led to increased revenue for the company. But then running these updated models and tools is really expensive, so some of that kind of washes out. Could OpenAI potentially innovate its way out of business, or would its competitor Microsoft swoop in to keep the ship afloat? I don't know.
Speaker 1: I guess I'm gonna have to ask ChatGPT what it thinks. Sticking with Microsoft and AI for just a bit longer, 404 Media has a piece titled "Microsoft and Reddit Are Fighting About Why Bing's Crawler Is Blocked on Reddit." All right, so Bing is Microsoft's search engine. It's the one most associated with the Edge web browser from Microsoft. And search engines use web crawlers to index the web so that the search engine can list pages in response to queries. This is a fundamental function of search engines. However, web builders can actually put instructions on a website that tell these crawlers to scram. Essentially, it's saying, "Hey, don't index this web page for your search results." Why would you do that? Well, you might want to do it if you want more controlled access to the page in question. So essentially, Reddit has done this across its entire site.
Speaker 1: That's because of how companies like Google and Microsoft are now crawling websites not just for search purposes, but also to train artificial intelligence large language models, and Reddit doesn't want any company using content from its site to train AI without first ponying up cash to do so. Reddit doesn't object to AI learning from the stuff that its users create and post to the site, but it does not give that stuff away for free. Oh, it also doesn't share any money with the users who actually made the content in the first place. Reddit is like, "No, once you post it here, it belongs to us. It's no longer yours, it's ours." But anyway, Reddit is essentially saying, "If you want to train your AI on our stuff, you've got to fork over the cash." And Microsoft is essentially saying, "Bro, we're not training AI. We just want to index Reddit for the purposes of search, so chill out." Complicating matters is the fact that Google did pay Reddit a cool sixty million dollars, and so Google, unlike other search engines, is allowed to crawl, index, and even train AI on Reddit's site.
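A side note for the technically curious: the "instructions that tell crawlers to scram" are usually a robots.txt file at the root of the site (or a robots meta tag on a page). Here's a minimal sketch of how a well-behaved crawler checks those rules using Python's standard urllib.robotparser; the rules shown are a hypothetical illustration, not Reddit's actual robots.txt.

```python
# Hypothetical robots.txt rules: one search engine is allowed in,
# everyone else is told to scram. (Illustration only; this is not
# Reddit's actual robots.txt.)
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The favored crawler may fetch; any other well-behaved crawler may not.
print(parser.can_fetch("Googlebot", "https://example.com/r/technology"))  # True
print(parser.can_fetch("bingbot", "https://example.com/r/technology"))    # False
```

Worth noting: robots.txt is purely advisory, it only stops crawlers that choose to honor it, which is part of why these content-licensing fights end up in contracts and lawsuits rather than in config files.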
Speaker 1: So some people are saying, "Huh, this is starting to look a bit anti-competitive." In fact, Bing's head of search tweeted out, quote, "Reddit has blocked Bing from crawling their site for search, favoring another search engine, and impacting competition from Bing and Bing-powered engines," end quote. Reddit reps told 404 Media that, nah, nah, that's not what's going on; this is totally because Microsoft refuses to play by our rules, and that's all there is to it. So this, I think, is an illustration of how AI, specifically generative AI, has started to really impact the way the Internet works. Typically, you wouldn't see sites like Reddit put up a block on indexing the page. Like, indexing means that more people will go to Reddit when they're searching for certain topics and there's a Reddit thread all about that thing, so you want more traffic to come in. But if it also means that, you know, people are exploiting that content and they're profiting off of it, and Reddit's not getting a piece of the action, that's where the problems come up.
Speaker 1: And I think we're going to see more stuff like this all the way across the Internet in various incarnations. Well, speaking about ads and money, it's about that time for us to take a quick break to thank our sponsors, but we'll be back with more news right after this. So we're back, and just this past Tuesday, Microsoft was the target of a massive DDoS attack that hampered Microsoft three sixty five and Azure services around the globe. Azure, if you aren't familiar, is Microsoft's cloud-based offering. And as a quick reminder, DDoS stands for distributed denial of service, and typically this kind of attack involves sending a gargantuan amount of Internet requests from an army of compromised computers, often compromised through malware and hacking, and directing all of that traffic to a specific target on the web. The target becomes overwhelmed trying to answer all these requests, and ultimately it can shut down as a result.
Speaker 1: Now, there are actually a lot of mitigation strategies that can help decrease the impact of these sorts of attacks, like having an early detection system to say, "Hey, something's up. We need to raise defenses," essentially raise shields, in Star Trek terms. But sometimes it can take a little bit before these mitigation strategies can be employed, and in the meantime you've got a crisis on your hands. Microsoft has not yet identified the threat actor responsible for this attack, but the company did admit that a mistake in the mitigation deployment delayed a security response, which in turn meant the attack had a greater effect than it otherwise might have had. So this is, you know, not to blame the victim. The hackers ultimately are the ones responsible for the outages, but the error on Microsoft's part did mean that it took a little longer to respond than it otherwise would have. Most companies have mitigation strategies, or they work with a security company that provides mitigation strategies for this kind of stuff, and a lot of companies have seen pretty massive upticks in attack attempts over more recent years.
Speaker 1: It's always been a bit of an issue ever since DDoS attacks became kind of a possibility, but they are becoming far more frequent these days, as is pretty much every kind of hacker attack. Israeli hackers claimed responsibility for an attack that may have disrupted massive amounts of Internet traffic in Iran. So this happened in conjunction with military strikes that Israel reportedly was carrying out against a Hamas target who was in Iran. These strikes were originally reported as missile-based attacks, but The New York Times says, actually, that local sources claim it was an explosive device that was smuggled into where the target was staying in Iran. But the hacker group, which calls itself WeRedEvils, or "We Red Devils" if you prefer (that's not how the capitalization goes), claimed to have taken down much of Iran's internet infrastructure, like leading to a massive Wi-Fi outage. However, Forbes was unable to confirm the extent to which these attacks disrupted communications within Iran.
Speaker 1: As journalist Zak Doffman points out, the lines of communication in Iran aren't exactly dependable even on a good day, so it's hard to say, like, was this a devastating attack, or did anyone even notice? It's possible that people just figured it was another typical day in Iran. Hard to say. But yeah, it's not surprising to hear about hackers aligning with military operations. I don't know if there's any evidence to show that WeRedEvils is state-backed, like if they receive support from the Israeli government. It is possible they could be a state-backed hacker group. It's also possible they're acting independently and have priorities that are in alignment with Israel's government. It's hard to say. Net neutrality sure is having a rough go of it in the United States. All right, let's do a quick rundown on what this is. So, net neutrality essentially says you should be able to have the same access to all the stuff on the Internet, no matter where that stuff comes from, no matter what ISP you use, no matter what device you are using to access it.
Speaker 1: And companies that provide Internet infrastructure should not favor traffic from certain parts of the Internet while throttling traffic from other parts, nor should they deny service to anyone just because that person isn't using that company's own services or products. It should be neutral. During Barack Obama's presidency, the FCC succeeded in passing some new rules that would force service providers to play fair. Essentially, they declared that the rules of common carriage would apply to Internet traffic. During Trump's presidency, that version of the FCC overturned those rules from the Obama administration, and now, under Joe Biden, the FCC reinstated those rules again, slightly different versions but the same idea. However, the Sixth Circuit US Court of Appeals has now blocked those rules from going into effect, saying that the FCC has failed to meet the high standards of a legal argument justifying the rules in the first place, and that those rules are not likely to hold up to a legal challenge.
Speaker 1: So instead, the Court has said that beginning either in late October or early November, it will hear oral arguments on the matter before it goes any further. Now, considering that US elections happen in early November, and that net neutrality was one of those tentpole achievements of the Biden administration, this is a pretty tough political blow to Democrats. The safe assumption is that Kamala Harris, who is on track to become the official nominee of the Democratic Party, will also support net neutrality. And we've already seen what a Trump administration thinks of the matter, so largely the outcome of the election, I think, will determine the direction this takes outside of the courts. Like, the court will ultimately get to decide whether the matter meets legal standards. But, you know, we've already seen that with each administration the FCC's makeup can change dramatically, and that in turn changes the policy. Reuters reports that the US Department of Justice is investigating Nvidia for anti-competitive practices.
Speaker 1: Specifically, the DOJ is interested in how Nvidia's AI chip business has kind of taken shape, with concerns that the company has abused its position to suppress competition in the space. It's not a crime to be a successful business, but the DOJ is looking into reports that Nvidia wasn't just super good at selling chips meant to power AI implementations, but also that the company was pressuring customers to buy into packages of products. Further, there were accusations that Nvidia would charge a lower price if a customer opted to buy networking gear from Nvidia while also electing to have Nvidia's own AI chips integrated into that gear, but they would have to pay a higher price if they wanted the gear from Nvidia but wanted to use AI chips from one of Nvidia's competitors. So this is just in the investigation phase at the moment. There's no lawsuit or anything like that yet. It's possible that nothing more will come of this. But with Nvidia holding such a strong market position, like they have around eighty percent of the market share for AI chips,
321 00:20:21,280 --> 00:20:24,400 Speaker 1: I suspect that investigation is going to end up being 322 00:20:24,560 --> 00:20:29,280 Speaker 1: pretty darn thorough. More in tech and politics? Yeah, I 323 00:20:29,280 --> 00:20:32,119 Speaker 1: love it too. This week, the US Senate passed a 324 00:20:32,160 --> 00:20:36,360 Speaker 1: bill called the Kids Online Safety Act, or KOSA. 325 00:20:36,920 --> 00:20:40,760 Speaker 1: Many people had concerns about this legislation. They argued that 326 00:20:40,840 --> 00:20:44,080 Speaker 1: while it poses as a set of rules that are 327 00:20:44,119 --> 00:20:48,000 Speaker 1: meant to protect children from harmful content and practices online, 328 00:20:48,119 --> 00:20:53,240 Speaker 1: it was so vague in defining those quote unquote harmful 329 00:20:53,400 --> 00:20:57,280 Speaker 1: practices that it could be used to persecute folks like 330 00:20:57,320 --> 00:21:02,320 Speaker 1: those in, say, the LGBTQ community. So understandably, there was 331 00:21:02,359 --> 00:21:04,880 Speaker 1: a lot of concern over the fact that it passed 332 00:21:05,320 --> 00:21:09,320 Speaker 1: in the Senate. However, the US House of Representatives is 333 00:21:09,440 --> 00:21:14,560 Speaker 1: not following suit. Yesterday, the GOP-controlled House announced it 334 00:21:14,560 --> 00:21:17,639 Speaker 1: would not be bringing up KOSA for a vote, citing 335 00:21:17,760 --> 00:21:22,600 Speaker 1: quote concerns across our conference end quote. A bill 336 00:21:22,680 --> 00:21:25,960 Speaker 1: has to pass both the Senate and the House before 337 00:21:25,960 --> 00:21:29,679 Speaker 1: it can be signed into law, so KOSA is effectively 338 00:21:29,880 --> 00:21:32,480 Speaker 1: dead in the water. 
Now, this is actually great news, 339 00:21:32,600 --> 00:21:34,679 Speaker 1: which I'm sure some of y'all will be surprised to 340 00:21:34,720 --> 00:21:37,080 Speaker 1: hear me say, because I think it's pretty clear that 341 00:21:37,240 --> 00:21:41,520 Speaker 1: I do not often align with the GOP. But bills 342 00:21:41,600 --> 00:21:45,920 Speaker 1: like KOSA, which use the pretense of protecting vulnerable 343 00:21:45,960 --> 00:21:49,199 Speaker 1: populations such as children, often end up serving as a 344 00:21:49,280 --> 00:21:54,400 Speaker 1: means to attack other vulnerable populations while restricting freedoms online. 345 00:21:54,640 --> 00:21:58,480 Speaker 1: These bills might originally be framed with the best of intentions, 346 00:21:58,840 --> 00:22:01,639 Speaker 1: or maybe they're framed from the get-go as a 347 00:22:01,640 --> 00:22:04,359 Speaker 1: potential weapon, but in either case, they tend to be 348 00:22:04,480 --> 00:22:08,040 Speaker 1: bad news when they're implemented. So it is good news 349 00:22:08,080 --> 00:22:11,800 Speaker 1: that this particular bill is fizzling out. Senator Rand Paul 350 00:22:11,920 --> 00:22:14,560 Speaker 1: spoke out against this bill in a letter that laid 351 00:22:14,560 --> 00:22:18,080 Speaker 1: out his logic opposing the legislation, and it appears that 352 00:22:18,080 --> 00:22:21,399 Speaker 1: letter went a long way in the House of Representatives. 353 00:22:21,880 --> 00:22:24,840 Speaker 1: Intel is facing a really tough future, and a lot 354 00:22:24,880 --> 00:22:27,000 Speaker 1: of folks at the company won't be there to be 355 00:22:27,080 --> 00:22:30,199 Speaker 1: a part of that future. 
Intel told The Verge that 356 00:22:30,400 --> 00:22:34,680 Speaker 1: it will downsize by around fifteen percent, which means more 357 00:22:34,720 --> 00:22:38,480 Speaker 1: than fifteen thousand people, perhaps as many as nineteen thousand, 358 00:22:38,720 --> 00:22:41,680 Speaker 1: could be laid off from the company. This is all 359 00:22:41,720 --> 00:22:45,639 Speaker 1: part of an effort to cut ten billion dollars in costs. 360 00:22:46,080 --> 00:22:48,560 Speaker 1: Intel is also cutting back on the amount it's going 361 00:22:48,600 --> 00:22:50,600 Speaker 1: to spend on R and D, and it plans to 362 00:22:50,760 --> 00:22:55,040 Speaker 1: quote stop non essential work end quote. So I imagine 363 00:22:55,040 --> 00:22:56,639 Speaker 1: that means a lot of departments are going to be 364 00:22:56,680 --> 00:22:59,160 Speaker 1: scrambling to figure out if they can present their work 365 00:22:59,200 --> 00:23:02,360 Speaker 1: in a way that makes it seem as though it's essential. 366 00:23:02,640 --> 00:23:06,080 Speaker 1: Whew, that's going to be rough. Pat Gelsinger, who is 367 00:23:06,200 --> 00:23:10,760 Speaker 1: Intel's CEO, revealed that the one point six billion dollar 368 00:23:11,000 --> 00:23:14,359 Speaker 1: loss Intel experienced in the second quarter of twenty twenty 369 00:23:14,359 --> 00:23:18,960 Speaker 1: four was quote unquote disappointing. I think that's putting it lightly. 370 00:23:19,200 --> 00:23:22,520 Speaker 1: He also stated that Intel has quote yet to fully 371 00:23:22,560 --> 00:23:26,639 Speaker 1: benefit from powerful trends like AI end quote. That seems 372 00:23:26,640 --> 00:23:30,320 Speaker 1: to align with what I've covered already in today's episode talking 373 00:23:30,359 --> 00:23:34,679 Speaker 1: about the disillusionment that's growing around artificial intelligence. But another 374 00:23:34,760 --> 00:23:39,520 Speaker 1: issue is that Intel has been investing heavily in semiconductor manufacturing. 
375 00:23:39,920 --> 00:23:43,840 Speaker 1: This is an incredibly expensive business to get into. You 376 00:23:43,920 --> 00:23:46,119 Speaker 1: have to build all the facilities to be able to 377 00:23:46,240 --> 00:23:51,280 Speaker 1: make these semiconductor chips. Typically, Intel would design the chips, 378 00:23:51,480 --> 00:23:55,920 Speaker 1: but then would partner with other companies overseas to fabricate 379 00:23:56,240 --> 00:23:59,600 Speaker 1: those chips. However, out of concerns over national security and 380 00:23:59,640 --> 00:24:02,840 Speaker 1: supply chain challenges, there's been a shift to invest 381 00:24:02,880 --> 00:24:06,320 Speaker 1: heavily in domestic chip fabrication here in the United States. 382 00:24:06,520 --> 00:24:09,440 Speaker 1: So part of Intel's challenges relate to the fact that 383 00:24:09,520 --> 00:24:12,560 Speaker 1: the company has been investing billions into that effort and 384 00:24:12,600 --> 00:24:15,440 Speaker 1: it's going to take time to realize the benefits. It's 385 00:24:15,480 --> 00:24:19,320 Speaker 1: a long term solution in a world that typically focuses 386 00:24:19,359 --> 00:24:23,960 Speaker 1: on short term results. Intel also faces increased competition from 387 00:24:24,000 --> 00:24:28,840 Speaker 1: companies ranging from Qualcomm and AMD to Google and Apple. I just 388 00:24:28,880 --> 00:24:30,959 Speaker 1: want to say to anyone out there who ends up 389 00:24:30,960 --> 00:24:34,440 Speaker 1: being affected by these layoffs, I wish you the absolute best. 390 00:24:34,480 --> 00:24:39,320 Speaker 1: I can't imagine how stressful that is, and it really stinks. Okay, 391 00:24:39,760 --> 00:24:42,399 Speaker 1: I have several more stories to get through. This was 392 00:24:42,440 --> 00:24:45,720 Speaker 1: a pretty heavy week as far as tech news goes. 
393 00:24:45,880 --> 00:24:48,000 Speaker 1: So let's take another quick break and we'll be right 394 00:24:48,040 --> 00:25:02,120 Speaker 1: back with more news. We're back. The watchdog group Global Witness, 395 00:25:02,359 --> 00:25:05,680 Speaker 1: which I've talked about many times before, has kind 396 00:25:05,680 --> 00:25:10,280 Speaker 1: of raked Twitter slash X over the coals multiple times. Well, 397 00:25:10,359 --> 00:25:12,880 Speaker 1: they're ready to do it again. They report that there 398 00:25:12,920 --> 00:25:16,280 Speaker 1: are forty five accounts on X, the platform formerly known 399 00:25:16,320 --> 00:25:19,720 Speaker 1: as Twitter, that both appear to be bots and are 400 00:25:19,760 --> 00:25:25,000 Speaker 1: also responsible for promoting content that contains disinformation, racist and 401 00:25:25,119 --> 00:25:29,960 Speaker 1: sexist attacks, and tons of conspiracy theories, and that these posts 402 00:25:30,000 --> 00:25:33,760 Speaker 1: from these forty five bot-controlled accounts have accumulated more 403 00:25:33,800 --> 00:25:38,520 Speaker 1: than four billion views between May twenty second and July 404 00:25:38,680 --> 00:25:42,800 Speaker 1: twenty second this year. The accounts posted an eye-watering 405 00:25:43,160 --> 00:25:48,760 Speaker 1: six hundred ten thousand times in that span. Now that's collectively, 406 00:25:49,280 --> 00:25:51,520 Speaker 1: but even if we were to just average it out, 407 00:25:51,760 --> 00:25:55,560 Speaker 1: that would mean that each account on average posted more 408 00:25:55,600 --> 00:25:59,880 Speaker 1: than thirteen thousand, five hundred times in three months. That's 409 00:26:00,000 --> 00:26:02,800 Speaker 1: more prolific than I ever was back when I was 410 00:26:02,840 --> 00:26:06,359 Speaker 1: still on Twitter. 
The bots were particularly active leading up 411 00:26:06,400 --> 00:26:09,320 Speaker 1: to elections in the UK, in which the Labour Party 412 00:26:09,640 --> 00:26:13,159 Speaker 1: ultimately won more seats than the Conservative Party, though Global 413 00:26:13,200 --> 00:26:17,199 Speaker 1: Witness says it found no evidence linking either political party 414 00:26:17,240 --> 00:26:21,080 Speaker 1: to the bots themselves. So is it possible that one 415 00:26:21,359 --> 00:26:24,360 Speaker 1: or both of those parties, or an organization working on 416 00:26:24,480 --> 00:26:28,439 Speaker 1: behalf of one of those parties, hired bots to flood 417 00:26:28,640 --> 00:26:32,680 Speaker 1: X with messages supporting their side? Yeah, that is possible. 418 00:26:32,800 --> 00:26:36,359 Speaker 1: It's also possible this was all being controlled by some 419 00:26:36,560 --> 00:26:40,359 Speaker 1: outside source with no direct interest in either party, but 420 00:26:40,480 --> 00:26:43,760 Speaker 1: a direct interest in outcomes. We just don't know. But 421 00:26:43,840 --> 00:26:46,720 Speaker 1: those accounts did go on to spread conspiracy theories and 422 00:26:46,760 --> 00:26:51,000 Speaker 1: misinformation about everything from the attempted assassination of Donald Trump 423 00:26:51,160 --> 00:26:54,440 Speaker 1: to disinformation about climate change, so it has gone well 424 00:26:54,480 --> 00:26:58,480 Speaker 1: beyond UK elections since then. Global Witness says it has 425 00:26:58,560 --> 00:27:02,240 Speaker 1: reached out to X for comment on why the platform 426 00:27:02,320 --> 00:27:05,280 Speaker 1: is allowing this sort of activity and has, to the 427 00:27:05,320 --> 00:27:09,119 Speaker 1: surprise of no one, received no response so far. 
I 428 00:27:09,200 --> 00:27:12,040 Speaker 1: seem to recall something about Elon Musk saying he was 429 00:27:12,080 --> 00:27:14,639 Speaker 1: going to get rid of bots on Twitter before he 430 00:27:14,760 --> 00:27:18,159 Speaker 1: acquired it. Seems like that just kind of slipped 431 00:27:18,160 --> 00:27:22,280 Speaker 1: through the cracks, huh? Because forty five, just forty five 432 00:27:22,680 --> 00:27:26,240 Speaker 1: accounts posting six hundred and ten thousand times in three 433 00:27:26,400 --> 00:27:29,920 Speaker 1: months and spreading this kind of stuff to widespread view 434 00:27:30,200 --> 00:27:32,560 Speaker 1: seems like something you would pick up on, 435 00:27:32,840 --> 00:27:35,679 Speaker 1: and it wouldn't just be flying under the radar. But 436 00:27:35,720 --> 00:27:39,480 Speaker 1: what do I know. Here's another AI related news item, 437 00:27:39,520 --> 00:27:43,400 Speaker 1: but this one involves motion capture performers and voice actors 438 00:27:43,480 --> 00:27:47,280 Speaker 1: in the video game industry. So these actors and performers 439 00:27:47,320 --> 00:27:52,080 Speaker 1: belong to the union SAG-AFTRA. That's an actors' and 440 00:27:52,240 --> 00:27:56,200 Speaker 1: performers' union mostly known for representing folks who are in the 441 00:27:56,800 --> 00:28:00,240 Speaker 1: film and television industries, but it also applies to people 442 00:28:00,240 --> 00:28:03,439 Speaker 1: who work in video games. So this branch of SAG-AFTRA 443 00:28:03,520 --> 00:28:09,040 Speaker 1: authorized a strike earlier this month, and now they're 444 00:28:09,080 --> 00:28:12,680 Speaker 1: actually on strike. They are refusing to work while demanding 445 00:28:12,680 --> 00:28:17,119 Speaker 1: that video game companies create solid policies that protect actor 446 00:28:17,200 --> 00:28:22,119 Speaker 1: and performer jobs from generative AI, among other demands. 
But 447 00:28:22,200 --> 00:28:25,520 Speaker 1: this is a big part of the conversation. So actors 448 00:28:25,520 --> 00:28:27,879 Speaker 1: have said that video game companies are using AI to 449 00:28:27,960 --> 00:28:32,200 Speaker 1: train on actor voices, potentially so that the companies can 450 00:28:32,280 --> 00:28:35,800 Speaker 1: recreate those actor voices without actually having to hire the 451 00:28:35,840 --> 00:28:40,160 Speaker 1: actors themselves, which does seem pretty darn underhanded, and it's 452 00:28:40,160 --> 00:28:44,239 Speaker 1: certainly not outside the realm of possibility. SAG-AFTRA is 453 00:28:44,400 --> 00:28:48,040 Speaker 1: demanding that companies pledge to not make use of AI 454 00:28:48,200 --> 00:28:50,440 Speaker 1: in a way that would put real performers out of 455 00:28:50,480 --> 00:28:55,240 Speaker 1: a job, particularly without first negotiating a comprehensive agreement with 456 00:28:55,360 --> 00:28:58,720 Speaker 1: the performers themselves. I mean, if you're gonna use my 457 00:28:59,000 --> 00:29:02,000 Speaker 1: voice or likeness, you really should get my permission first 458 00:29:02,000 --> 00:29:06,240 Speaker 1: and pay me, gosh darn it. Kyle Orland at Ars 459 00:29:06,280 --> 00:29:10,640 Speaker 1: Technica has an article titled Xbox console sales continue to 460 00:29:10,720 --> 00:29:15,400 Speaker 1: crater with massive forty two percent revenue drop. Yikes. Those are 461 00:29:15,800 --> 00:29:18,680 Speaker 1: year over year sales figures for Q two, by 462 00:29:18,680 --> 00:29:21,480 Speaker 1: the way: Q two twenty twenty four shows a 463 00:29:21,560 --> 00:29:25,040 Speaker 1: forty two percent revenue drop from Q two twenty twenty three. 
464 00:29:25,280 --> 00:29:28,680 Speaker 1: The company doesn't share detailed breakdowns of how many units 465 00:29:28,720 --> 00:29:33,040 Speaker 1: were produced or sold, but industry analysts estimate that Microsoft 466 00:29:33,040 --> 00:29:35,680 Speaker 1: sold fewer than one million units in the first quarter 467 00:29:35,800 --> 00:29:38,560 Speaker 1: of this year, whereas Sony posted four and a half 468 00:29:38,720 --> 00:29:43,560 Speaker 1: million PS five sales in that same period. However, Microsoft 469 00:29:43,720 --> 00:29:47,760 Speaker 1: is raking in the dough through its Game Pass subscription service. 470 00:29:47,960 --> 00:29:51,960 Speaker 1: So while hardware sales are tanking, I mean, there's no 471 00:29:52,000 --> 00:29:56,040 Speaker 1: other word for it, the ongoing subscription services are generating 472 00:29:56,240 --> 00:30:00,680 Speaker 1: beaucoups of bucks, particularly since Microsoft recently hiked up the 473 00:30:00,720 --> 00:30:05,160 Speaker 1: monthly price. Orland hypothesizes that Microsoft might get out of 474 00:30:05,200 --> 00:30:08,880 Speaker 1: the hardware game entirely in the future and focus solely 475 00:30:08,960 --> 00:30:12,880 Speaker 1: on the software side. That honestly wouldn't surprise me. In fact, 476 00:30:13,000 --> 00:30:15,720 Speaker 1: I expected it. I figured that Microsoft was looking for 477 00:30:15,760 --> 00:30:18,720 Speaker 1: an off ramp from hardware, hoping to launch a quote 478 00:30:18,760 --> 00:30:23,520 Speaker 1: unquote final console that would rely upon ongoing subscription content 479 00:30:23,760 --> 00:30:27,160 Speaker 1: to keep it going. So it is possible that the 480 00:30:27,440 --> 00:30:32,680 Speaker 1: most recent Xbox hardware update is the last Xbox hardware update. 481 00:30:32,960 --> 00:30:34,920 Speaker 1: That would not surprise me if that were the case. 
482 00:30:34,960 --> 00:30:37,400 Speaker 1: It also wouldn't surprise me if we got one more 483 00:30:37,480 --> 00:30:42,160 Speaker 1: generation before that happened. Either one I think is entirely possible. Heck, 484 00:30:42,200 --> 00:30:44,640 Speaker 1: it's even possible that Microsoft doesn't get out entirely and 485 00:30:44,760 --> 00:30:48,600 Speaker 1: we just keep getting generations. But the improvements we see 486 00:30:48,640 --> 00:30:53,000 Speaker 1: from generation to generation, those performance improvements, I think you 487 00:30:53,000 --> 00:30:54,960 Speaker 1: can make an argument that we're starting to get to 488 00:30:55,000 --> 00:30:57,920 Speaker 1: a point of diminishing returns. Now, I'm never gonna say 489 00:30:57,960 --> 00:31:00,920 Speaker 1: that consoles are as good as they'll ever be, and 490 00:31:00,960 --> 00:31:03,520 Speaker 1: they'll never be able to make a better one with, 491 00:31:03,640 --> 00:31:06,520 Speaker 1: you know, better graphical fidelity and all that kind of stuff, 492 00:31:06,560 --> 00:31:11,280 Speaker 1: because goodness knows, every generation of consoles brings surprises and 493 00:31:11,320 --> 00:31:15,280 Speaker 1: improvements with it. But it might be that consoles are 494 00:31:15,320 --> 00:31:19,880 Speaker 1: as good as they need to be, right? Like, you 495 00:31:19,920 --> 00:31:23,120 Speaker 1: don't need to go harder than that. Even if 496 00:31:23,120 --> 00:31:27,240 Speaker 1: you could, there's no market demand for it, so why 497 00:31:27,280 --> 00:31:29,840 Speaker 1: not back off of that part of the business entirely 498 00:31:29,920 --> 00:31:33,160 Speaker 1: and focus on the software side? 
Rob Thubron over at 499 00:31:33,240 --> 00:31:36,640 Speaker 1: TechSpot has a whammy of a piece titled Bungie 500 00:31:36,840 --> 00:31:41,080 Speaker 1: CEO faces backlash after announcing two hundred and twenty employees 501 00:31:41,120 --> 00:31:44,440 Speaker 1: will be laid off. So Bungie is the video game 502 00:31:44,440 --> 00:31:48,000 Speaker 1: developer behind titles like Halo, although that is now developed 503 00:31:48,040 --> 00:31:52,840 Speaker 1: under Microsoft, as well as the Destiny series. And yeah, 504 00:31:52,880 --> 00:31:55,680 Speaker 1: Bungie recently said it will lay off more than two 505 00:31:55,760 --> 00:31:59,920 Speaker 1: hundred employees, and that stinks. But we're already seeing layoffs 506 00:32:00,080 --> 00:32:03,240 Speaker 1: across the video game industry, so you could argue that 507 00:32:03,360 --> 00:32:06,640 Speaker 1: this isn't that shocking. So why are people up in 508 00:32:06,760 --> 00:32:10,960 Speaker 1: arms against CEO Pete Parsons? Well, that might be because 509 00:32:11,000 --> 00:32:14,760 Speaker 1: of his spending habits. As Thubron points out, Parsons has 510 00:32:14,840 --> 00:32:19,040 Speaker 1: dropped more than two million dollars on collecting classic cars 511 00:32:19,200 --> 00:32:22,560 Speaker 1: since Sony acquired Bungie back in the summer of twenty 512 00:32:22,680 --> 00:32:26,320 Speaker 1: twenty two. So if you're a Bungie employee who is 513 00:32:26,360 --> 00:32:29,360 Speaker 1: in danger of losing your job, you might also harbor 514 00:32:29,440 --> 00:32:32,400 Speaker 1: some pretty strong opinions regarding the fact that your boss 515 00:32:32,520 --> 00:32:35,400 Speaker 1: is dropping mountains of cash on cars that he might 516 00:32:35,480 --> 00:32:40,280 Speaker 1: not actually ever drive, particularly when Parsons says the 517 00:32:40,360 --> 00:32:43,880 Speaker 1: layoffs are due to quote unquote financial challenges. 
518 00:32:44,480 --> 00:32:47,600 Speaker 1: So maybe that's the financial challenge: he needs a bit extra 519 00:32:47,680 --> 00:32:50,600 Speaker 1: to buy that Toyota Tercel he's had his eye on. By 520 00:32:50,600 --> 00:32:52,760 Speaker 1: the way, that was a sardonic joke. 521 00:32:52,840 --> 00:32:56,680 Speaker 1: That was not an actual accusation. When Butch Wilmore and 522 00:32:56,760 --> 00:33:00,440 Speaker 1: Suni Williams first boarded the International Space Station, the plan 523 00:33:00,520 --> 00:33:02,640 Speaker 1: at that time was for them to return to Earth 524 00:33:02,680 --> 00:33:06,600 Speaker 1: as early as eight days later. Now that would have 525 00:33:06,720 --> 00:33:10,240 Speaker 1: been on June fourteenth. But now it's August and the 526 00:33:10,320 --> 00:33:15,200 Speaker 1: astronauts are still aboard the ISS. Why? Well, the spacecraft 527 00:33:15,200 --> 00:33:18,520 Speaker 1: that brought them to the ISS was the Boeing Starliner, 528 00:33:18,760 --> 00:33:22,600 Speaker 1: and that spacecraft experienced some technical issues on its way 529 00:33:22,720 --> 00:33:26,240 Speaker 1: to space. Now, initially, the thought was that these problems, 530 00:33:26,280 --> 00:33:30,480 Speaker 1: while needing to be solved, would not pose a significant 531 00:33:30,600 --> 00:33:35,200 Speaker 1: risk to the mission. However, upon further investigation, concerns have 532 00:33:35,280 --> 00:33:38,480 Speaker 1: been growing that some of the thruster systems are not 533 00:33:38,720 --> 00:33:42,320 Speaker 1: operating properly, and that they would be needed in order 534 00:33:42,360 --> 00:33:44,840 Speaker 1: to orient and guide the spacecraft so that it could 535 00:33:44,840 --> 00:33:48,520 Speaker 1: re-enter Earth's atmosphere safely, which is a pretty darn 536 00:33:48,680 --> 00:33:52,440 Speaker 1: significant risk to the mission. 
There is, however, a huge 537 00:33:52,600 --> 00:33:56,320 Speaker 1: amount of pressure to use the Starliner as the 538 00:33:56,400 --> 00:33:59,840 Speaker 1: return vehicle for the mission. NASA has wanted to diversify 539 00:33:59,880 --> 00:34:02,320 Speaker 1: the companies that it works with in order to launch 540 00:34:02,480 --> 00:34:07,560 Speaker 1: vehicles and spacecraft and astronauts into space and not be 541 00:34:07,680 --> 00:34:11,800 Speaker 1: beholden to a single entity like SpaceX. So the story 542 00:34:11,800 --> 00:34:15,640 Speaker 1: out of NASA was that, yes, the Starliner has problems, 543 00:34:15,880 --> 00:34:19,400 Speaker 1: but they have their top scientists and engineers working on 544 00:34:19,480 --> 00:34:23,680 Speaker 1: fixing those problems. However, according to Eric Berger over at 545 00:34:23,719 --> 00:34:27,200 Speaker 1: Ars Technica, the discussion could be shifting toward relying on 546 00:34:27,239 --> 00:34:31,960 Speaker 1: SpaceX's Crew Dragon spacecraft as a potential return vehicle for 547 00:34:32,040 --> 00:34:35,640 Speaker 1: the astronauts instead of the Starliner. Now, I say 548 00:34:36,080 --> 00:34:40,080 Speaker 1: maybe shifting because NASA is being really quiet about all this. 549 00:34:40,480 --> 00:34:43,480 Speaker 1: It would look really bad for Boeing, which, I don't 550 00:34:43,480 --> 00:34:45,759 Speaker 1: know if you know this, is already facing issues in 551 00:34:45,800 --> 00:34:48,520 Speaker 1: another part of its aerospace business. But it would look 552 00:34:48,520 --> 00:34:51,200 Speaker 1: really bad for Boeing if the astronauts had to abandon 553 00:34:51,239 --> 00:34:53,839 Speaker 1: plans to return in the Starliner and instead rely 554 00:34:54,000 --> 00:34:56,960 Speaker 1: on a competing company's spacecraft in order to get back 555 00:34:56,960 --> 00:35:00,720 Speaker 1: home safely. 
But then again, it would look really, really, 556 00:35:01,239 --> 00:35:04,840 Speaker 1: really bad if those astronauts' lives were put in danger 557 00:35:05,120 --> 00:35:08,200 Speaker 1: just to preserve the optics of the Starliner. I 558 00:35:08,280 --> 00:35:11,680 Speaker 1: recommend reading Berger's entire article for the full story. It's 559 00:35:11,719 --> 00:35:15,359 Speaker 1: an excellent piece. Again, it's titled NASA says it is 560 00:35:15,520 --> 00:35:19,439 Speaker 1: evaluating all options for the safe return of Starliner crew, 561 00:35:19,680 --> 00:35:22,960 Speaker 1: and that is on Ars Technica. Okay. I've got a 562 00:35:23,000 --> 00:35:26,879 Speaker 1: couple of recommended articles for y'all to check out. One 563 00:35:27,000 --> 00:35:30,320 Speaker 1: is by Ben Quinn and Dan Milmo of The Guardian. 564 00:35:30,640 --> 00:35:34,160 Speaker 1: It is titled How TikTok bots and AI have powered 565 00:35:34,200 --> 00:35:37,719 Speaker 1: a resurgence in UK far-right violence. It's one of 566 00:35:37,760 --> 00:35:41,080 Speaker 1: those stories that seems to confirm many fears that some 567 00:35:41,160 --> 00:35:44,120 Speaker 1: folks have about the rise of AI and how AI 568 00:35:44,280 --> 00:35:48,400 Speaker 1: can pour gasoline on a fire. So, just to be clear, 569 00:35:48,600 --> 00:35:52,160 Speaker 1: the underlying social issues are already there. AI is not 570 00:35:52,719 --> 00:35:56,320 Speaker 1: creating these social issues. They exist already, and those issues 571 00:35:56,480 --> 00:36:00,840 Speaker 1: need to be addressed. This isn't just an AI problem. However, 572 00:36:01,120 --> 00:36:03,840 Speaker 1: the use of AI to make things worse is also 573 00:36:04,080 --> 00:36:08,400 Speaker 1: a problem. It's an additional problem. 
Then there's another piece 574 00:36:08,440 --> 00:36:11,400 Speaker 1: that's by Anthony Grayling and Brian Ball, and it's on 575 00:36:11,480 --> 00:36:15,880 Speaker 1: The Conversation. It is titled Philosophy is Crucial in the 576 00:36:15,920 --> 00:36:18,960 Speaker 1: Age of AI. Now, y'all, I tend to be a 577 00:36:19,080 --> 00:36:22,160 Speaker 1: pretty pragmatic person. I have the sort of brain that 578 00:36:22,239 --> 00:36:28,000 Speaker 1: finds a lot of philosophy tedious and of ultimately questionable usefulness. However, 579 00:36:28,600 --> 00:36:32,839 Speaker 1: I admit this is a bias I have. I do 580 00:36:32,840 --> 00:36:36,480 Speaker 1: not pretend that my view is the correct one, or 581 00:36:36,520 --> 00:36:40,680 Speaker 1: even partly correct. Maybe I'm just dense. Not maybe, I 582 00:36:40,719 --> 00:36:44,319 Speaker 1: am just dense, and maybe that's the problem. Anyway, this 583 00:36:44,400 --> 00:36:48,720 Speaker 1: particular piece traces how philosophy has been a part of 584 00:36:49,000 --> 00:36:53,000 Speaker 1: AI's evolution and development for decades, and how some deep 585 00:36:53,120 --> 00:36:57,719 Speaker 1: questions in AI are absolutely grounded in philosophy. It is 586 00:36:57,800 --> 00:37:01,400 Speaker 1: well worth a read, even if, like me, you often 587 00:37:01,440 --> 00:37:04,920 Speaker 1: find philosophy to be a little on the impractical side. 588 00:37:05,000 --> 00:37:08,279 Speaker 1: And that's it for today's tech news episode. It was 589 00:37:08,280 --> 00:37:10,120 Speaker 1: a long one, but like I said, a lot happened. 590 00:37:10,160 --> 00:37:13,240 Speaker 1: I even cut some stories from this because, goodness gracious, 591 00:37:13,239 --> 00:37:15,839 Speaker 1: it was running long. I hope all of you out 592 00:37:15,880 --> 00:37:18,160 Speaker 1: there are doing well, and I will talk to you 593 00:37:18,200 --> 00:37:28,960 Speaker 1: again really soon. Tech Stuff is an iHeartRadio production. 
For 594 00:37:29,040 --> 00:37:33,879 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 595 00:37:34,000 --> 00:37:39,560 Speaker 1: or wherever you listen to your favorite shows.