1 00:00:04,480 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,400 --> 00:00:15,920 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,960 --> 00:00:19,439 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:19,480 --> 00:00:22,040 Speaker 1: tech are you? It's time for the tech news for 5 00:00:22,079 --> 00:00:26,560 Speaker 1: the week ending July twenty sixth, twenty twenty four, and 6 00:00:26,720 --> 00:00:31,120 Speaker 1: let's start off with a follow up on the CrowdStrike story. So, 7 00:00:31,160 --> 00:00:35,880 Speaker 1: if somehow you missed that event, cybersecurity company CrowdStrike pushed 8 00:00:35,880 --> 00:00:39,640 Speaker 1: out an update to customers on Windows-based platforms that 9 00:00:39,760 --> 00:00:43,919 Speaker 1: caused machines to go into a reboot loop. And this 10 00:00:44,000 --> 00:00:48,159 Speaker 1: little oopsie caused massive outages all around the world, and 11 00:00:48,240 --> 00:00:53,239 Speaker 1: some companies, like major airlines, are still dealing with the fallout. 12 00:00:53,560 --> 00:00:58,680 Speaker 1: This week, CrowdStrike posted a quote preliminary post incident review, 13 00:00:58,880 --> 00:01:03,880 Speaker 1: or PIR, end quote, giving a high level view 14 00:01:04,080 --> 00:01:08,520 Speaker 1: of what happened while promising a more thorough investigation that 15 00:01:08,640 --> 00:01:11,720 Speaker 1: will be shared down the line. So the post says 16 00:01:11,760 --> 00:01:16,440 Speaker 1: that last Friday, quote, CrowdStrike released a content configuration update 17 00:01:16,480 --> 00:01:20,240 Speaker 1: for the Windows sensor to gather telemetry on possible novel 18 00:01:20,280 --> 00:01:23,360 Speaker 1: threat techniques. These updates are a regular part of the 19 00:01:23,440 --> 00:01:28,280 Speaker 1: dynamic protection mechanisms of the Falcon platform. The problematic Rapid 20 00:01:28,360 --> 00:01:33,800 Speaker 1: Response Content configuration update resulted in a Windows system crash, 21 00:01:34,080 --> 00:01:36,520 Speaker 1: end quote. So the post goes on to say that 22 00:01:36,560 --> 00:01:40,360 Speaker 1: this Rapid Response Content feature is meant to quote respond 23 00:01:40,400 --> 00:01:44,440 Speaker 1: to the changing threat landscape at operational speed, end quote, 24 00:01:44,600 --> 00:01:47,640 Speaker 1: and by necessity that means CrowdStrike has to be very 25 00:01:47,720 --> 00:01:51,440 Speaker 1: quick to respond to emerging threats in order to protect customers. 26 00:01:51,680 --> 00:01:55,200 Speaker 1: But this particular update had an undetected error in it. 27 00:01:55,480 --> 00:01:59,520 Speaker 1: And in addition, the content validator (you can think of 28 00:01:59,520 --> 00:02:02,280 Speaker 1: it as kind of like a bouncer standing outside a nightclub) 29 00:02:02,520 --> 00:02:05,080 Speaker 1: failed too. The validator is a system that's meant to look for 30 00:02:05,200 --> 00:02:09,440 Speaker 1: mistakes before allowing updates to go through, but the validator 31 00:02:09,480 --> 00:02:12,840 Speaker 1: itself had a bug that essentially meant it was looking 32 00:02:12,880 --> 00:02:15,360 Speaker 1: the other way.
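To make that bouncer analogy concrete, here's a tiny, purely hypothetical sketch of a content validator with a bug of its own. None of the names or the template format come from CrowdStrike; it just shows how a checker that validates the wrong property can wave a malformed update straight through.

```python
# Hypothetical content validator sketch. All names and the "template" format
# are invented for illustration; this is not CrowdStrike's actual code.

def validate_content_update(template: dict) -> bool:
    """The bouncer: reject updates whose fields look malformed."""
    for key in ("id", "pattern", "field_count"):
        if key not in template:
            return False
    # BUG: we check that field_count is an integer, but never check that the
    # update actually carries that many fields. An update declaring 21 fields
    # while supplying only 20 sails right through, and the mismatch surfaces
    # later, when the sensor reads the missing field and crashes the machine
    # it was supposed to protect.
    return isinstance(template["field_count"], int)

bad_update = {"id": 291, "pattern": "...", "field_count": 21, "fields": ["..."] * 20}
assert validate_content_update(bad_update)  # the bouncer looks the other way
```

A validator with its own blind spot is arguably worse than none, because everyone downstream assumes the check actually happened.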
So the post goes on to stress 33 00:02:15,400 --> 00:02:19,160 Speaker 1: that CrowdStrike will be instituting new processes to validate code 34 00:02:19,200 --> 00:02:21,880 Speaker 1: and to test it thoroughly for errors before pushing it 35 00:02:21,919 --> 00:02:26,000 Speaker 1: out to customers in the future, because boy howdy, this 36 00:02:26,080 --> 00:02:32,560 Speaker 1: outage was incredibly disruptive, and yeah, the CrowdStrike partners 37 00:02:32,639 --> 00:02:36,040 Speaker 1: are mightily miffed about the outage, so not just the 38 00:02:36,200 --> 00:02:39,600 Speaker 1: customers who were directly affected, but people who then had 39 00:02:39,680 --> 00:02:45,120 Speaker 1: to step in and try to mitigate and repair systems 40 00:02:45,200 --> 00:02:48,840 Speaker 1: that were affected by this. Countless people had to work 41 00:02:49,040 --> 00:02:53,040 Speaker 1: extremely long hours in an effort to get things working again. 42 00:02:53,400 --> 00:02:57,119 Speaker 1: But don't worry, CrowdStrike cares. That's why the company sent 43 00:02:57,160 --> 00:03:00,480 Speaker 1: out a ten-dollar voucher to some teammates and partners 44 00:03:00,520 --> 00:03:03,120 Speaker 1: as a gesture of gratitude and sympathy. Now, as you 45 00:03:03,160 --> 00:03:06,440 Speaker 1: might imagine, not everyone took this very well. Some folks 46 00:03:06,520 --> 00:03:10,120 Speaker 1: interpreted CrowdStrike's move as insult added to injury, as if 47 00:03:10,200 --> 00:03:13,720 Speaker 1: somehow a ten-dollar gift voucher for coffee and donuts 48 00:03:13,840 --> 00:03:17,440 Speaker 1: would make up for having to provide seemingly endless support 49 00:03:17,520 --> 00:03:20,959 Speaker 1: to countless customers who were affected by this outage. Some 50 00:03:21,080 --> 00:03:24,200 Speaker 1: people maybe saw the gesture as being well intentioned and 51 00:03:24,240 --> 00:03:28,079 Speaker 1: didn't take umbrage at it, but unfortunately a lot 52 00:03:28,120 --> 00:03:30,720 Speaker 1: of them also had trouble using the voucher in the 53 00:03:30,760 --> 00:03:33,880 Speaker 1: first place. The voucher was for Uber Eats, and Uber 54 00:03:33,919 --> 00:03:37,280 Speaker 1: Eats actually flagged it as fraud because so many people 55 00:03:37,360 --> 00:03:41,440 Speaker 1: were trying to use the code. Interestingly, I saw something 56 00:03:41,520 --> 00:03:45,520 Speaker 1: similar with ride-hailing services, because someone I know was 57 00:03:45,680 --> 00:03:49,440 Speaker 1: trying to make use of a promotional code, essentially to 58 00:03:49,920 --> 00:03:52,600 Speaker 1: use a ride-hailing service so that they could go 59 00:03:52,680 --> 00:03:55,680 Speaker 1: and help their company out during the outage. The company 60 00:03:55,760 --> 00:03:58,920 Speaker 1: was saying, hey, if you need to get a ride, 61 00:03:59,400 --> 00:04:01,640 Speaker 1: here's a code you can use, because we really need 62 00:04:01,680 --> 00:04:04,400 Speaker 1: your help. But then when they tried to use the code, 63 00:04:04,760 --> 00:04:07,400 Speaker 1: they found that the code was flagged as being invalid 64 00:04:07,760 --> 00:04:11,160 Speaker 1: because too many people were using it. And this was 65 00:04:11,200 --> 00:04:14,200 Speaker 1: not a small, unknown company that was extending this 66 00:04:14,280 --> 00:04:17,560 Speaker 1: ride-hailing offer in return for more help. It was a major, 67 00:04:18,360 --> 00:04:21,400 Speaker 1: major employer in the city of Atlanta.
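For the technically curious, the kind of misfire that hit both the Uber Eats voucher and that promo code is easy to reproduce with a naive fraud rule. Here's a small hypothetical sketch, with an invented threshold and invented names, nothing from Uber's actual systems: a pure velocity check simply cannot tell a leaked code from a voucher legitimately blasted out to thousands of people at once.

```python
# Hypothetical velocity-based fraud check. The threshold and names are
# invented for illustration; this is not Uber Eats' real logic.
from collections import deque

REDEMPTIONS_PER_MINUTE_LIMIT = 50  # assumed cutoff

class PromoCode:
    def __init__(self, code: str):
        self.code = code
        self.recent: deque[float] = deque()  # timestamps of recent attempts

    def redeem(self, now: float) -> bool:
        # Keep only attempts from the last 60 seconds, then apply the rule.
        while self.recent and now - self.recent[0] > 60:
            self.recent.popleft()
        self.recent.append(now)
        # A leaked code and a voucher sent to thousands of partners look
        # identical to this rule: both trip the limit and get flagged.
        return len(self.recent) <= REDEMPTIONS_PER_MINUTE_LIMIT

code = PromoCode("THANKS10")
results = [code.redeem(now=t * 0.1) for t in range(60)]
print(results.count(False), "legitimate redemptions flagged as fraud")  # prints 10
```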
Anyway, it sounds 68 00:04:21,440 --> 00:04:24,080 Speaker 1: to me like an effort to do something small but 69 00:04:24,120 --> 00:04:29,040 Speaker 1: a little nice has backfired in CrowdStrike's proverbial face. Okay, 70 00:04:29,080 --> 00:04:31,560 Speaker 1: now let's move on to get into a whole bunch 71 00:04:31,560 --> 00:04:35,440 Speaker 1: of AI news. This week, Meta unveiled the Llama 3 72 00:04:35,600 --> 00:04:40,000 Speaker 1: AI model and showed off generative AI capabilities, including support 73 00:04:40,040 --> 00:04:43,359 Speaker 1: for eight languages and the ability to write code, with 74 00:04:43,520 --> 00:04:47,400 Speaker 1: promises of multimodal capabilities in the future. So how 75 00:04:47,400 --> 00:04:51,240 Speaker 1: does Llama 3 stack up against, say, OpenAI's GPT 76 00:04:51,400 --> 00:04:54,640 Speaker 1: models, or other AI models for that matter? That's actually 77 00:04:54,640 --> 00:04:59,760 Speaker 1: hard to say, because it's very tricky to measure AI performance. 78 00:05:00,000 --> 00:05:03,400 Speaker 1: There's not a whole lot of agreement over what constitutes 79 00:05:03,640 --> 00:05:06,440 Speaker 1: a proper benchmark for AI capabilities. There are a lot 80 00:05:06,440 --> 00:05:09,440 Speaker 1: of benchmarks that are being used, but they don't necessarily 81 00:05:09,480 --> 00:05:12,440 Speaker 1: say that one model is superior to others. It might 82 00:05:12,480 --> 00:05:15,760 Speaker 1: be that one model is better at specific sets of 83 00:05:15,880 --> 00:05:19,359 Speaker 1: tasks than others, but doesn't perform as well in other regards. 84 00:05:19,720 --> 00:05:23,760 Speaker 1: Meta's model did score smack dab between OpenAI's GPT-4o 85 00:05:23,960 --> 00:05:27,159 Speaker 1: and Claude 3.5 on a math 86 00:05:27,279 --> 00:05:31,640 Speaker 1: benchmark for word problems. Meta clearly hopes that developers will 87 00:05:31,680 --> 00:05:35,799 Speaker 1: rely on Llama 3 more than on competing models going forward. 88 00:05:35,920 --> 00:05:38,400 Speaker 1: And it sounds to me like the various AI models 89 00:05:38,440 --> 00:05:42,320 Speaker 1: out there right now perform at fairly similar levels. Like, 90 00:05:42,360 --> 00:05:45,320 Speaker 1: there is some variation, but among the big ones out there, 91 00:05:45,360 --> 00:05:47,640 Speaker 1: they tend to be kind of neck and neck. So 92 00:05:47,800 --> 00:05:49,760 Speaker 1: I think for a lot of developers it'll mostly come 93 00:05:49,800 --> 00:05:53,640 Speaker 1: down to which model works best from a financial standpoint, 94 00:05:53,880 --> 00:05:56,479 Speaker 1: not necessarily just a technical one. You know, which 95 00:05:56,520 --> 00:06:00,640 Speaker 1: one's going to be the cheapest to use. Briana Bierschbach 96 00:06:00,880 --> 00:06:04,359 Speaker 1: of the Star Tribune has a piece titled X's AI 97 00:06:04,480 --> 00:06:09,320 Speaker 1: chatbot Grok spreads misinformation about Minnesota's ballots. Does the 98 00:06:09,440 --> 00:06:13,000 Speaker 1: tech giant care? Well, I suspect the answer to the 99 00:06:13,040 --> 00:06:16,560 Speaker 1: headline is that I'm sure there are some folks at 100 00:06:16,760 --> 00:06:19,719 Speaker 1: X who very much care. But whether that's the sentiment 101 00:06:19,839 --> 00:06:23,200 Speaker 1: held by the people in charge, I can't say, though 102 00:06:23,240 --> 00:06:27,080 Speaker 1: I have my suspicions anyway.
According to the article, Grok, 103 00:06:27,440 --> 00:06:31,080 Speaker 1: the AI chatbot from X, falsely claimed that it 104 00:06:31,120 --> 00:06:35,640 Speaker 1: was too late for presidential ballots to change in nine states, 105 00:06:35,680 --> 00:06:40,200 Speaker 1: including Minnesota, that the candidates were already locked and loaded, 106 00:06:40,680 --> 00:06:42,839 Speaker 1: and that anyone who had failed to get on the 107 00:06:42,880 --> 00:06:46,360 Speaker 1: ballot already is out of luck. But quote, hey, there's 108 00:06:46,400 --> 00:06:49,800 Speaker 1: always twenty twenty eight, right, end quote. Because of course 109 00:06:49,920 --> 00:06:54,520 Speaker 1: Grok is sassy. And this is relevant, obviously, because here 110 00:06:54,520 --> 00:06:58,080 Speaker 1: in the United States, current President Joe Biden has dropped 111 00:06:58,120 --> 00:07:03,360 Speaker 1: out of the election race, and Kamala Harris, the vice president, 112 00:07:03,600 --> 00:07:08,960 Speaker 1: is currently in line to get the presidential nomination for 113 00:07:09,040 --> 00:07:13,920 Speaker 1: the Democratic Party. So clearly, if we were at a 114 00:07:14,000 --> 00:07:17,800 Speaker 1: point where it's too late to put people on the 115 00:07:17,840 --> 00:07:20,640 Speaker 1: ballot to change the presidential ballots, then that would be 116 00:07:20,800 --> 00:07:26,840 Speaker 1: a huge problem for the Democrats, because their presumptive candidate 117 00:07:26,880 --> 00:07:29,560 Speaker 1: would not be allowed to be added. However, that's not 118 00:07:29,640 --> 00:07:33,680 Speaker 1: the case. No state has passed the deadline for candidates 119 00:07:33,720 --> 00:07:37,120 Speaker 1: to be added to the ballot, not in Minnesota, not 120 00:07:37,520 --> 00:07:41,160 Speaker 1: any other state. We haven't reached that point. But now 121 00:07:41,280 --> 00:07:45,000 Speaker 1: misinformation is spreading online about this because Grok said, hey, 122 00:07:45,000 --> 00:07:49,320 Speaker 1: it's too late, so tough luck, and it's just not true. 123 00:07:50,080 --> 00:07:52,760 Speaker 1: I think most of us have a handle on how 124 00:07:52,800 --> 00:07:56,920 Speaker 1: social platforms can facilitate the distribution of misinformation. But this 125 00:07:56,960 --> 00:08:00,920 Speaker 1: is a case where X isn't just facilitating the 126 00:08:01,040 --> 00:08:05,880 Speaker 1: spread of misinformation, it is generating the actual misinformation. Like, 127 00:08:06,160 --> 00:08:09,440 Speaker 1: the platform itself, or at least the AI chatbot that's 128 00:08:09,520 --> 00:08:13,560 Speaker 1: created for that platform, is doing it. Folks at X 129 00:08:13,560 --> 00:08:16,120 Speaker 1: have pointed out that Grok comes with a disclaimer that 130 00:08:16,280 --> 00:08:19,880 Speaker 1: urges users to independently verify the information that's coming out 131 00:08:19,920 --> 00:08:21,920 Speaker 1: of the chatbot. I don't see this as a 132 00:08:21,960 --> 00:08:25,000 Speaker 1: get-out-of-jail-free card. Actually, I see this as a 133 00:08:25,160 --> 00:08:30,760 Speaker 1: condemnation of AI, specifically generative AI, because if you know 134 00:08:31,360 --> 00:08:35,319 Speaker 1: that your tool is prone to, or at least capable 135 00:08:35,440 --> 00:08:40,440 Speaker 1: of, generating falsehoods, maybe don't distribute the tool to customers.
Like, 136 00:08:40,520 --> 00:08:44,200 Speaker 1: what is the point of having an AI chatbot if 137 00:08:44,240 --> 00:08:48,760 Speaker 1: the user can't trust what the chatbot says? Why have 138 00:08:48,840 --> 00:08:52,680 Speaker 1: a chatbot at all unless the chatbot's reliable? If you 139 00:08:52,840 --> 00:08:55,840 Speaker 1: have to look up everything the chatbot tells you anyway, 140 00:08:56,040 --> 00:08:59,120 Speaker 1: you might as well skip having the chatbot, just go 141 00:08:59,200 --> 00:09:04,320 Speaker 1: straight to researching the question, right? Because otherwise, what are 142 00:09:04,360 --> 00:09:08,240 Speaker 1: you doing? It seems in line, however, with Musk's approach 143 00:09:08,800 --> 00:09:11,679 Speaker 1: in general. It reminds me of how Tesla has both 144 00:09:11,760 --> 00:09:16,720 Speaker 1: Autopilot and Full Self-Driving products in Tesla cars. Both 145 00:09:16,760 --> 00:09:19,080 Speaker 1: of those, I would argue, give a false sense of 146 00:09:19,080 --> 00:09:22,079 Speaker 1: what they're capable of based upon their names. Like I've 147 00:09:22,080 --> 00:09:25,160 Speaker 1: often said, Full Self-Driving, by the way, isn't 148 00:09:25,200 --> 00:09:28,920 Speaker 1: full self-driving. That's my opinion. But I feel like 149 00:09:28,960 --> 00:09:31,120 Speaker 1: it is in line with that. And I'm sure this 150 00:09:31,240 --> 00:09:33,079 Speaker 1: is just the tip of the iceberg as far as 151 00:09:33,120 --> 00:09:36,040 Speaker 1: how AI is playing a part in the dissemination of 152 00:09:36,640 --> 00:09:41,520 Speaker 1: information and misinformation during the election cycle. Okay, I've got 153 00:09:41,520 --> 00:09:43,920 Speaker 1: a bunch more stories to go through. Before we start 154 00:09:44,120 --> 00:09:56,920 Speaker 1: into those, let's take a quick break. We're back. So, 155 00:09:57,280 --> 00:10:00,160 Speaker 1: in a rare event in US politics this week, 156 00:10:00,280 --> 00:10:03,960 Speaker 1: the Senate unanimously passed a federal bill that would let 157 00:10:04,080 --> 00:10:08,079 Speaker 1: victims of non-consensual deepfakes of a sexually explicit 158 00:10:08,160 --> 00:10:13,400 Speaker 1: nature sue the people who create and traffic in such material. 159 00:10:13,880 --> 00:10:17,560 Speaker 1: This can include images, video, audio, et cetera. So deepfakes 160 00:10:17,600 --> 00:10:21,079 Speaker 1: have been a problem for a while now, and 161 00:10:21,200 --> 00:10:25,600 Speaker 1: it's a problem that disproportionately victimizes women. So this bill 162 00:10:25,760 --> 00:10:29,480 Speaker 1: would make it possible for victims of deepfakes, sexually 163 00:10:29,480 --> 00:10:33,240 Speaker 1: explicit deepfakes, to seek civil penalties against those who 164 00:10:33,280 --> 00:10:38,000 Speaker 1: are making or sharing or receiving those deepfakes. The 165 00:10:38,080 --> 00:10:40,800 Speaker 1: bill will still need to pass the House of Representatives 166 00:10:40,800 --> 00:10:43,240 Speaker 1: before it can move on to the President's desk to 167 00:10:43,280 --> 00:10:45,800 Speaker 1: get signed into law, and as of right now, that 168 00:10:45,840 --> 00:10:49,520 Speaker 1: bill is in committee at the House of Representatives. Alexandria 169 00:10:49,679 --> 00:10:54,640 Speaker 1: Ocasio-Cortez, AOC.
She's championing this and has made some 170 00:10:54,880 --> 00:10:58,760 Speaker 1: very passionate arguments as to why the bill is needed, 171 00:10:59,120 --> 00:11:03,559 Speaker 1: because as of right now, there are very few avenues 172 00:11:03,600 --> 00:11:07,079 Speaker 1: that victims of deepfakes can pursue to get justice 173 00:11:07,559 --> 00:11:11,480 Speaker 1: or any kind of action on their behalf. And that 174 00:11:11,640 --> 00:11:14,800 Speaker 1: is pretty horrifying, because, I mean, this is like a 175 00:11:14,880 --> 00:11:20,000 Speaker 1: consent issue, and deepfakes can cause not just incredible 176 00:11:20,040 --> 00:11:23,600 Speaker 1: emotional and psychological trauma, which they can. That alone is 177 00:11:23,800 --> 00:11:27,640 Speaker 1: enough to require some sort of measure against them, 178 00:11:27,880 --> 00:11:30,960 Speaker 1: but they can also impact a person's ability to make 179 00:11:31,000 --> 00:11:33,320 Speaker 1: a living through no fault of their own. Like, the 180 00:11:33,360 --> 00:11:39,040 Speaker 1: person literally hasn't done anything, but an AI-generated image 181 00:11:39,120 --> 00:11:44,160 Speaker 1: or video has given them a terrible reputation, and there 182 00:11:44,240 --> 00:11:48,400 Speaker 1: has to be a system in place to seek justice 183 00:11:48,440 --> 00:11:50,280 Speaker 1: for this. So I feel this is a very good 184 00:11:50,360 --> 00:11:54,960 Speaker 1: step toward addressing a growing and disturbing problem. Moving on 185 00:11:55,040 --> 00:11:59,640 Speaker 1: to other issues with AI. SAG-AFTRA, that's a union 186 00:11:59,679 --> 00:12:06,480 Speaker 1: that typically represents actors and other performers in film, television, 187 00:12:06,760 --> 00:12:12,920 Speaker 1: and video games. Well, the video game group of performers 188 00:12:13,200 --> 00:12:17,240 Speaker 1: are now going on strike, so this includes voice performers 189 00:12:17,360 --> 00:12:20,760 Speaker 1: as well as motion capture performers, and the reason they're 190 00:12:20,800 --> 00:12:24,199 Speaker 1: going on strike is over the fact that the union 191 00:12:24,240 --> 00:12:27,840 Speaker 1: has been unable to reach a satisfying agreement during contract 192 00:12:27,880 --> 00:12:30,880 Speaker 1: negotiations with various video game companies. And these are like 193 00:12:30,920 --> 00:12:35,480 Speaker 1: the big video game companies, so the contract agreements span 194 00:12:35,679 --> 00:12:38,720 Speaker 1: a lot of territory. It's not just one thing, but 195 00:12:38,960 --> 00:12:42,600 Speaker 1: one important element within all of that is over the 196 00:12:42,720 --> 00:12:48,000 Speaker 1: role of AI, and specifically AI's ability to replicate voices 197 00:12:48,200 --> 00:12:51,320 Speaker 1: and likenesses. And that makes a lot of sense to me, that 198 00:12:51,400 --> 00:12:53,920 Speaker 1: this would be a big sticking point, because if I 199 00:12:54,040 --> 00:12:57,920 Speaker 1: were hired for a gig because of my voice or 200 00:12:57,960 --> 00:13:01,760 Speaker 1: my likeness, I don't want AI to copy me, because 201 00:13:01,800 --> 00:13:05,240 Speaker 1: then that copy could be used in perpetuity without my 202 00:13:05,480 --> 00:13:09,520 Speaker 1: participation or compensation.
I would be taking one gig and 203 00:13:09,679 --> 00:13:13,640 Speaker 1: essentially putting myself out of a job forever, because that 204 00:13:13,760 --> 00:13:17,080 Speaker 1: copy could get passed around, and who knows what stuff 205 00:13:17,120 --> 00:13:21,080 Speaker 1: my AI mimic might be incorporated into. I could find 206 00:13:21,120 --> 00:13:23,840 Speaker 1: a copy of myself in projects that I never would 207 00:13:23,880 --> 00:13:27,200 Speaker 1: have agreed to work on. That's not really cool, right? 208 00:13:27,480 --> 00:13:31,040 Speaker 1: Like, if I were hired to voice something that I 209 00:13:31,080 --> 00:13:32,560 Speaker 1: really believed in, I was like, oh, this is a 210 00:13:32,559 --> 00:13:36,720 Speaker 1: really cool story, and I believe in the developer, and 211 00:13:36,760 --> 00:13:38,319 Speaker 1: I believe in the story they're trying to tell, I 212 00:13:38,360 --> 00:13:40,160 Speaker 1: would love to be part of this (by the way, 213 00:13:40,160 --> 00:13:42,040 Speaker 1: I would love to be part of stuff like that; 214 00:13:42,040 --> 00:13:44,760 Speaker 1: that would be a lot of fun), then, yeah, that's great. 215 00:13:44,880 --> 00:13:47,840 Speaker 1: But what if they then used my voice to generate 216 00:13:47,920 --> 00:13:50,240 Speaker 1: a copy of it and put it into something that 217 00:13:50,400 --> 00:13:55,120 Speaker 1: I would find morally objectionable or that goes against my 218 00:13:55,160 --> 00:14:00,680 Speaker 1: own personal beliefs? That's a real problem. Apparently, these AI 219 00:14:00,960 --> 00:14:04,120 Speaker 1: provisions in the agreements have been contentious, with video game 220 00:14:04,160 --> 00:14:06,800 Speaker 1: companies reluctant to promise that they're not going to use 221 00:14:06,840 --> 00:14:11,640 Speaker 1: AI tools that could lead to the detriment of union members. Interestingly, 222 00:14:11,880 --> 00:14:14,960 Speaker 1: striking members are still allowed to participate in appearances at 223 00:14:15,000 --> 00:14:18,240 Speaker 1: San Diego Comic-Con, which started earlier this week and 224 00:14:18,360 --> 00:14:20,520 Speaker 1: goes through the weekend, and they can do so without 225 00:14:20,600 --> 00:14:24,920 Speaker 1: penalty, because usually a striking SAG-AFTRA member isn't supposed 226 00:14:24,960 --> 00:14:28,160 Speaker 1: to promote any work that is covered by the strike, 227 00:14:28,400 --> 00:14:31,000 Speaker 1: but in this case, the proximity of the strike to 228 00:14:31,520 --> 00:14:35,240 Speaker 1: the Comic-Con event called for an exception, so folks 229 00:14:35,280 --> 00:14:37,920 Speaker 1: who were scheduled to appear can still do so. They 230 00:14:37,960 --> 00:14:40,680 Speaker 1: can still talk about the projects they were hired to do, 231 00:14:40,960 --> 00:14:43,600 Speaker 1: and that's really good news for fans. Potentially, it's also 232 00:14:43,680 --> 00:14:46,240 Speaker 1: good news for union members, who may be able to 233 00:14:46,280 --> 00:14:50,360 Speaker 1: take the opportunity to speak about their concerns regarding AI 234 00:14:50,560 --> 00:14:53,720 Speaker 1: and other aspects of the dispute. At the top of 235 00:14:53,760 --> 00:14:57,240 Speaker 1: this episode, I talked about how Meta's Llama 3 performed 236 00:14:57,240 --> 00:15:01,960 Speaker 1: well in certain benchmarks against other AI models. Well,
Google's 237 00:15:02,120 --> 00:15:06,120 Speaker 1: DeepMind division similarly unveiled a couple of AI systems, 238 00:15:06,160 --> 00:15:10,120 Speaker 1: one called AlphaProof and another called AlphaGeometry 2, 239 00:15:10,560 --> 00:15:14,920 Speaker 1: that performed really well on a very tough math test. 240 00:15:15,200 --> 00:15:18,160 Speaker 1: According to Google, the AI models were able to achieve 241 00:15:18,200 --> 00:15:22,000 Speaker 1: the equivalent of a silver medal performance by solving problems 242 00:15:22,080 --> 00:15:26,520 Speaker 1: that were part of the most recent International Mathematical Olympiad. Benj 243 00:15:26,760 --> 00:15:29,280 Speaker 1: Edwards of Ars Technica has a great piece about this. 244 00:15:29,480 --> 00:15:34,120 Speaker 1: It's titled Google claims math breakthrough with proof-solving AI models. 245 00:15:34,360 --> 00:15:36,360 Speaker 1: Benj also does a really good job of pointing out 246 00:15:36,440 --> 00:15:41,960 Speaker 1: that measuring AI capabilities is tricky. It is not always straightforward. 247 00:15:41,960 --> 00:15:46,240 Speaker 1: In fact, it rarely is straightforward. The achievement does mark 248 00:15:46,280 --> 00:15:50,240 Speaker 1: advancements in AI's ability to parse and then solve complex 249 00:15:50,360 --> 00:15:53,840 Speaker 1: mathematical problems, and that is pretty cool. But Benj 250 00:15:53,920 --> 00:15:59,160 Speaker 1: Edwards also quotes Sir Timothy Gowers about this. Sir Timothy 251 00:15:59,200 --> 00:16:02,640 Speaker 1: Gowers took to X and wrote, quote, the program needed 252 00:16:02,880 --> 00:16:05,920 Speaker 1: a lot longer than the human competitors, for some of the 253 00:16:05,960 --> 00:16:10,640 Speaker 1: problems over sixty hours, end quote, and that if human 254 00:16:10,680 --> 00:16:14,000 Speaker 1: competitors had been given the same luxury of time, they 255 00:16:14,040 --> 00:16:17,760 Speaker 1: probably would have scored higher in this test as well. 256 00:16:18,000 --> 00:16:20,840 Speaker 1: So what Gowers is saying is this is not an 257 00:16:20,880 --> 00:16:24,560 Speaker 1: apples-to-apples comparison. You can't say, oh, this AI 258 00:16:24,720 --> 00:16:28,680 Speaker 1: program is x times smarter than the average person when 259 00:16:28,720 --> 00:16:31,240 Speaker 1: it comes to math, because you have to take into 260 00:16:31,320 --> 00:16:34,680 Speaker 1: account all the different parameters and say, well, if the 261 00:16:34,880 --> 00:16:37,040 Speaker 1: people who are really good at math were given the 262 00:16:37,040 --> 00:16:39,800 Speaker 1: same amount of time to solve a problem, they might 263 00:16:39,840 --> 00:16:42,440 Speaker 1: have done it as well or better than the AI 264 00:16:42,560 --> 00:16:45,840 Speaker 1: model did. Gowers also pointed out that humans had to 265 00:16:45,840 --> 00:16:49,000 Speaker 1: convert these problems into a language that the AI models 266 00:16:49,000 --> 00:16:51,560 Speaker 1: could understand before they could tackle the problem in the 267 00:16:51,560 --> 00:16:55,160 Speaker 1: first place.
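For a flavor of what that conversion involves: DeepMind has said AlphaProof works with the formal proof language Lean, so an English problem statement has to be restated as a machine-checkable theorem before the system can even start. Here is a deliberately trivial illustration of my own, nowhere near Olympiad difficulty, of the informal claim "the sum of two even numbers is even" rewritten in Lean 4:

```lean
-- Toy formalization, far simpler than any Olympiad problem: the informal
-- claim "the sum of two even numbers is even" as a machine-checkable theorem.
theorem even_add_even (a b : Nat)
    (ha : ∃ k, a = 2 * k) (hb : ∃ m, b = 2 * m) :
    ∃ n, a + b = 2 * n :=
  match ha, hb with
  -- Witness: if a = 2 * k and b = 2 * m, then a + b = 2 * (k + m).
  | ⟨k, hk⟩, ⟨m, hm⟩ => ⟨k + m, by omega⟩
```

Every definition and hypothesis has to be spelled out with that level of precision, which is exactly the translation work Gowers notes was done on the models' behalf.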
So again, the AI models weren't presented with 268 00:16:55,240 --> 00:16:59,640 Speaker 1: the exact same problem, or the problem wasn't framed the 269 00:16:59,680 --> 00:17:03,080 Speaker 1: exact same way for the AI versus the human competitors 270 00:17:03,120 --> 00:17:07,000 Speaker 1: who actually participated in the Olympiad, and that if the 271 00:17:07,119 --> 00:17:10,640 Speaker 1: AI had had to parse the problems the way that 272 00:17:10,800 --> 00:17:15,400 Speaker 1: humans did, maybe the AI wouldn't have performed as well. 273 00:17:15,520 --> 00:17:21,160 Speaker 1: So yeah, it's complicated. Hopping over to OpenAI, because yeah, 274 00:17:21,200 --> 00:17:24,280 Speaker 1: we're not done with AI yet. OpenAI announced this 275 00:17:24,359 --> 00:17:28,400 Speaker 1: week that a prototype version of its SearchGPT product 276 00:17:28,840 --> 00:17:32,639 Speaker 1: is now ready to go. This is OpenAI's artificial 277 00:17:32,680 --> 00:17:36,840 Speaker 1: intelligence-powered search tool. It is a clear shot across 278 00:17:37,000 --> 00:17:40,120 Speaker 1: Google's bow, and according to OpenAI, the tool will 279 00:17:40,160 --> 00:17:44,240 Speaker 1: provide quote fast and timely answers with clear and relevant 280 00:17:44,280 --> 00:17:47,280 Speaker 1: sources end quote. I have signed up to join the 281 00:17:47,280 --> 00:17:50,080 Speaker 1: wait list to try this out, but as of this recording, 282 00:17:50,119 --> 00:17:52,800 Speaker 1: I have not yet personally been able to use this tool. 283 00:17:53,080 --> 00:17:55,880 Speaker 1: I do have concerns similar to what I was saying earlier. 284 00:17:56,600 --> 00:17:59,080 Speaker 1: I have these concerns with pretty much all AI-enabled 285 00:17:59,119 --> 00:18:01,760 Speaker 1: tools like this. And one of those concerns is 286 00:18:01,760 --> 00:18:05,280 Speaker 1: that I worry AI-enhanced search will mean fewer people 287 00:18:05,320 --> 00:18:08,679 Speaker 1: will actually navigate to what the real sources of the 288 00:18:08,760 --> 00:18:12,400 Speaker 1: information were. They'll just be satisfied with the AI-summarized 289 00:18:12,760 --> 00:18:17,520 Speaker 1: or synthesized results, and that's it. Which means the pages 290 00:18:17,600 --> 00:18:20,199 Speaker 1: that the AI is drawing from, the source pages, are 291 00:18:20,200 --> 00:18:23,480 Speaker 1: going to get less traffic, and that means less revenue, 292 00:18:23,760 --> 00:18:27,760 Speaker 1: which ultimately means the businesses that actually operate these pages 293 00:18:27,800 --> 00:18:30,880 Speaker 1: could find themselves strapped for cash. And then what happens, 294 00:18:31,119 --> 00:18:34,080 Speaker 1: you know, when the underlying infrastructure that the AI is 295 00:18:34,119 --> 00:18:38,680 Speaker 1: dependent upon goes rotten? What happens then? Plus there's still 296 00:18:38,720 --> 00:18:42,840 Speaker 1: the worry of mistakes, you know, like confabulations slash hallucinations 297 00:18:42,880 --> 00:18:46,320 Speaker 1: with generative AI. Google's version of this same sort of 298 00:18:46,400 --> 00:18:50,720 Speaker 1: tool famously suggested that people include non-toxic glue in 299 00:18:50,800 --> 00:18:54,120 Speaker 1: their pizza ingredients in order to prevent cheese from sliding 300 00:18:54,160 --> 00:18:58,239 Speaker 1: off their pizza. Clearly that is not good advice.
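To be fair, the "answers with clear and relevant sources" idea has a well-known shape, often called retrieval-augmented generation: fetch relevant pages first, then have the model answer only from what was fetched, with the URLs attached. I have no visibility into how SearchGPT is actually built, so take this as a deliberately simplified sketch of that contract, with every name hypothetical:

```python
# Simplified retrieval-augmented answering contract. All names here are
# hypothetical scaffolding, not OpenAI's implementation.
from dataclasses import dataclass

@dataclass
class Snippet:
    url: str
    text: str

@dataclass
class Answer:
    text: str
    sources: list[Snippet]

def answer_from_sources(question: str, snippets: list[Snippet]) -> Answer:
    # A real system would call a language model here, constrained to the
    # retrieved snippets. The design point is the return type: an answer is
    # only valid if it carries the sources it was built from, so a skeptical
    # user can click through and check.
    if not snippets:
        raise ValueError("refusing to answer without sources")
    summary = " ".join(s.text for s in snippets[:2])  # stand-in for the model
    return Answer(text=summary, sources=snippets)

ans = answer_from_sources(
    "Why was the Falcon 9 grounded?",
    [Snippet("https://example.com/faa", "The FAA grounded Falcon 9 pending an investigation.")],
)
print(ans.text)
print("Sources:", [s.url for s in ans.sources])
```

Of course, attaching sources only helps if people actually click them, which is exactly the worry.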
So 301 00:18:58,520 --> 00:19:02,520 Speaker 1: how reliable will SearchGPT be, and how will folks know 302 00:19:03,000 --> 00:19:05,440 Speaker 1: if the answers they get are actually good? I mean, 303 00:19:05,480 --> 00:19:07,320 Speaker 1: I guess one way you could check is you could 304 00:19:07,359 --> 00:19:10,439 Speaker 1: click through to the sources. But then again, why do 305 00:19:10,480 --> 00:19:13,760 Speaker 1: you even need AI search at that point? Like, if 306 00:19:13,760 --> 00:19:16,159 Speaker 1: you already have to double check to make sure the 307 00:19:16,200 --> 00:19:19,800 Speaker 1: answer that AI gave you is relevant and accurate, what 308 00:19:19,960 --> 00:19:22,639 Speaker 1: good is the AI? If you have to double check, 309 00:19:22,680 --> 00:19:24,919 Speaker 1: then you might as well just get rid of the 310 00:19:24,960 --> 00:19:27,720 Speaker 1: AI and go straight to the source in the first place. 311 00:19:28,040 --> 00:19:30,320 Speaker 1: It's just like I was saying before, AI is just 312 00:19:30,400 --> 00:19:35,000 Speaker 1: making things take an extra step. It's like turning to 313 00:19:35,040 --> 00:19:37,120 Speaker 1: a random person next to you and saying, like, how 314 00:19:37,160 --> 00:19:39,480 Speaker 1: do you repair a car? Like, you have no idea 315 00:19:39,760 --> 00:19:44,240 Speaker 1: whether they have any experience or knowledge in that realm, and 316 00:19:44,280 --> 00:19:47,200 Speaker 1: if they give you advice, you don't know if it's trustworthy. 317 00:19:47,240 --> 00:19:50,840 Speaker 1: You have to look it up. Well, whatever. OpenAI 318 00:19:50,920 --> 00:19:54,119 Speaker 1: made this announcement, and after that, Alphabet shares fell around 319 00:19:54,160 --> 00:19:58,320 Speaker 1: three percent. So I guess these questions are ones 320 00:19:58,359 --> 00:20:02,640 Speaker 1: that don't necessarily run through the minds of investors. I'm 321 00:20:02,720 --> 00:20:08,080 Speaker 1: just gonna progressively go bonkers over here as the world 322 00:20:08,240 --> 00:20:11,920 Speaker 1: continues to embrace AI and I keep asking, but wait, 323 00:20:12,280 --> 00:20:16,600 Speaker 1: I have questions. I also have more news. But before 324 00:20:16,640 --> 00:20:28,320 Speaker 1: we get to that, let's take another quick break. So 325 00:20:28,520 --> 00:20:31,680 Speaker 1: before the break, I was talking about Google, and speaking 326 00:20:31,720 --> 00:20:35,080 Speaker 1: of Alphabet slash Google, one of the other companies in 327 00:20:35,200 --> 00:20:39,040 Speaker 1: that group of companies under Alphabet is YouTube, and it 328 00:20:39,080 --> 00:20:43,040 Speaker 1: could be facing some pretty tough issues in Russia. So 329 00:20:43,600 --> 00:20:46,840 Speaker 1: the Russian government has indicated that its agencies will 330 00:20:47,080 --> 00:20:51,199 Speaker 1: throttle YouTube traffic by forty percent on 331 00:20:51,240 --> 00:20:54,479 Speaker 1: desktop computers this week. So slowing down YouTube traffic by 332 00:20:54,520 --> 00:20:57,679 Speaker 1: forty percent, and that could increase to a seventy percent 333 00:20:57,720 --> 00:21:01,200 Speaker 1: slowdown by next week. And you might ask why. Why 334 00:21:01,280 --> 00:21:06,360 Speaker 1: is the Russian government throttling traffic from YouTube in Russia?
Well, 335 00:21:06,400 --> 00:21:09,960 Speaker 1: it's because YouTube previously placed a block on some channels 336 00:21:10,040 --> 00:21:14,240 Speaker 1: that were carrying Russian state media. Russian state media has 337 00:21:15,040 --> 00:21:17,520 Speaker 1: a bit of a reputation for having, let us say, 338 00:21:17,600 --> 00:21:22,359 Speaker 1: a biased view of the news, and it often 339 00:21:22,440 --> 00:21:25,359 Speaker 1: is just really seen as a mouthpiece for the Russian 340 00:21:25,400 --> 00:21:29,640 Speaker 1: government itself, and thus is really a way of distributing propaganda. 341 00:21:29,880 --> 00:21:34,760 Speaker 1: So presumably what has happened is YouTube determined that some 342 00:21:35,040 --> 00:21:39,520 Speaker 1: Russian state media YouTube channels were in violation of YouTube's 343 00:21:39,520 --> 00:21:43,280 Speaker 1: policies and so blocked them. And the Russian government's response 344 00:21:43,400 --> 00:21:47,200 Speaker 1: is, we're going to make all traffic to YouTube super 345 00:21:47,280 --> 00:21:51,640 Speaker 1: duper slow, so that Russian users will get really frustrated 346 00:21:51,680 --> 00:21:56,200 Speaker 1: about this, unless you reinstate those channels. So essentially this 347 00:21:56,240 --> 00:21:59,240 Speaker 1: is a threat. It's a standoff between YouTube and the 348 00:21:59,320 --> 00:22:03,160 Speaker 1: Russian government. And I did not see any responses from 349 00:22:03,280 --> 00:22:06,960 Speaker 1: YouTube about this as of the time I'm recording this episode. 350 00:22:07,000 --> 00:22:08,879 Speaker 1: That may be different by the time you hear it, 351 00:22:09,280 --> 00:22:13,280 Speaker 1: but yeah, right now, it is like a standoff, which 352 00:22:13,320 --> 00:22:16,280 Speaker 1: is a big old yikes from me. Now, if you 353 00:22:16,400 --> 00:22:19,480 Speaker 1: don't use Google, but you are used to seeing search 354 00:22:19,480 --> 00:22:23,880 Speaker 1: results include stuff from Reddit threads, that could be changing. 355 00:22:24,320 --> 00:22:27,960 Speaker 1: So Reddit has a content policy that forbids sites from 356 00:22:28,040 --> 00:22:32,480 Speaker 1: crawling Reddit forums without first agreeing to follow Reddit's rules, 357 00:22:32,720 --> 00:22:35,080 Speaker 1: and those rules cover a lot of ground. One of 358 00:22:35,119 --> 00:22:38,560 Speaker 1: the big rules that Reddit has is that companies that 359 00:22:38,640 --> 00:22:42,240 Speaker 1: want to crawl Reddit for the purposes of gathering data 360 00:22:42,280 --> 00:22:45,760 Speaker 1: to train AI models have to pay for that privilege. 361 00:22:45,880 --> 00:22:48,119 Speaker 1: That is, you've got to fork over some cash if you're 362 00:22:48,119 --> 00:22:50,879 Speaker 1: going to use Reddit to train up your AI. This 363 00:22:50,920 --> 00:22:53,240 Speaker 1: is something that a lot of Reddit users are actually 364 00:22:53,240 --> 00:22:56,520 Speaker 1: salty about, because, you know, a lot of people don't 365 00:22:56,560 --> 00:22:59,199 Speaker 1: want their stuff to be used to train AI in 366 00:22:59,240 --> 00:23:01,359 Speaker 1: the first place. They don't have any say in the matter; 367 00:23:01,640 --> 00:23:04,800 Speaker 1: Reddit gets to make that call, they don't. And 368 00:23:05,080 --> 00:23:07,600 Speaker 1: some users are thinking that's not really fair, that they 369 00:23:07,600 --> 00:23:10,560 Speaker 1: should have more control over the content they create.
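Mechanically, by the way, this kind of crawl blocking is mundane: a site publishes its rules in a robots.txt file, and well-behaved crawlers are expected to check those rules before fetching anything. Here's a quick sketch using only Python's standard library; the second user agent name is made up for illustration:

```python
# Polite-crawler handshake using only the standard library. The second
# user agent string is invented for illustration.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.reddit.com/robots.txt")
rp.read()  # fetch and parse the site's published crawl rules

for agent in ("Googlebot", "SomeOtherSearchBot"):
    allowed = rp.can_fetch(agent, "https://www.reddit.com/r/technology/")
    print(agent, "may crawl" if allowed else "is blocked")
# A crawler that is blocked here can't index threads, and threads that
# aren't indexed never show up in that engine's search results.
```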
As for that control, Reddit 370 00:23:10,600 --> 00:23:13,239 Speaker 1: says no: once you post it here, it's ours. Like, 371 00:23:13,280 --> 00:23:16,280 Speaker 1: it was yours, then you hit post, and now it 372 00:23:16,280 --> 00:23:19,760 Speaker 1: belongs to us. And users also are kind of salty 373 00:23:19,880 --> 00:23:23,840 Speaker 1: because the compensation is going to Reddit; it doesn't go to 374 00:23:23,880 --> 00:23:26,600 Speaker 1: the people who are generating the content in the first place. Right? Like, 375 00:23:26,640 --> 00:23:30,240 Speaker 1: these AI models are using Reddit content to train themselves, 376 00:23:30,400 --> 00:23:33,880 Speaker 1: but the people who are making the content, they don't 377 00:23:33,880 --> 00:23:36,320 Speaker 1: get any of the money. The money goes to Reddit. 378 00:23:36,720 --> 00:23:38,720 Speaker 1: So it's going to the platform, but not the people 379 00:23:38,720 --> 00:23:41,520 Speaker 1: who are creating the actual content that's training AI. So yeah, 380 00:23:41,560 --> 00:23:43,719 Speaker 1: there are a couple of reasons why Reddit users are 381 00:23:43,760 --> 00:23:47,080 Speaker 1: not happy about this. But anyway, this means that any 382 00:23:47,119 --> 00:23:51,679 Speaker 1: company that has not agreed to these rules that 383 00:23:51,720 --> 00:23:55,640 Speaker 1: Reddit has posted, they are not allowed to crawl the site. 384 00:23:55,960 --> 00:24:00,199 Speaker 1: Without crawling the site, you can't index it, and specific 385 00:24:00,240 --> 00:24:03,359 Speaker 1: threads will not pop up in search results if that 386 00:24:03,600 --> 00:24:07,520 Speaker 1: is the case. Apparently, search engines not named Google have 387 00:24:08,080 --> 00:24:12,399 Speaker 1: largely not yet forged these kinds of agreements with Reddit. 388 00:24:12,960 --> 00:24:16,320 Speaker 1: So the result is that if you search for topics 389 00:24:16,320 --> 00:24:20,720 Speaker 1: that are covered on Reddit forums or Reddit threads, you 390 00:24:20,760 --> 00:24:24,600 Speaker 1: will get fewer results on non-Google search engines. Some 391 00:24:24,640 --> 00:24:28,600 Speaker 1: of them will return some Reddit results, but others won't. 392 00:24:29,000 --> 00:24:32,240 Speaker 1: For a full breakdown on this issue, including some really 393 00:24:32,280 --> 00:24:35,520 Speaker 1: great insight into the search engine landscape in general, I 394 00:24:35,600 --> 00:24:41,040 Speaker 1: recommend reading Scharon Harding's excellent piece in Ars Technica, 395 00:24:41,119 --> 00:24:45,800 Speaker 1: titled Non-Google search engines blocked from showing recent Reddit results. 396 00:24:46,359 --> 00:24:50,840 Speaker 1: Sheila Chiang of CNBC reports that Apple is no longer among 397 00:24:50,920 --> 00:24:55,440 Speaker 1: the top five smartphone brands in China. Rather, Chinese companies, 398 00:24:55,600 --> 00:24:58,720 Speaker 1: domestic companies, are taking up the top five spots, and 399 00:24:58,800 --> 00:25:02,040 Speaker 1: Apple has fallen to hold fourteen percent of the market 400 00:25:02,119 --> 00:25:04,639 Speaker 1: share in the country. The Chinese market has been a 401 00:25:04,680 --> 00:25:08,480 Speaker 1: really important one for Apple. The company famously courted Chinese 402 00:25:08,480 --> 00:25:12,080 Speaker 1: officials and business representatives while trying to establish an inroad 403 00:25:12,119 --> 00:25:15,200 Speaker 1: into Chinese markets for the iPhone.
Ironically, the iPhone was 404 00:25:15,240 --> 00:25:18,800 Speaker 1: being manufactured in China for two years, but not sold 405 00:25:19,000 --> 00:25:23,440 Speaker 1: in Chinese stores, at least not legally sold in China, 406 00:25:23,480 --> 00:25:26,680 Speaker 1: for those two years. Earlier this year, Apple CEO Tim 407 00:25:26,760 --> 00:25:31,359 Speaker 1: Cook declared China an extremely competitive market in the smartphone space, 408 00:25:31,400 --> 00:25:34,600 Speaker 1: which, I feel like, is coded language. The country's 409 00:25:34,600 --> 00:25:39,600 Speaker 1: government has frequently favored domestic companies over products coming from 410 00:25:39,680 --> 00:25:43,000 Speaker 1: Western companies. I've talked several times about how tech companies 411 00:25:43,280 --> 00:25:45,800 Speaker 1: from the West have viewed China as a potential gold 412 00:25:45,880 --> 00:25:49,200 Speaker 1: mine, because the country does have a truly enormous population. 413 00:25:49,520 --> 00:25:51,800 Speaker 1: But I personally think the price of doing business in 414 00:25:51,880 --> 00:25:55,919 Speaker 1: China is really high. It's high enough to make it 415 00:25:56,000 --> 00:25:59,040 Speaker 1: hard to say whether or not it's worth the effort. Also, 416 00:25:59,359 --> 00:26:02,959 Speaker 1: it frequently means that you are implicated or complicit with 417 00:26:03,080 --> 00:26:07,800 Speaker 1: ethically questionable state programs, and that's putting it lightly. 418 00:26:08,280 --> 00:26:11,120 Speaker 1: In what could have been a nightmare scenario, a security 419 00:26:11,200 --> 00:26:15,359 Speaker 1: vendor called KnowBe4, that's all one word, Know, 420 00:26:15,680 --> 00:26:19,680 Speaker 1: Be, and then the numeral four, hired a software engineer 421 00:26:19,720 --> 00:26:22,439 Speaker 1: to fill a job in the company's AI division. The 422 00:26:22,520 --> 00:26:26,600 Speaker 1: new hire purported to be a US-based software engineer, 423 00:26:26,720 --> 00:26:30,399 Speaker 1: but in fact was a hacker from North Korea. And 424 00:26:30,480 --> 00:26:33,719 Speaker 1: this hacker had stolen some US credentials and used a 425 00:26:33,760 --> 00:26:37,120 Speaker 1: stock photo and then altered it with AI to make 426 00:26:37,359 --> 00:26:40,000 Speaker 1: a fake headshot in order to fool the company. They 427 00:26:40,040 --> 00:26:44,200 Speaker 1: apparently looked enough like that headshot to fool hiring managers, 428 00:26:44,240 --> 00:26:49,680 Speaker 1: because they then went on multiple video conferencing calls as 429 00:26:49,720 --> 00:26:52,399 Speaker 1: part of the hiring process. But yeah, this hacker managed 430 00:26:52,440 --> 00:26:56,200 Speaker 1: to get through the entire interview process, multiple rounds, even 431 00:26:56,280 --> 00:26:59,760 Speaker 1: got through a background check, and according to a company representative, 432 00:27:00,040 --> 00:27:03,639 Speaker 1: quote, we sent them their Mac workstation and the moment 433 00:27:03,760 --> 00:27:08,120 Speaker 1: it was received, it immediately started to load malware, end quote. 434 00:27:08,359 --> 00:27:12,920 Speaker 1: Yikes.
Fortunately, the company's security software caught on right 435 00:27:12,960 --> 00:27:17,120 Speaker 1: away, and in short order, security kind of sequestered 436 00:27:17,119 --> 00:27:22,119 Speaker 1: the hire's computer so that they could not cause 437 00:27:22,240 --> 00:27:25,359 Speaker 1: further mischief, you know, to company systems and stuff 438 00:27:25,400 --> 00:27:30,399 Speaker 1: like that. They think that it's quite possible this hacker 439 00:27:30,640 --> 00:27:34,920 Speaker 1: was a state-backed operative, that North Korea was actually, 440 00:27:34,960 --> 00:27:38,119 Speaker 1: you know, sponsoring hackers like this to try 441 00:27:38,200 --> 00:27:43,560 Speaker 1: and infiltrate various companies, like security companies, and then cause 442 00:27:43,680 --> 00:27:47,199 Speaker 1: problems from there. There are few public details available about this, 443 00:27:47,320 --> 00:27:51,359 Speaker 1: largely because KnowBe4 says there's an ongoing FBI investigation into 444 00:27:51,359 --> 00:27:54,040 Speaker 1: the matter. But yeah, it sounds like hiring managers might 445 00:27:54,119 --> 00:27:56,520 Speaker 1: need to be on the lookout for potential threat agents 446 00:27:56,600 --> 00:28:00,920 Speaker 1: during the hiring process. Fun times. Around five hundred Microsoft 447 00:28:01,000 --> 00:28:05,119 Speaker 1: employees working on the venerable World of Warcraft title have 448 00:28:05,280 --> 00:28:08,680 Speaker 1: voted to join the Horde. By that, I mean they've 449 00:28:08,760 --> 00:28:13,200 Speaker 1: voted in support of unionization. Microsoft has pledged to remain 450 00:28:13,280 --> 00:28:17,000 Speaker 1: neutral in the proceedings. That was actually part of Microsoft's 451 00:28:17,000 --> 00:28:20,840 Speaker 1: concessions during the long process in which the company was 452 00:28:21,040 --> 00:28:25,919 Speaker 1: attempting to acquire Activision Blizzard, which it ultimately did. Activision Blizzard, 453 00:28:25,920 --> 00:28:28,000 Speaker 1: by the way, is the company behind World of Warcraft. 454 00:28:28,480 --> 00:28:32,479 Speaker 1: Blizzard has already recognized this union, which is a key step. 455 00:28:32,800 --> 00:28:35,959 Speaker 1: It's one that means the union employees won't have to 456 00:28:36,000 --> 00:28:38,560 Speaker 1: appeal to the US government in order to force Blizzard 457 00:28:38,600 --> 00:28:41,960 Speaker 1: to acknowledge them; sometimes that does happen when a company 458 00:28:42,080 --> 00:28:46,600 Speaker 1: refuses to recognize a union. The workers join a few 459 00:28:46,680 --> 00:28:50,800 Speaker 1: other unionized groups within Microsoft's video game division. More than 460 00:28:50,880 --> 00:28:54,960 Speaker 1: one thousand seven hundred and fifty employees in that specific part 461 00:28:55,040 --> 00:28:58,360 Speaker 1: of Microsoft are now in a union, and all I 462 00:28:58,360 --> 00:29:01,800 Speaker 1: can say is solidarity. Not long ago, I talked about 463 00:29:01,800 --> 00:29:06,440 Speaker 1: how SpaceX's Falcon nine was grounded following a launch vehicle's 464 00:29:06,520 --> 00:29:10,200 Speaker 1: mid-flight failure that resulted in the loss of a 465 00:29:10,240 --> 00:29:14,440 Speaker 1: big group of Starlink satellites.
But up to that point, 466 00:29:14,480 --> 00:29:19,560 Speaker 1: the Falcon nine had a stellar, pun intended, safety record, 467 00:29:19,840 --> 00:29:21,960 Speaker 1: which is kind of unheard of in the space industry, 468 00:29:21,960 --> 00:29:24,800 Speaker 1: if we're being brutally honest. I mean, space is hard, 469 00:29:25,240 --> 00:29:29,720 Speaker 1: so having a vehicle that has had the reliability of 470 00:29:29,760 --> 00:29:32,720 Speaker 1: the Falcon nine is a rare thing indeed. But the 471 00:29:32,800 --> 00:29:37,280 Speaker 1: FAA has now determined that the fleet, which was grounded 472 00:29:37,400 --> 00:29:40,800 Speaker 1: to allow for an investigation, can now return to operation. 473 00:29:41,560 --> 00:29:45,160 Speaker 1: The investigation is still ongoing, but the FAA says they found 474 00:29:45,160 --> 00:29:48,680 Speaker 1: no reason to block operations, so there were no public 475 00:29:48,680 --> 00:29:51,840 Speaker 1: safety issues present as far as they could tell. So 476 00:29:51,880 --> 00:29:54,600 Speaker 1: while they will continue to investigate the matter, the Falcon 477 00:29:54,680 --> 00:29:57,760 Speaker 1: nine can return to service. And that's great news for 478 00:29:57,840 --> 00:30:00,680 Speaker 1: the industry as a whole, and for the United States in particular, 479 00:30:00,720 --> 00:30:03,440 Speaker 1: because the Falcon nine is the launch vehicle that can 480 00:30:03,480 --> 00:30:07,000 Speaker 1: take people to the International Space Station from here in the 481 00:30:07,080 --> 00:30:12,160 Speaker 1: United States. So yeah, SpaceX's launch vehicles have become absolutely 482 00:30:12,960 --> 00:30:16,760 Speaker 1: critical for the space industry here in the United States. 483 00:30:17,000 --> 00:30:19,040 Speaker 1: The company says it will be ready to return to 484 00:30:19,080 --> 00:30:23,240 Speaker 1: operations as early as tomorrow, Saturday, July twenty seventh, twenty 485 00:30:23,360 --> 00:30:28,280 Speaker 1: twenty four. Okay, that wraps up this episode of Tech 486 00:30:28,360 --> 00:30:30,480 Speaker 1: Stuff with the tech news. I do have a couple 487 00:30:30,480 --> 00:30:33,480 Speaker 1: of recommendations for y'all in order to read up on stuff. 488 00:30:33,640 --> 00:30:37,240 Speaker 1: First up is Jordan Valinsky's piece for CNN dot com, 489 00:30:37,240 --> 00:30:41,400 Speaker 1: titled Grindr is limiting location services at the Olympics to 490 00:30:41,440 --> 00:30:46,520 Speaker 1: protect LGBTQ plus athletes. So this is about how the 491 00:30:46,760 --> 00:30:51,320 Speaker 1: Grindr app is proactively protecting athlete identities. This is important 492 00:30:51,400 --> 00:30:54,200 Speaker 1: because some of those folks are coming from countries that 493 00:30:54,280 --> 00:30:58,120 Speaker 1: have draconian laws that target people who belong to the 494 00:30:58,280 --> 00:31:01,959 Speaker 1: LGBTQ plus community. So, in other words, you're not going 495 00:31:02,000 --> 00:31:03,800 Speaker 1: to be able to open the app and change your 496 00:31:03,840 --> 00:31:07,320 Speaker 1: location to the Olympic Village and then go looking around 497 00:31:07,320 --> 00:31:10,960 Speaker 1: for Olympic athletes who may be part of that LGBTQ 498 00:31:11,080 --> 00:31:14,560 Speaker 1: plus community. It's not going to work, and that's for 499 00:31:14,640 --> 00:31:17,440 Speaker 1: the protection of those athletes.
I think that's a good idea, 500 00:31:18,040 --> 00:31:20,560 Speaker 1: but the article goes into further detail, so check it out. 501 00:31:20,840 --> 00:31:25,200 Speaker 1: Next up is an article by Alfonso Maruccia of TechSpot. 502 00:31:25,280 --> 00:31:29,800 Speaker 1: It's titled Spyware maker gets hacked, data reveals thousands of 503 00:31:29,840 --> 00:31:33,600 Speaker 1: remotely controlled devices. So this article covers the case of 504 00:31:33,640 --> 00:31:37,000 Speaker 1: a company called Spytech Software, which is in a 505 00:31:37,120 --> 00:31:41,120 Speaker 1: questionable business, I would argue, and talks about how hackers 506 00:31:41,120 --> 00:31:44,480 Speaker 1: got access to the company's systems and what that all means. 507 00:31:44,560 --> 00:31:47,920 Speaker 1: It is well worth a read. That's it for me 508 00:31:48,000 --> 00:31:50,840 Speaker 1: this week. I hope all of you are well, and 509 00:31:50,880 --> 00:32:00,160 Speaker 1: I'll talk to you again really soon. Tech Stuff is 510 00:32:00,160 --> 00:32:04,640 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, visit the 511 00:32:04,680 --> 00:32:08,320 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 512 00:32:08,360 --> 00:32:09,120 Speaker 1: favorite shows.