Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for tech news for Tuesday, October thirty first, twenty twenty three. Happy Halloween, everybody, or a spooky Samhain if you prefer. Last week I mentioned we would not have a news episode today because of Smart Talks with IBM, but the joke was on me. Turns out there was a scheduling delay, and so now you get an episode from me after all. So let's dive into the spooky world of tech news. And you might hear me make some spooky sounds throughout the show; I assure you it's not that I'm haunted. It's just that the kidney stone I'm dealing with still isn't done yet. Anyway, the back half of twenty twenty three is hitting me hard, y'all. But never mind that. You're not here to hear me grouse and moan about my health. You're here to hear me grouse and moan about tech. Let's get to it.

First up this week, US President Joe Biden issued an executive order that deals with the big topic in tech of twenty twenty three, that of artificial intelligence, with a particular focus on generative AI, which, I'll just remind you, is one subset of the overall discipline of artificial intelligence. And if you're not from America, or maybe you are from America but you've never really looked into it and you don't know what an executive order is, it is one of the president's powers of office. The president has the authority to issue directives without having to go through Congress or any other branch of the government, as long as those directives relate to how the federal government works. So the US president could issue an executive order to manage federal resources. But the president could not issue an executive order to force South Dakota to change its name to Frank or whatever, because that's not in the power of the federal government.
This executive order in particular creates requirements, mostly among federal agencies, when it comes to entering into contracts with AI companies and AI services. So the order restricts those contracts to companies that comply with regulations, and those regulations include things like testing requirements for their AI models, transparency when it comes to developing the tools, and various restrictions on what generative AI is and is not allowed to do. So, for example, if the generative AI can be convinced to design weapons, that's right out. You know, a federal agency is not allowed to enter into a contract with a company that does that. So these requirements don't hit the AI companies directly, right? They're hitting the federal agencies. But if you assume that those companies want to do business with the federal government, which is a very lucrative business to get into, it thus creates an incentive for those AI companies to comply with the rules.

Now, there are some aspects to the executive order that more directly target AI companies. For example, should a company engage in training AI systems that could potentially create a threat to national security, well, those companies are going to need to send a little heads up to the federal government first, which is not a bad idea, though I would submit that sometimes we don't realize something is a threat until, you know, it's threatened us or worse. The order also recommends, but does not necessarily require, that generative AI companies use a label or a watermark for all AI created material, whether that's audio, video, or images or whatever. The order also places the onus of creating more structured regulations on various other agencies within the US government.

And speaking of regulations, a couple of AI luminaries have recently spoken out against regulations. Business Insider reports that Yann LeCun, one of the leading researchers in artificial intelligence, has argued that the various doomsday scenarios being bandied about regarding AI aren't really realistic.
Rather, he says, spreading these stories is a tactic that's used by a few large companies that are working in AI today. And you might say, well, why would they do that? Why would they say AI is a potential threat to humanity's existence? Well, these same companies are actually taking a very active role to work with governments around the world to develop AI regulations, not because the companies actually believe these regulations are going to prevent harm. Instead, they're trying to create regulations that will allow big established companies to be insulated against smaller startups, because the smaller startups are going to find it hard or maybe even impossible to work within the framework of regulations. So, according to LeCun, this is a way to stifle competition and innovation.

Sam Altman, the CEO of OpenAI, comes to mind when you hear this argument put this way, because Altman has taken an active role working with governments in the US and the UK and elsewhere in an effort to frame regulations around AI. Now, to take the other side, Altman would argue that his goal is to set up rules that make sense and that will protect people against bad AI design and bad AI implementations. But critics like LeCun say this is really more of a self-serving motivation, and that it prompts Altman to work so closely with governments in an effort to suppress potential competitors and give OpenAI an unfair market advantage. He is not, by the way, the only AI bigwig to suggest that this is a thing. Andrew Ng, who founded Google Brain and who teaches at Stanford University, and in fact taught Sam Altman, has also argued that the push for regulations, particularly the push from large AI companies, suggests more of a move to squash competition and innovation than to actually create a safe environment. Now, Ng told the Australian Financial Review that he believes regulations are important.
He's not saying don't have regulations at all, but he is saying that bad regulations are worse than no regulations at all. So it goes bad regulations at the bottom, then no regulations, and then good regulations. If things are left to the companies that are going to be regulated, then we should expect nothing less than rules that are going to benefit them to the detriment of everyone else, whether that's a competitor or a consumer. Which kind of makes sense to me.

And in the war between artists and AI, a US district judge named William Orrick has dismissed most of a lawsuit that three artists brought against the companies Midjourney, DeviantArt, and Stability AI. In fact, out of those three, only the part of the lawsuit directly involving Stability AI is still alive in the court system. The judge ruled that the accusations as laid out in the lawsuit were, quote, "defective in numerous respects," end quote, regarding Midjourney and DeviantArt, and so those elements are now essentially dead in the water until they perhaps get reasserted in an updated filing. So at the center of this whole legal mess is a concern among artists that generative AI companies are making broad and unrestricted use of artists' works without the artists' permission or compensation, and that as a result, these generative AI models are able to lift elements of artists' styles, and essentially the AI can compete against the very artists whose work the model was trained on in the first place.

As for why Stability AI is still in the mix here, it mostly has to do with how Midjourney and DeviantArt have used Stability AI's generative tool called Stable Diffusion. The judge says, well, the artists weren't able to really show that those companies were using this knowingly in an effort to harm artists, or were doing it in a way where they were disregarding the harm being done to artists, and that really just falls to Stability AI, which created the tool in the first place.
The artists' accusations showed that, at least according to a search tool meant to match artists against work that generative AI models have supposedly used during training, the Stability AI tool had made unauthorized use of art to train up Stable Diffusion. So out of the three artists, only one of them is still part of this lawsuit. The other two dropped out because they did not actually obtain a copyright for their works, so there was no legal protection to be violated in the first place. But one of the three did. She's still in the mix. So she said, my name has popped up in the search engine, showing that this model used my work to train itself. Now, that in itself is not definitive proof, right, just because she shows up on the search engine. But the judge ruled that it was sufficient to allow the case to continue through to the discovery phase.

Now, I should add, this is not the only case brought against Stability AI and how it trains Stable Diffusion. Getty Images sued the company both in the UK and in the United States, arguing that Stability AI used stock images from Getty's enormous database to train its model, and that the company, quote, "removed or altered," end quote, Getty's copyright management information in the process. Just yesterday, Stability AI's legal team asked a London judge to throw out the UK case. I've not heard how that has panned out. But also, these cases have been going on for a while. While this is an update, you know, they date back to early twenty twenty three and even late twenty twenty two, so it's not like this just popped out of nowhere. This is an ongoing battle in legal systems around the world. Justice moves ever swiftly.

Okay, we've got more stories to cover, and they don't even have to do with artificial intelligence. So we're going to take a quick break, and when we come back, I'll talk about some more tech news.
We're back, and now I'm done talking about AI, at least for the moment, which means it's time to talk about X, you know, the platform formerly known as Twitter. Only one story, though, so let's just get through it. And the big news this week is that X is going to offer restricted stock units, or RSUs, to its employees. Now, keep in mind, X is a private company at the moment. There is no public stock market that it's listed on, so you can't look to see what its shares are trading at right now. It's privately held. So in order to be able to issue restricted stock units to employees, first the company has to determine how much it's actually worth. And this gets a little more loosey goosey than you would find with a publicly traded company. Like, with a publicly traded company, you take the number of shares that exist in that company, and you multiply that by the price per share, and you get a rough estimation of the valuation of that company. So X had to go through a different process, but in internal documents it had listed its valuation at around nineteen billion dollars.

Now keep in mind, when Elon Musk agreed to purchase Twitter, he did so at a share price that would be set at fifty four dollars and twenty cents per share, which meant that at the time of purchase, Twitter would be worth forty four billion dollars. So he spent forty four billion, and now the platform has a current valuation of nineteen billion dollars. That's a fifty six percent drop in valuation over the course of a year. Now keep in mind, that's just one assessment. Other assessments are actually different. For example, Fidelity figures that the drop is closer to sixty five percent of the value, y'all.
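Just to make that math concrete, here's a minimal sketch of both calculations, the market cap formula and the percentage drop. The share count below is a made-up placeholder, since Twitter's actual number of shares isn't something I quoted here:

```python
# Market cap of a publicly traded company: shares outstanding times price per share.
# NOTE: shares_outstanding is a hypothetical placeholder, not Twitter's real figure.
shares_outstanding = 800_000_000
price_per_share = 54.20  # the per-share price Musk agreed to pay

market_cap = shares_outstanding * price_per_share
print(f"Implied valuation: ${market_cap / 1e9:.1f} billion")

# Drop in valuation: (old - new) / old, expressed as a percentage.
purchase_valuation = 44e9  # roughly what Musk paid for Twitter
internal_valuation = 19e9  # X's valuation per its internal documents

drop = (purchase_valuation - internal_valuation) / purchase_valuation
print(f"Valuation drop: {drop:.0%}")  # prints 57%, in line with the roughly 56% quoted
```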
Now, I admit, when I first read about this, I didn't see that there was a bright side, but TechCrunch's Amanda Silberling points out that sometimes a lower valuation isn't necessarily a bad thing for the company or its employees, because the employees stand to gain those stocks or stock options, and it could mean that they could get a lot more of them as part of their compensation package. This can also become a very powerful tool to recruit new talent into the company, and as long as the company does better and starts to improve in value, employees will see a huge gain as a result of having those stocks, assuming that they exercise their stock options. Obviously, X still needs to regain a lot of lost ground, right? A fifty six percent drop in valuation is not a great thing to communicate out to the world. Regaining that ground, however, is going to create much greater gains for the employees who exercise their stock options, so it could be a really good thing. And you know, never say never. I'm not going to say that X is just destined to be doomed. It just has to prove it can reverse the trend it's been on over the last year.

Now, personally, I think that would be a whole lot easier if Elon Musk took a big old step back from the driver's seat. I mean, sure, he put in a different CEO, so technically he's not the CEO of the company anymore. But the general impression I get in the tech world is that it's really still Elon Musk calling the shots over there. And maybe that's unrealistic, maybe that's not the case, but it certainly seems to be that way, at least from the outside, and I don't think that's been doing the company any favors.

Another long ongoing story around the world is how various countries are beginning to treat apps and tech created out of countries like China.
So, in a recent example, the government of Canada decided yesterday to issue a ban on the messaging app WeChat on all government issued mobile devices. The Canadian government also included a Russian app in that ban as well, or really a Russian suite, the Kaspersky antivirus program specifically. Now, this is out of a concern that the governments of China and Russia could potentially, either directly or indirectly, use those kinds of apps to spy on Canadian government officials and Canadian systems. The government did say that there was no evidence that any such thing had happened so far, so this would really be more of a proactive move, as the CIO for Canada has identified those two apps as having unacceptable risks associated with them.

Chinese representatives condemned the move. They said that it was really done more as a political gesture without any real justifiable foundation. And Kaspersky responded with surprise and disappointment, saying that the lack of evidence indicates that the ban is a response to a perceived geopolitical threat rather than a real one, and claims no one from Canada's government ever reached out to Kaspersky to work things out before issuing the ban.

Now, I will say that, at least with Kaspersky, it has a somewhat shady reputation in the cybersecurity community. Not that their tools don't work, but there is this fear of a potential connection between Kaspersky and various Russian intelligence agencies. Kaspersky has been rather compliant with the Russian government on several things, which has led to people saying it may not be a good idea to use Kaspersky products in things like government settings, because of the potential that someone could use that for surveillance purposes. Canada is not the only country to have concerns about this product. The US issued a similar ban back in twenty seventeen.
Germany's Federal Office for Information Security issued a strong recommendation to its staff not to use Kaspersky antivirus products out of concern, you know, regarding the company's potential connections to the Russian government. So this is not an unprecedented event.

Now, for those of you who like those stories where the little guy stands up to the fat cat, I've got one for you. In this case, the little guy is a software firm in the UK called Threads Software Limited, and the fat cat is Meta, formerly known as Facebook. And yes, this all has to do with the name Threads, which Meta has used for its microblogging service, you know, the Twitter slash X killer platform that saw rapid adoption and then, not long after that, rapid disengagement, or what I call "oh, I forgot I even installed that"-itis. The software company sent a letter to Meta and said that it will go to the courts and ask for an injunction against Meta if the company doesn't stop using the name Threads.

Now, to be clear, Threads Software Limited, or actually a precursor to that company called JPY Limited, applied for and received a trademark for the brand name Threads more than a decade ago, and that was well before Meta announced any intention of creating a microblogging competitor to Twitter. Now, that being said, this is not the first Threads product from Meta. Back in twenty nineteen, Meta, which at that time was still called Facebook, introduced a feature in Instagram, and it was called Threads. I'm not sure if the Threads Software Limited company actually pushed back against Meta when that happened. That's important, because holding a trademark requires a company to defend that trademark. If Meta can show that Threads Software Limited did not attempt to defend its trademark when Meta first introduced the Threads product in Instagram in twenty nineteen, then that could really weaken the case against Meta. I don't know. Maybe they did push back. I couldn't find information on that when I was reading up on the story.
If I'd done some more digging, I probably could have, but I was running out of time. Also, there are a lot of other products out there that have the name Thread or Threads, and they have not really seen a lot of pushback from this company, from what I can tell, which is another potential strike against them. Now, all that being said, Meta supposedly did reach out to Threads Software Limited in an attempt to purchase the domain threads dot app, but the company gave Meta the cold shoulder for every single request. It does, however, indicate that Meta was aware it was going to use a brand name that was already in use. So it is a complicated thing, and who knows which way the courts will go. I mean, a lot of courts might already be, I hate to use the word biased, maybe predisposed to laying judgments against Meta, because Meta has not done itself any favors over the last decade. It has operated in a way that has almost dared governments and courts to take action against it, which could potentially spill over into decisions. We're all human beings, even judges and lawyers. So who knows. But yeah, that's what's going on in that world.

In India, Apple has sent an urgent alert to several politicians in the Indian Parliament. The alert warned the recipients that their respective iPhones were potentially targeted by a state sponsored attack, like an attack with a high level of sophistication and a high requirement for, you know, resources. So, something that would typically only happen from a hacker group that had the backing of some sort of government somewhere in the world. Now, notably, all the people who have come forward so far saying that they received this alert belong to the opposition party in India's parliament. Now, Apple has not identified which state may be behind these attacks, and in fact, Apple has also said it's possible the alerts could be a false alarm.
They have not issued that many alerts around the world, and they said, you know, this could be something where we are being extra defensive, but there's no actual threat there. At least a few of the people who received the message suspect that India's conservative government is actually behind the operation, that this is an effort to maintain and extend control of the nation in general and to suppress opposition politicians. I've talked a lot about how India's government takes a really aggressive stance with tech companies when it comes to things like allowing people to use various platforms to criticize the government. So the implication that opposition leaders could be the target of the actual Indian government fits that narrative, but that doesn't necessarily mean it's true. It could be, but without evidence, you can't really draw a conclusion. Various representatives of the majority have argued that those who were contacted are overreacting, that they're jumping to conclusions that cannot be supported by evidence, and that they're just reaching for reasons to be outraged at the government. So they're essentially saying, we're being unfairly targeted by these criticisms. I'm sure more will unfold, but it'll be interesting to see how Apple navigates this, because obviously it's a pretty thorny political situation. So we'll have to wait and see.

Okay, I've got some more tech news items to cover before we conclude this episode, but first let's take another quick break.

Are you ready for some football? And by that I mean American football. And also, were you ready for it last Sunday? And if you were, did you find yourself frustrated when using YouTube in order to watch football? All right, let me back up a bit. So YouTube secured a contract to carry the NFL Sunday Ticket product that originally was with DirecTV.
Now it's an add on to the YouTube TV service. NFL Sunday Ticket gives subscribers the ability to tune into the various NFL games, the American football games of the National Football League, that are playing on any given Sunday during football season, regardless of where you happen to live. It is a very expensive add on. It starts at three hundred and forty nine dollars per year, and remember, football season doesn't go all year long. This is also on top of the YouTube TV subscription, which is more than seventy dollars per month, so that starts to add up real quick. You sports fans have to pay a whole bunch of moolah to watch your stuff. I feel like I shouldn't complain about all the different streaming services I subscribe to at this point, because yikes.

Anyway, for the first seven weeks of this current season, YouTube's version of the NFL Sunday Ticket seemed to be working just fine. But this past Sunday, users encountered issues with lag and buffering times, and for some of the games, those issues lasted well into the second half of the game. Understandably, a lot of customers are upset. I mean, if you're paying a premium price for a service, you expect premium quality in that service. And yeah, we all understand that technical errors can happen, and that systems can and do break down. That's just a fact of life. But when you're charging three hundred and fifty dollars a season just to be able to let people see games on their television or computer screen, not even go to a game, but just see it at home, there is a certain expectation that the service is gonna work as advertised. YouTube posted a message indicating that the company was aware of the problem and was working on it, but as I was writing this episode, it didn't have any further details about what actually caused the problem. That might be public by the time you hear this, but I hadn't seen it when I was actually putting the episode together.
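By the way, if you want to see just how quickly those fees add up, here's a minimal back-of-the-envelope sketch. I'm treating seventy dollars a month as a floor for the YouTube TV subscription, since the exact price isn't something I quoted:

```python
# Rough annual cost of watching NFL Sunday Ticket through YouTube TV.
# The monthly figure is a lower bound; the actual YouTube TV price is somewhat higher.
sunday_ticket_per_year = 349.00   # starting price of the Sunday Ticket add-on
youtube_tv_per_month = 70.00      # floor for the base YouTube TV subscription

annual_total = sunday_ticket_per_year + youtube_tv_per_month * 12
print(f"At least ${annual_total:,.2f} per year")  # at least $1,189.00 per year
```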
The game company Ubisoft is ending online services for ten of the company's titles. That includes Assassin's Creed II, Assassin's Creed Brotherhood, and Assassin's Creed Revelations, all three of which were very popular titles. This means that the online features for those and the seven other games that they announced are just not going to work anymore. Those services will no longer be active, and that includes stuff like online multiplayer, obviously, as well as Ubisoft Connect rewards. Those are kind of like achievements on Xbox or trophies on PlayStation; Steam also has achievements like this. Fortunately, this is not a case where the online component means the games become totally unplayable. As far as I can tell, gamers should still be able to run these games on the hardware that can run them, because these are older games, and still be able to play the single player experience. They just won't be able to access those online features.

But this is one of the big reasons I personally do not like game companies tying their single player games to some sort of online service. Usually it's an online service that's meant to verify that the gamer is using a legitimate copy of the game and not a pirated copy. And that's because if a company decommissions the online component, then the game can become unplayable, even if you still have equipment that could technically run the game. So I hate when game companies tie their titles to some online service that doesn't really add to the game itself and only exists to verify the copy of the game, because if that service goes offline and they don't offer some sort of patch, then the game you purchased is just, it's just dead code. Anyway, kind of sad to see that support go away.
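To show what I mean, here's a tiny hypothetical sketch of that kind of launch-time check. Every name in it is made up, and real DRM schemes are far more elaborate, but the failure mode is the same: once the verification server is decommissioned, the check fails and a legitimately purchased game refuses to start:

```python
import urllib.request
import urllib.error

# Hypothetical launch-time ownership check: the game phones home before starting.
LICENSE_SERVER = "https://example.com/verify"  # made-up endpoint, not a real service

def can_launch(license_key: str) -> bool:
    """Return True only if the (hypothetical) license server confirms the key."""
    try:
        req = urllib.request.Request(f"{LICENSE_SERVER}?key={license_key}")
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status == 200
    except urllib.error.URLError:
        # Server decommissioned or unreachable: even a legitimate copy is locked out,
        # because the game has no offline way to prove it was purchased.
        return False

if can_launch("ABCD-1234"):
    print("Starting game...")
else:
    print("Could not verify your copy. The game will not launch.")
```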
It does make sense, like, there is a limited amount of resources that a company has to put towards supporting games, and it doesn't really make sense to just continually support online components in perpetuity for a diminishing audience, right? That doesn't make good business sense. So I get it from that perspective.

You might remember that a few weeks back, I talked about how NASA had retrieved a capsule from the OSIRIS-REx mission. That's the mission that visited an asteroid called Bennu, and the spacecraft ended up collecting soil and rock samples from Bennu and then swung by Earth, jettisoned this capsule, and the capsule landed here on Earth and NASA retrieved it. That's good news, but there's also some bad news. The bad news is, so far the scientists haven't been able to open the capsule. They have collected more than seventy grams worth of dust and rocks from the outside of the container. But because that dust and those rocks were on the outside of the container, that means that they've had contact with, and thus contamination from, our dirty, dirty Earth dirt. So you can't really make specific scientific conclusions about stuff from outer space if it's already been contaminated here on Earth. You don't know what elements were just present in the space stuff versus what was transferred through that contact. So the real treasure is inside the case, where those samples have not been contaminated by Earth. So the goal is to preserve that status and retrieve the samples, then run tests, and that way you can at least be pretty sure that the conclusions you draw are firm, that they're well grounded. Because, you know, if you were to, say, find organic material inside the dust or rocks, then you could say, well, we took all the controls necessary to prevent contamination from Earth, we found this stuff here, that's really interesting, and it raises a lot more questions. You can't really do that if you're not able to maintain that integrity.
Unfortunately, two of the fasteners on this capsule, and there are more than thirty fasteners in total, but two of them are firmly shut and cannot be removed with the tools that NASA had previously approved for the retrieval operation. So that means that we have to wait while researchers work on the problem and figure out an alternative way of accessing the material inside the capsule, one that will maintain that integrity. It doesn't mean that they won't come up with it. In fact, I'm sure they will come up with it, but it will delay things a little bit, because they have to make sure that they're still keeping things as secure as they possibly can. So we do still expect to get into that capsule and to eventually be able to check out those space rocks. It's just going to take a little bit longer to come up with a way to do it that keeps them preserved properly.

Now, that's it for the stories, but I do have one press release and one article to recommend for y'all for further reading today. So, on the press release side, the US Securities and Exchange Commission stated that it has filed fraud charges against the company SolarWinds, as well as the company's chief information security officer. Now, you might remember that SolarWinds is the company behind the Orion software that thousands of large companies and government agencies use, and that hackers were able to compromise the Orion software, and through updates that were pushed out by SolarWinds itself, they were subsequently able to infect target computer systems, thousands of them. It was one of the largest known cybersecurity breaches of all time, and it raised awareness of a type of malware strategy called a supply chain attack. Anyway, the fraud charges are complicated, and I'm sure it'll necessitate like a full episode on the topic to give a full update, but for the time being, I recommend y'all read up on that press release.
You can find it at SEC dot gov; look for the press release about SolarWinds.

As for an article recommendation, Lindsay Clark of The Register has a piece titled "Tech bros still cling to sexist stereotypes, forgetting female pioneers who coded their path." Now, I cannot say that this story surprises me, which makes me sad. I wish I could say it surprised me and still made me sad, but I wasn't surprised, and that was really sad. It's a reminder that the tech industry in general, and really the whole tech bro culture in particular, is often an unwelcoming or downright hostile environment for women and really for anyone who doesn't identify as male. And it reminds us that we wouldn't be where we are today in the tech world without the contributions of people who do not identify as male, of women and of non-binary people who have made incredible contributions to tech. Some of the greatest innovators in tech, who built the foundations of computer science, were women. Grace Hopper, and even further back, Ada Lovelace, for goodness' sakes, people who were true visionaries when it comes to technology. And yet we still have this environment that denigrates women and dismisses their contributions and achievements. It's unfortunate and, sadly, again, not surprising. But I do recommend reading that article. Again, that's in The Register. It's called "Tech bros still cling to sexist stereotypes." You can check that out.

That's it for today. I wish you all a happy Halloween. I hope you encounter endless treats and very few tricks, and I hope you're also all doing well. And I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.