Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for the tech news for Thursday, October fifth, twenty twenty-three. Not an official tech news thing, but I'm feeling a little bit better. I hope I'm sounding better. I know my voice has had quite a journey over this last week. I'm not at one hundred percent by any means, but you've got to keep on keeping on. All right, let's get to the tech news. First up, let's talk zero-day vulnerabilities again. As a reminder, the term zero-day refers to a case where software developers are unaware that the product they've made and released contains vulnerabilities. And to be clear, it's not just software; it can be other stuff too, but we often talk about it in terms of software. So they release something, there's a security vulnerability in it that they're not aware of, and that means there's a chance for bad actors to discover that vulnerability and then create an exploit that leverages it.
And since the developers may still be unaware that there's a problem in the first place, there's no one to prevent that from happening, or even to react in a timely way, until someone discovers that something hinky is going on. Now, once developers do become aware of the problem, they can respond, but by then the damage may have already been done, at least to some extent. The reason I mention this is that Microsoft released patches to two open source libraries. In this case, a library is a term in programming that refers to a collection of resources that can be used by various computer programs. So, according to TechCrunch, Microsoft products that had been using these two libraries included the Edge web browser, Microsoft Teams, and Skype. And just to be clear, Microsoft was not the only company affected by this. These are open source libraries, so lots of other programs from other companies also tap into them. Google issued its own patches for Chrome, and Firefox did the same. I think we're going to be talking about Apple doing the same sort of thing later.
So Microsoft has not commented on whether hackers targeted any of their products through these vulnerabilities, nor has the company indicated whether or not it could even determine that in the first place. A scammy advertisement that featured a deepfaked MrBeast popped up on TikTok and has now become part of a larger conversation around AI, deepfake technology, and fraudulent advertising. MrBeast, in case you don't know, is the YouTube handle of Jimmy Donaldson, and he's mostly known for doing stuff like giving away massive sums of money, or opening up a burger chain and then later on suing it and being sued by it. But the burger chain story, while fascinating, probably belongs to another podcast. Anyway, MrBeast is super duper Internet famous. I joke about being Internet famous; no, I am unknown. MrBeast has more than one hundred and eighty-eight million subscribers on YouTube. That makes him the most popular individual on YouTube by subscriptions. There is technically another channel that has more subscribers: a music label in India.
But if you're talking about a single person, MrBeast is at the top of the pile. Anyway, in this ad that appeared on TikTok, the AI-generated MrBeast claimed that he would sell ten thousand iPhone fifteen Pro smartphones for the princely sum of two American dollars. Now, these are phones that usually go for thirteen hundred dollars or more, depending on which version you buy. And since MrBeast actually has this long history of stunts that involve giving stuff away to people, you can imagine how this ad might fool folks into thinking that it's legit, and maybe they pay two bucks to try and get an iPhone fifteen Pro. TikTok has since removed this ad, but the fact that it went up in the first place despite the platform's review process is concerning, because TikTok says it uses both human review and AI to analyze ads before allowing them to go up on the platform. If that's true, this ad somehow managed to slip through those controls and was on the platform. People were reporting it; MrBeast himself commented on it. So it's another example of how deepfakes can be harmful.
Of course, this isn't, you know, a surprise. We know deepfakes can be harmful. They've been harmful ever since they were unleashed a few years ago. Other folks who have said that they've been deepfaked into ads without their consent include America's Dad, you know, Tom Hanks (David S. Pumpkins, if you're not familiar), and also a CBS anchor named Gayle King. So if you happen to see an ad that features me hawking the latest, I don't know, sports gambling app, well, someone done darn faked me, because I wouldn't do that. The Wall Street Journal reports that part of the US Federal Trade Commission's lawsuit against Amazon hinges on a quote unquote secret algorithm. Now, the algorithm does exist, or did exist, and its code name was Project Nessie. The FTC claims that, quote, "Amazon uses its extensive surveillance network to block price competition by detecting and deterring discounting, artificially inflating prices on and off Amazon, and depriving rivals of the ability to gain scale by offering lower prices," end quote.
So, according to the FTC, Amazon manipulated the market so that when merchants tried competitive practices such as cutting product prices, the algorithm essentially kicked in to counteract that move so that the competitor would see fewer sales. If you are a merchant and you cut prices and your sales don't go up, well, there's no reason to keep the prices low. If your sales have remained the same, you might as well sell things for more money, because otherwise you're just cutting into your profit margins. So the FTC is saying this is what Amazon was doing in order to get competitors to keep their prices higher. Amazon reportedly stopped using this algorithm in twenty nineteen, but the FTC's point is that the company was stacking the deck for its own benefit, a benefit that amounted to more than a billion dollars in revenue, according to the Wall Street Journal's sources. An Amazon representative told Ars Technica that this complaint is a mischaracterization of what Project Nessie was all about. The rep says that Nessie's purpose was to prevent Amazon's price-matching practice from making unsustainable changes to product prices.
So, in other words, you know, making sure that Amazon wasn't going to sell something for less than what it cost Amazon to make or procure it or ship it. Amazon says that its policy was to try and match the lowest prices, and that this was not meant to drive prices up but rather to be more competitive. They said it's the exact opposite of what the FTC claims. The FTC says Amazon was actually manipulating the system so that the lowest prices would be driven up much higher than they would be in a true free market, and that Amazon was, again, gaming the system. It's all pretty complicated, and it might boil down to the possibility that Amazon was essentially using a powerful algorithm to manipulate weaker algorithms belonging to various competitors, all in an effort to fix pricing at levels that would favor Amazon. Apple has released an update to iOS seventeen, which means the most current version as of this episode is iOS seventeen dot oh dot three.
The update includes security fixes that address some vulnerabilities, including a kernel exploit as well as a patch to one of those open source libraries we talked about with the Microsoft story. The update also addresses a problem that some iPhone fifteen users have reported, you know, that their smartphone tends to run a little hot. Apple says that the reason for the hot phones mostly falls to software, and also to those darn USB-C chargers. You see, Apple tried to warn you that switching away from the company's proprietary charging technology would cause problems, and here it is. I should also add that folks have speculated that maybe the processor in the iPhone fifteen could be partially at fault; Apple says that's not the case. Anyway, if you do have an iPhone, you might want to check to see if the version of iOS running on your device is the current one. Meanwhile, Google launched its Pixel eight and Pixel eight Pro smartphones this week, as well as the Pixel Watch two. The new smartphones have Google's own Tensor G three chipset to power everything.
The Pro model has a fifty-megapixel rear camera, a forty-eight-megapixel telephoto lens with five-times zoom, and a forty-eight-megapixel ultrawide camera. So Google is continuing to embrace the whole photography aspect of smartphones, which, you know, it's been doing for several generations now. The Pro also has a temperature sensor that Google says you could use to measure the temperature of, like, a surface, you know, kind of like those gadgets that barbecue enthusiasts use to make sure that their coals are hot enough to, I don't know, sear a pig or whatever. The Google Pixel eight Pro starts at nine hundred ninety-nine bucks. That's a one-hundred-dollar increase over the previous generation's Pro model. The standard Pixel eight, not the Pro but the regular Pixel eight, is a little bit smaller than the previous generation, the Pixel seven. It lacks the telephoto lens that the Pro has, but it does come with the ultrawide camera, and it starts at six hundred ninety-nine dollars. That's also an increase over the previous generation.
The phones also launched with Android fourteen, and Google has promised seven years of ongoing OS upgrade support, which is best in class as far as OS support goes in the smartphone industry. The watch, meanwhile, the Pixel Watch two, has a few cosmetic changes compared to the original Pixel Watch. It also has some updated sensors, and it has a new processor, the Snapdragon W5, which means the Pixel Watch two should be better at conserving battery life than the original Pixel Watch was. There were a few other announcements that Google made as well, including one that says Google is going to unleash a new AI-enhanced version of the Google Assistant on all of us at some point in the near future. And I, for one, welcome our robot overlords. Okay, we're going to take a quick break and we'll be back with some more tech news. We're back. So last week, Meta pushed out an update to Messenger for some users, a small group, that would allow them to create stickers through AI generation, something that would presumably find its way into various Meta platforms.
And it didn't take long at all for people to show that this tool, as designed, has no real safeguards against stuff like intellectual property infringement, like infringing upon trademarks and copyright and that kind of thing, because users created stickers where they prompted the AI to make, you know, images of popular characters like Mickey Mouse doing stuff that the Walt Disney Company absolutely would object to, like having Mickey Mouse holding a bloody knife. That's not an image that the Walt Disney Company would be likely to approve of. Several folks pointed out that it seems like the people at Meta gave very little, if any, thought to the consequences of rolling out an AI generative sticker tool. Now, on the other hand, you could make the argument that what Meta is doing is essentially a limited beta rollout, and that the whole purpose of the beta is to find out exactly what sorts of issues exist with the feature before they push it out to the world at large.
You know, it's better to find out that your tool is, I don't know, violating intellectual property laws with a small group rather than your overall user base, right? I guess there's some logic to that. Now, I personally wonder if perhaps some of this could have been handled before it actually reached the beta test stage. Personally, I see this as another indication that some companies are really rushing a bit too quickly to incorporate generative AI into products, and in the grand scheme of things are actually making it harder on themselves. But again, that's just my perspective. I don't mean to suggest that that's the correct way to look at things, or even that I'm necessarily accurate in my estimation. It's just the way it seems to be unfolding to me. Assuming an early report from Reuters has been accurate, Meta also held another round of layoffs yesterday. This time, the division that was said to have been affected the most was Reality Labs. That's the part of Meta that's working on future projects, you know, stuff like the metaverse.
So, according to this report, the specific people who were targeted by these layoffs worked on creating custom silicon products for Meta. These are the products that power stuff like mixed reality hardware, so think of it like processors and such. And mixed reality includes stuff like virtual reality, augmented reality, that kind of thing. So this does not mean that Meta has lost confidence in the metaverse concept in general, or even lost confidence in mixed reality in particular. Instead, it may indicate that Meta wasn't seeing positive results as far as developing its own chipsets went; it wasn't seeing the benefit of doing that versus, say, relying upon external providers and purchasing chips from established companies that specialize in that sort of thing. So I think the common wisdom is that Meta just decided that it makes more sense to purchase stuff essentially off the shelf rather than develop it in house for this particular product. Okay, time to head over to the platform X, formerly known as Twitter.
Elon Musk has made another change, one that strikes me as a little odd, and it was definitely Musk who demanded this change, because he said so himself. Anyway, that change is to remove article headlines from posts on X. So let's say that you're on X. Wow, that sounds terrible, right? Not ecstasy. You're on the platform X and you're posting an article to your post, which used to be called a tweet. It's hard to call anything anything these days. So you're posting an article on the X platform, and now that means that people who follow you, instead of seeing a headline, are going to see a blurb from the article followed by an image, and they can click through to go and read the article there. But they're not going to get the headline that actually indicates what the article is about. And that seems really weird, right? Because, I mean, I don't know, a lot of articles bury the lede, so the blurb at the top might not actually indicate what the article itself is really about. You might not actually have a good feeling as to what this article is all about.
And of course, there are also those terrible articles that will spend five hundred words just rambling about the obvious in an effort to drive more ad impressions, and you have to just keep scrolling and scrolling and scrolling for, like, a full minute before you get to the actual point of the article. I hate those. Anyway, Musk says that this is going to, quote, "greatly improve the esthetics," end quote. Now, maybe. I don't think so. I don't understand what aesthetics are being improved by this. But if I had to guess, I'd say what Musk is actually hoping for is a way to discourage people from posting articles on X in the first place, because clicking on an article takes your eyeballs away from X, and Musk really needs your eyes to stay on X itself. So maybe what he's hoping is that people will just kind of stop posting links, and instead they'll summarize an article, or they'll give their own opinion on a topic, and they'll write longer posts, and this will prompt more engagement on the platform.
After all, a lot of reports have said that X has been seeing massive drops in ad revenue ever since Musk took over. That being said, Linda Yaccarino, the actual CEO of X, has claimed that ninety percent of X's top advertisers have returned to the platform to place ads there. But all the analysis I've seen has suggested that ad revenue is down something like fifty to sixty-five percent every month because of various issues that advertisers have with the platform. So if I were a betting fellow, and I'm not, but if I were, I would place a wager that this is all really part of a move to try and reverse the bad fortune of X. Reuters reports that BlackBerry is splitting into two companies. I was actually surprised to learn that BlackBerry is still a thing; I haven't been keeping up with that company for ages. But one of the two companies will focus on BlackBerry's Internet of Things business, which is primarily in-car connected systems, and the other will center around cybersecurity. Now, once upon a time, BlackBerry was synonymous with smartphones.
This was actually in the era before smartphones had touch screens and that sort of stuff, before the iPhone, back when smartphones had physical keys that you would use to type on and typically only business executives owned one. Now, according to The Globe and Mail, the driving force behind this decision is to provide more shareholder value and to drum up interest in the investor community, because BlackBerry has been kind of stagnating recently. Whether this will actually work and achieve those goals remains to be seen, but the spinoff is supposed to happen at the beginning of the next fiscal year for BlackBerry, and for BlackBerry that means it starts on March first. One story that I've somehow missed over the last month has to do with a company, and I use the term loosely, called AOG Technics. All right, so several major airlines, which include American Airlines, Southwest Airlines, United Airlines, and now Delta Air Lines, have found out that some of the engines on some of their aircraft contained parts that had fraudulent certification attached to them.
So when it 314 00:20:08,240 --> 00:20:11,560 Speaker 1: comes to the aerospace industry, everything is supposed to be 315 00:20:11,600 --> 00:20:16,679 Speaker 1: carefully tested and documented and certified. You've got agencies, like 316 00:20:16,680 --> 00:20:20,040 Speaker 1: here in the US we have the FAA, that oversee 317 00:20:20,119 --> 00:20:22,760 Speaker 1: this sort of stuff, and it's all for the protection 318 00:20:23,280 --> 00:20:27,320 Speaker 1: of the public as well as for the companies, right, 319 00:20:27,400 --> 00:20:31,080 Speaker 1: to make sure that everything is as buttoned up as 320 00:20:31,119 --> 00:20:33,760 Speaker 1: it can be to be as safe as possible. Now, 321 00:20:33,760 --> 00:20:37,119 Speaker 1: a lot of airlines have aircraft that use a jet 322 00:20:37,160 --> 00:20:41,440 Speaker 1: engine that was made by CFM International, and there's nothing 323 00:20:41,440 --> 00:20:44,919 Speaker 1: wrong with the engines, like, the engines are fine. But 324 00:20:44,960 --> 00:20:48,280 Speaker 1: when it comes time to service or to repair an engine, 325 00:20:48,480 --> 00:20:51,879 Speaker 1: it's not uncommon for these airlines to rely upon a 326 00:20:51,920 --> 00:20:56,840 Speaker 1: third party repair company, and those third party repair companies 327 00:20:57,080 --> 00:21:04,840 Speaker 1: will use various brokers in the industry to acquire spare parts, 328 00:21:05,119 --> 00:21:09,080 Speaker 1: which they then use to replace stuff that's either nearing 329 00:21:09,160 --> 00:21:12,720 Speaker 1: the end of its service life, or showing 330 00:21:12,760 --> 00:21:15,520 Speaker 1: signs of wear and tear, whatever it may be. So 331 00:21:15,680 --> 00:21:19,000 Speaker 1: AOG Technics was a part of this supply chain.
It 332 00:21:19,040 --> 00:21:22,320 Speaker 1: was one of these brokers, and its job was to 333 00:21:22,680 --> 00:21:25,240 Speaker 1: source and certify parts, which it would then sell to 334 00:21:25,400 --> 00:21:29,920 Speaker 1: these various repair companies. It was claiming that these were 335 00:21:30,080 --> 00:21:33,359 Speaker 1: certified parts, and as it turns out, that certification was fake. 336 00:21:33,800 --> 00:21:36,960 Speaker 1: Now this means the reliability of those engine parts is 337 00:21:37,000 --> 00:21:39,680 Speaker 1: brought into question. It doesn't automatically mean the parts are 338 00:21:39,840 --> 00:21:43,000 Speaker 1: terrible or that they'll break immediately, but you just don't 339 00:21:43,040 --> 00:21:47,360 Speaker 1: know, because they were never certified. So the company itself, 340 00:21:47,400 --> 00:21:51,080 Speaker 1: AOG Technics, has a single shareholder, which happens to be 341 00:21:51,119 --> 00:21:56,240 Speaker 1: the company's founder and director, a fellow named Jose Zamora Yrala, 342 00:21:56,760 --> 00:21:59,080 Speaker 1: who has done a really good job of avoiding the 343 00:21:59,160 --> 00:22:02,800 Speaker 1: press as this story has unfolded. The consequences of this 344 00:22:02,880 --> 00:22:06,800 Speaker 1: issue are that the airlines affected have been pulling aircraft 345 00:22:07,040 --> 00:22:11,000 Speaker 1: that have these engines out of service until the parts 346 00:22:11,000 --> 00:22:14,520 Speaker 1: involved can be replaced. So essentially they're having to have 347 00:22:14,560 --> 00:22:18,720 Speaker 1: the whole repair job done again and the fraudulent part 348 00:22:19,160 --> 00:22:23,320 Speaker 1: swapped out for one that is truly certified. AOG Technics 349 00:22:23,320 --> 00:22:25,639 Speaker 1: has essentially kind of gone away in a puff of smoke. 350 00:22:26,200 --> 00:22:28,480 Speaker 1: It's a heck of a story.
It really illustrates how 351 00:22:28,640 --> 00:22:33,080 Speaker 1: complex supply chains can be, and how you can have 352 00:22:33,200 --> 00:22:36,479 Speaker 1: unexpected weaknesses in those supply chains, and that, you know, 353 00:22:36,720 --> 00:22:41,520 Speaker 1: those weaknesses can include people who are running dishonest businesses. 354 00:22:42,480 --> 00:22:45,040 Speaker 1: I will keep an eye on this story as it continues, 355 00:22:45,040 --> 00:22:49,080 Speaker 1: because I'm very curious to hear how this even happened. 356 00:22:49,359 --> 00:22:52,200 Speaker 1: AOG Technics, from what I understand, has been around since 357 00:22:52,200 --> 00:22:57,399 Speaker 1: twenty fifteen. It's had a kind of questionable history, you know, 358 00:22:57,720 --> 00:23:03,440 Speaker 1: with its own corporate address bouncing around the UK, 359 00:23:04,640 --> 00:23:07,280 Speaker 1: never really staying in one place for any length of time, 360 00:23:07,280 --> 00:23:12,240 Speaker 1: and not apparently attached to an actual physical office. 361 00:23:12,800 --> 00:23:17,359 Speaker 1: So it sounds like there have been some fairly questionable proceedings 362 00:23:17,440 --> 00:23:20,520 Speaker 1: going on for a while, and I'm curious to learn more. 363 00:23:20,560 --> 00:23:25,000 Speaker 1: But I'm sure we'll get more information as the story unfolds. Okay, 364 00:23:25,440 --> 00:23:27,600 Speaker 1: we're going to take another quick break. When we come back, 365 00:23:27,640 --> 00:23:29,639 Speaker 1: I got a few more stories and a couple of 366 00:23:29,640 --> 00:23:42,320 Speaker 1: recommendations for y'all, but first let's thank our sponsors. Okay, 367 00:23:42,320 --> 00:23:47,360 Speaker 1: we're back.
So a few years ago, the video game 368 00:23:47,359 --> 00:23:51,840 Speaker 1: publisher Ubisoft, which is responsible for franchises like Assassin's Creed, 369 00:23:53,080 --> 00:23:57,800 Speaker 1: had a massive scandal, like a really big, nasty scandal 370 00:23:57,840 --> 00:24:01,800 Speaker 1: that involved, you know, corporate culture and 371 00:24:02,040 --> 00:24:06,480 Speaker 1: sexual harassment and preferential treatment and all these sorts of 372 00:24:06,520 --> 00:24:13,640 Speaker 1: allegations that are absolutely terrible. The revelations shook the company significantly, 373 00:24:14,280 --> 00:24:18,199 Speaker 1: with multiple executives being shown the door as a result 374 00:24:18,320 --> 00:24:22,280 Speaker 1: of the allegations. But you could actually say 375 00:24:22,320 --> 00:24:25,800 Speaker 1: that the problems that were later revealed at Activision Blizzard 376 00:24:25,880 --> 00:24:29,440 Speaker 1: ultimately kind of pulled focus away from Ubisoft. Like, 377 00:24:29,600 --> 00:24:32,440 Speaker 1: it was something like a year later when 378 00:24:32,480 --> 00:24:35,760 Speaker 1: the Activision Blizzard story started to break, and it just started 379 00:24:35,760 --> 00:24:38,719 Speaker 1: to look like it wasn't just a terrible systemic problem 380 00:24:38,720 --> 00:24:43,240 Speaker 1: within Ubisoft, but perhaps a wider problem throughout 381 00:24:43,800 --> 00:24:48,320 Speaker 1: parts of the video game development industry. Anyway, the 382 00:24:48,440 --> 00:24:51,359 Speaker 1: new part of the news is that French authorities have 383 00:24:51,480 --> 00:24:56,880 Speaker 1: arrested five former Ubisoft executives.
This is according to Engadget, 384 00:24:57,280 --> 00:25:00,679 Speaker 1: and these include executives who left Ubisoft in disgrace in 385 00:25:00,760 --> 00:25:04,600 Speaker 1: twenty twenty after being named in employee accusations as being 386 00:25:04,600 --> 00:25:09,040 Speaker 1: guilty of, to put it lightly, unprofessional conduct. The French 387 00:25:09,080 --> 00:25:12,199 Speaker 1: authorities said they spent more than a full year investigating 388 00:25:12,200 --> 00:25:15,880 Speaker 1: the various claims and that that led to these arrests. 389 00:25:15,960 --> 00:25:20,320 Speaker 1: Like, these were serious allegations. A lawyer for the plaintiffs 390 00:25:20,640 --> 00:25:23,760 Speaker 1: says that the issues indicate a systemic problem with quote 391 00:25:23,840 --> 00:25:28,240 Speaker 1: unquote sexual violence within the company. I should also add 392 00:25:28,240 --> 00:25:33,479 Speaker 1: that leadership at Ubisoft, after this scandal had become huge 393 00:25:33,600 --> 00:25:38,840 Speaker 1: news, committed to changing corporate culture. I am uncertain how 394 00:25:38,880 --> 00:25:43,159 Speaker 1: far that transformation has actually gone to date. I would 395 00:25:43,200 --> 00:25:46,680 Speaker 1: hope that it's been significant and that there really has 396 00:25:46,880 --> 00:25:50,600 Speaker 1: been massive change within the organization, because the stories that 397 00:25:50,640 --> 00:25:54,800 Speaker 1: were coming out about Ubisoft around twenty twenty were absolutely disgusting. 398 00:25:55,280 --> 00:25:58,480 Speaker 1: So I hope that that actually has been a big change. 399 00:25:58,920 --> 00:26:01,879 Speaker 1: You just didn't see much information about it after the 400 00:26:02,000 --> 00:26:08,399 Speaker 1: initial scandal made headlines.
The United States FCC has told 401 00:26:08,400 --> 00:26:10,800 Speaker 1: Dish Network that it has to fork over one 402 00:26:10,840 --> 00:26:15,920 Speaker 1: hundred and fifty thousand bucks as a fine for littering. Yeah, 403 00:26:16,359 --> 00:26:21,359 Speaker 1: serious stuff. Specifically, this deals with littering space with a 404 00:26:21,400 --> 00:26:25,199 Speaker 1: defunct satellite. The FCC says that Dish Network had 405 00:26:25,240 --> 00:26:29,120 Speaker 1: a responsibility to move a satellite into a so-called 406 00:26:29,280 --> 00:26:33,159 Speaker 1: graveyard orbit at the end of that satellite's functional life. 407 00:26:33,600 --> 00:26:36,840 Speaker 1: But the FCC says Dish Network failed to do that, 408 00:26:37,440 --> 00:26:39,719 Speaker 1: and so the old satellite is now just taking up 409 00:26:39,720 --> 00:26:42,240 Speaker 1: space in an orbit that could pose a future 410 00:26:42,280 --> 00:26:46,520 Speaker 1: hazard for some other spacecraft. It's now space junk, and 411 00:26:46,560 --> 00:26:50,840 Speaker 1: it's in an orbit that should otherwise be usable. So 412 00:26:50,960 --> 00:26:53,439 Speaker 1: this is the first time the FCC has actually fined 413 00:26:53,480 --> 00:26:57,879 Speaker 1: a company for contributing to space debris. The agency only 414 00:26:57,960 --> 00:27:00,440 Speaker 1: recently got the authority to be able to do that, 415 00:27:00,880 --> 00:27:04,399 Speaker 1: but now that means we've got ourselves a precedent, so 416 00:27:04,760 --> 00:27:10,000 Speaker 1: I suspect we will see this used more in the future.
Okay, 417 00:27:10,200 --> 00:27:12,240 Speaker 1: Like I said, I got a couple of article recommendations 418 00:27:12,240 --> 00:27:14,879 Speaker 1: for y'all, but before that, I have one last little 419 00:27:14,880 --> 00:27:17,119 Speaker 1: bit of news, and again this is Apple news, but 420 00:27:17,880 --> 00:27:22,200 Speaker 1: it's a little more light and fluffy, so I thought 421 00:27:22,280 --> 00:27:25,080 Speaker 1: I would leave it to the very end. Apple has 422 00:27:25,119 --> 00:27:30,040 Speaker 1: allowed the luxury solid gold Apple Watch product to wind 423 00:27:30,280 --> 00:27:33,840 Speaker 1: down, as it were. The company has said it will 424 00:27:33,880 --> 00:27:38,520 Speaker 1: no longer offer servicing and repairs on the seventeen thousand 425 00:27:38,520 --> 00:27:42,280 Speaker 1: dollar watch. By the way, this watch has not had 426 00:27:42,400 --> 00:27:46,959 Speaker 1: software support since twenty eighteen, so we're talking about hardware 427 00:27:46,960 --> 00:27:50,880 Speaker 1: support at this point. So Apple now has designated this 428 00:27:51,280 --> 00:27:56,119 Speaker 1: luxury watch as being obsolete, which kind of underlines and 429 00:27:56,240 --> 00:28:01,480 Speaker 1: bolds a massive issue with smart watches compared to a 430 00:28:01,760 --> 00:28:06,520 Speaker 1: traditional luxury watch. So a traditional luxury watch is expected 431 00:28:06,560 --> 00:28:09,000 Speaker 1: to, you know, kind of hold on to its value 432 00:28:09,320 --> 00:28:12,879 Speaker 1: and to operate indefinitely, assuming you treat it well and, 433 00:28:12,960 --> 00:28:14,760 Speaker 1: you know, you occasionally have it serviced to 434 00:28:14,800 --> 00:28:19,640 Speaker 1: tune it up and everything.
But a smart watch loses value 435 00:28:20,280 --> 00:28:23,480 Speaker 1: when the company that made it no longer offers software 436 00:28:23,560 --> 00:28:26,400 Speaker 1: or hardware support, or if the company that made it 437 00:28:26,440 --> 00:28:29,199 Speaker 1: goes out of business, because that support goes away. Like, 438 00:28:29,680 --> 00:28:34,040 Speaker 1: the stuff that is powering and enabling the watch isn't 439 00:28:34,640 --> 00:28:39,400 Speaker 1: necessarily in the watch itself, right? It's provided by a 440 00:28:39,480 --> 00:28:43,040 Speaker 1: company that pushes that stuff out. So when the company 441 00:28:43,040 --> 00:28:46,520 Speaker 1: stops doing that and stops offering support and repairs and such, 442 00:28:47,400 --> 00:28:51,480 Speaker 1: it means that you no longer have a really useful product. 443 00:28:51,720 --> 00:28:54,720 Speaker 1: So it's almost like all those folks who were saying, hey, 444 00:28:55,200 --> 00:28:58,280 Speaker 1: paying seventeen grand for a smart watch is a foolish 445 00:28:58,360 --> 00:29:03,360 Speaker 1: thing to do had a point. Anyway, I guess 446 00:29:03,440 --> 00:29:06,560 Speaker 1: that means now I've just got a really expensive paperweight. 447 00:29:07,400 --> 00:29:09,800 Speaker 1: I'm just kidding. I can't afford anything more than a 448 00:29:09,840 --> 00:29:14,440 Speaker 1: Timex digital watch. They do, they actually still make Timex 449 00:29:14,480 --> 00:29:17,320 Speaker 1: digital watches. I don't even wear a watch. I haven't 450 00:29:17,360 --> 00:29:20,680 Speaker 1: worn a watch since Pebble ceased to exist. I'm rambling. 451 00:29:20,960 --> 00:29:23,400 Speaker 1: We're done here. This is what COVID does to my brain. 452 00:29:23,600 --> 00:29:26,520 Speaker 1: I apologize, but I do have a couple of articles 453 00:29:26,560 --> 00:29:29,280 Speaker 1: I want to recommend.
First up is an article in 454 00:29:29,440 --> 00:29:33,600 Speaker 1: Wired that's titled Men Overran a Job Fair for Women 455 00:29:33,720 --> 00:29:37,440 Speaker 1: in Tech. This article is written by Amanda Hoover, and 456 00:29:37,520 --> 00:29:41,160 Speaker 1: it tells the story of how the Grace Hopper Celebration 457 00:29:41,760 --> 00:29:46,120 Speaker 1: found itself absolutely overflowing with quote unquote self-identifying males 458 00:29:46,600 --> 00:29:50,120 Speaker 1: at an event meant to help women and non-binary 459 00:29:50,280 --> 00:29:54,920 Speaker 1: tech workers advance their careers or establish their careers. Now, 460 00:29:54,920 --> 00:29:59,400 Speaker 1: I'm gonna leave my own commentary right there, except to 461 00:29:59,440 --> 00:30:02,080 Speaker 1: add in the little bit that the event took 462 00:30:02,120 --> 00:30:06,520 Speaker 1: place in Orlando, Florida, and I expect that we're going 463 00:30:06,600 --> 00:30:11,160 Speaker 1: to see various conventions and conferences perhaps seek venues in 464 00:30:11,320 --> 00:30:16,360 Speaker 1: states other than Florida due to various issues within that state. 465 00:30:16,920 --> 00:30:19,680 Speaker 1: But we'll leave it there. Anyway, I recommend that article. Again, 466 00:30:19,720 --> 00:30:22,680 Speaker 1: that one is titled Men Overran a Job Fair for 467 00:30:22,760 --> 00:30:25,360 Speaker 1: Women in Tech, and you can find it in Wired. 468 00:30:26,080 --> 00:30:29,240 Speaker 1: The second article I want to recommend this week is 469 00:30:29,280 --> 00:30:32,760 Speaker 1: from NPR's Jennifer Ludden, so you can find this on 470 00:30:32,920 --> 00:30:37,080 Speaker 1: NPR dot org. The article is titled Los Angeles is 471 00:30:37,160 --> 00:30:41,160 Speaker 1: using AI to predict who might become homeless and help 472 00:30:41,680 --> 00:30:45,040 Speaker 1: before they do.
Now, I wanted to include this article 473 00:30:45,120 --> 00:30:48,200 Speaker 1: because most of the time, y'all, I cover stories that 474 00:30:48,240 --> 00:30:51,480 Speaker 1: put AI in a pretty critical or even negative light. 475 00:30:52,280 --> 00:30:55,880 Speaker 1: But that doesn't necessarily mean AI itself is bad, right? 476 00:30:56,440 --> 00:30:58,880 Speaker 1: AI is not necessarily bad on its own; it's how 477 00:30:58,880 --> 00:31:03,040 Speaker 1: we use it. This article outlines how Los Angeles' Department 478 00:31:03,080 --> 00:31:07,360 Speaker 1: of Health Services is leveraging artificial intelligence to get much-needed 479 00:31:07,440 --> 00:31:12,120 Speaker 1: aid to people before they find themselves homeless. So 480 00:31:12,520 --> 00:31:15,720 Speaker 1: I think it's a worthy read because it does remind 481 00:31:15,800 --> 00:31:20,080 Speaker 1: us that artificial intelligence doesn't have to be the boogeyman. 482 00:31:20,400 --> 00:31:23,080 Speaker 1: It doesn't have to be something that is putting people 483 00:31:23,120 --> 00:31:25,880 Speaker 1: out of work. It doesn't have to be something that is, 484 00:31:26,320 --> 00:31:30,120 Speaker 1: you know, compromising your privacy or security. It can be 485 00:31:30,240 --> 00:31:33,680 Speaker 1: something that ends up being life-changing in a good 486 00:31:33,760 --> 00:31:40,040 Speaker 1: way if we are using it properly and in applications 487 00:31:40,080 --> 00:31:44,280 Speaker 1: that have a positive outcome. So I recommend that article 488 00:31:44,320 --> 00:31:46,720 Speaker 1: as well, especially if you need like a little bit 489 00:31:46,760 --> 00:31:49,480 Speaker 1: of inspiration in your day to think of, hey, this 490 00:31:49,560 --> 00:31:52,520 Speaker 1: is a really cool way to try and help people.
491 00:31:52,960 --> 00:31:55,440 Speaker 1: And it's also very realistic. Like, if you read that article, 492 00:31:55,480 --> 00:31:59,640 Speaker 1: you will see that they are very clear and forthright 493 00:31:59,720 --> 00:32:04,960 Speaker 1: about how their approach is moving the needle. But it's 494 00:32:05,000 --> 00:32:07,960 Speaker 1: not like it's a magic pill that solves all problems. 495 00:32:08,040 --> 00:32:11,400 Speaker 1: So I recommend checking it out. Okay, that's it. I 496 00:32:11,480 --> 00:32:14,160 Speaker 1: gotta stop talking because I keep having to pause this 497 00:32:14,960 --> 00:32:17,880 Speaker 1: show so that I can cough my head off, and 498 00:32:17,960 --> 00:32:19,400 Speaker 1: I just want to be able to go cough my 499 00:32:19,440 --> 00:32:22,400 Speaker 1: head off without having to start up the show again. Also, 500 00:32:22,880 --> 00:32:24,760 Speaker 1: I don't know if you could hear it. My dog 501 00:32:24,920 --> 00:32:27,960 Speaker 1: barked earlier when I was talking. You can tell how 502 00:32:28,000 --> 00:32:30,760 Speaker 1: tired I am by the fact that I didn't go 503 00:32:30,840 --> 00:32:34,680 Speaker 1: back and re-record and have it without my dog 504 00:32:34,720 --> 00:32:37,920 Speaker 1: barking in the background. But when I'm coughing this much, 505 00:32:37,960 --> 00:32:40,720 Speaker 1: I just can't afford to do that. All right, I 506 00:32:40,760 --> 00:32:43,760 Speaker 1: hope all of you out there are doing really well, 507 00:32:44,240 --> 00:32:53,600 Speaker 1: and I'll talk to you again really soon. Tech Stuff 508 00:32:53,720 --> 00:32:58,240 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 509 00:32:58,280 --> 00:33:01,800 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 510 00:33:01,840 --> 00:33:02,800 Speaker 1: your favorite shows.