1 00:00:04,480 --> 00:00:12,680 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,720 --> 00:00:16,120 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,160 --> 00:00:19,159 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:19,200 --> 00:00:21,319 Speaker 1: tech are you? It's time for the tech news for 5 00:00:21,360 --> 00:00:25,599 Speaker 1: the week ending Friday, August twenty third, twenty twenty four. And 6 00:00:25,680 --> 00:00:28,000 Speaker 1: first up, we have what I guess you could call 7 00:00:28,280 --> 00:00:33,000 Speaker 1: non news news. So recently online tech news outlets began 8 00:00:33,080 --> 00:00:37,280 Speaker 1: to report that Google was essentially killing off the Fitbit brand. 9 00:00:37,680 --> 00:00:41,280 Speaker 1: Google acquired Fitbit back in twenty nineteen, and over time, 10 00:00:41,520 --> 00:00:44,280 Speaker 1: the company has incorporated more of the same sort of 11 00:00:44,320 --> 00:00:48,559 Speaker 1: tech found in Fitbit activity trackers into the Google branded 12 00:00:48,680 --> 00:00:51,920 Speaker 1: smart watches. So the reporting seemed to suggest that Google 13 00:00:52,000 --> 00:00:55,840 Speaker 1: was going to sunset the Fitbit brand, or at least 14 00:00:56,040 --> 00:01:00,120 Speaker 1: reduce Fitbit to just a few models of its fitness trackers, 15 00:01:00,160 --> 00:01:05,319 Speaker 1: while the Fitbit branded smart watches like the Sense 16 00:01:05,440 --> 00:01:09,800 Speaker 1: and the Versa would essentially go away. However, Scharon Harding 17 00:01:09,880 --> 00:01:13,199 Speaker 1: of Ars Technica followed up on this. She contacted Google 18 00:01:13,240 --> 00:01:15,960 Speaker 1: directly and asked if, in fact, the company was going 19 00:01:16,000 --> 00:01:19,200 Speaker 1: to reduce the Fitbit line to activity trackers like 20 00:01:19,240 --> 00:01:23,319 Speaker 1: the Inspire and the Charge models. The rep from Google 21 00:01:23,360 --> 00:01:26,880 Speaker 1: said the earlier reporting was inaccurate, that Google, in fact, 22 00:01:27,040 --> 00:01:31,360 Speaker 1: had just released a Fitbit branded smart watch for kids. 23 00:01:31,400 --> 00:01:34,039 Speaker 1: This is like a tongue twister, and I'm terrible at 24 00:01:34,080 --> 00:01:37,679 Speaker 1: those anyways. Harding points out the concerns about Google killing 25 00:01:37,720 --> 00:01:41,200 Speaker 1: off product lines. That's understandable, because Google has a huge 26 00:01:41,280 --> 00:01:43,760 Speaker 1: body count when it comes to products and services that 27 00:01:43,840 --> 00:01:47,520 Speaker 1: it once launched that have long since pushed up the daisies. 28 00:01:47,640 --> 00:01:50,000 Speaker 1: But it seems like Fitbit is not going to be 29 00:01:50,040 --> 00:01:53,880 Speaker 1: one of those, at least according to company representatives. The 30 00:01:54,000 --> 00:01:58,680 Speaker 1: Verge's Lauren Feiner has an article titled Google Sales Reps 31 00:01:58,760 --> 00:02:03,200 Speaker 1: Allegedly Keep Telling Advertisers How to Target Teens. So this 32 00:02:03,280 --> 00:02:06,560 Speaker 1: relates to a story I talked about recently on Tech Stuff.
33 00:02:06,840 --> 00:02:10,280 Speaker 1: Google follows an industry practice that, in theory at least, 34 00:02:10,480 --> 00:02:14,160 Speaker 1: means the company does not engage in targeted or personalized 35 00:02:14,200 --> 00:02:18,200 Speaker 1: advertising for any user under the age of eighteen. However, 36 00:02:18,560 --> 00:02:21,760 Speaker 1: according to multiple sources now, Google has made use of 37 00:02:21,800 --> 00:02:26,320 Speaker 1: a bit of a loophole. The company can target unknown users. 38 00:02:26,639 --> 00:02:29,840 Speaker 1: So these are users who do not have age information 39 00:02:30,560 --> 00:02:33,560 Speaker 1: built into their profiles, like they have not indicated what 40 00:02:33,680 --> 00:02:37,600 Speaker 1: their age is to Google, so there's no confirmation one 41 00:02:37,600 --> 00:02:40,120 Speaker 1: way or the other to say the user is above 42 00:02:40,240 --> 00:02:43,280 Speaker 1: or below the age of eighteen. So you could say 43 00:02:43,280 --> 00:02:46,440 Speaker 1: this is kind of a case of plausible deniability. But 44 00:02:46,520 --> 00:02:49,240 Speaker 1: as I'm sure you're well aware, it really does not 45 00:02:49,440 --> 00:02:52,840 Speaker 1: take much work to get a general feel for someone's 46 00:02:52,880 --> 00:02:56,440 Speaker 1: age based upon their web activity. So even a 47 00:02:56,520 --> 00:03:01,680 Speaker 1: relatively simple data analysis pass can start putting folks into 48 00:03:01,919 --> 00:03:07,040 Speaker 1: various buckets, including by age range. So while technically the 49 00:03:07,200 --> 00:03:11,919 Speaker 1: unknown users aren't registered as teens, you can effectively target 50 00:03:12,040 --> 00:03:16,280 Speaker 1: teens with advertising through this kind of roundabout approach. Feiner 51 00:03:16,400 --> 00:03:20,000 Speaker 1: cites an article in Adweek and another in the Financial Times, 52 00:03:20,000 --> 00:03:23,040 Speaker 1: indicating that this is an issue that multiple outlets have 53 00:03:23,080 --> 00:03:27,520 Speaker 1: looked at recently. Google's response includes a statement from Josell Booth, 54 00:03:27,760 --> 00:03:31,639 Speaker 1: a spokesperson for the company, who categorically emphasized that Google's 55 00:03:31,680 --> 00:03:35,160 Speaker 1: policy is not to personalize advertising for anyone under the 56 00:03:35,200 --> 00:03:38,720 Speaker 1: age of eighteen, regardless of whether it's through direct data 57 00:03:38,840 --> 00:03:41,920 Speaker 1: or inferring age from supporting data, and that the company 58 00:03:41,920 --> 00:03:44,160 Speaker 1: would stress this to sales reps so that, you know, 59 00:03:44,320 --> 00:03:47,360 Speaker 1: they stop suggesting to potential clients that Google can, 60 00:03:47,560 --> 00:03:50,720 Speaker 1: you know, totally get advertisers linked up to impressionable teenagers. 61 00:03:51,000 --> 00:03:55,120 Speaker 1: In other words, knock that stuff off. Alex Heath, also of The Verge, 62 00:03:55,320 --> 00:03:58,000 Speaker 1: has a piece that lets us know reality is about 63 00:03:58,000 --> 00:04:01,520 Speaker 1: to get a bit more augmented. That is, both Snap 64 00:04:01,680 --> 00:04:05,240 Speaker 1: and Meta have plans to unveil AR glasses in the 65 00:04:05,320 --> 00:04:08,680 Speaker 1: upcoming weeks. In Snap's case, this would be the latest 66 00:04:08,760 --> 00:04:12,440 Speaker 1: version of the company's Spectacles product.
This would be generation 67 00:04:12,520 --> 00:04:15,600 Speaker 1: number five for those of y'all keeping count. According to Heath, 68 00:04:15,760 --> 00:04:18,719 Speaker 1: Snap will be showing this off during Snap's Partner Summit 69 00:04:18,920 --> 00:04:23,680 Speaker 1: on September seventeenth. Then just a week later, Mark Zuckerberg 70 00:04:23,760 --> 00:04:27,640 Speaker 1: plans to unveil Meta's own AR glasses, which are code 71 00:04:27,720 --> 00:04:32,320 Speaker 1: named Orion. However, Heath's sources tell him that in neither 72 00:04:32,520 --> 00:04:36,359 Speaker 1: case will these products ever hit the consumer market. These 73 00:04:36,400 --> 00:04:40,400 Speaker 1: are more like Microsoft's HoloLens in that regard. These pieces 74 00:04:40,400 --> 00:04:43,560 Speaker 1: of hardware are really meant to give developers a platform 75 00:04:43,600 --> 00:04:46,400 Speaker 1: to build upon. The issue here is that while the 76 00:04:46,440 --> 00:04:50,400 Speaker 1: potential for AR seems pretty darn limitless, the truth is 77 00:04:50,880 --> 00:04:53,599 Speaker 1: very few apps have been built that leverage AR in 78 00:04:53,640 --> 00:04:56,080 Speaker 1: a way that makes it the best tool for the job. 79 00:04:56,400 --> 00:04:59,679 Speaker 1: As we have seen with Apple's Vision Pro headset, having great 80 00:04:59,680 --> 00:05:03,520 Speaker 1: hardware isn't necessarily enough. You need the applications to 81 00:05:03,520 --> 00:05:06,120 Speaker 1: be there too. So while we should be seeing some 82 00:05:06,200 --> 00:05:10,440 Speaker 1: impressive demonstrations of this technology, assuming everything goes as planned, 83 00:05:10,760 --> 00:05:14,839 Speaker 1: we won't actually be using this stuff anytime soon. Heath 84 00:05:15,000 --> 00:05:18,240 Speaker 1: says that Snap will have around ten thousand units produced 85 00:05:18,400 --> 00:05:21,080 Speaker 1: and Meta will have even fewer than that, and I'm 86 00:05:21,080 --> 00:05:24,080 Speaker 1: sure that won't stop some tech enthusiasts from trying to 87 00:05:24,120 --> 00:05:26,320 Speaker 1: get their hands on these things, despite the fact that 88 00:05:26,400 --> 00:05:28,799 Speaker 1: there just isn't much you can do with them yet. 89 00:05:29,000 --> 00:05:31,839 Speaker 1: Some folks just have an almost pathological need to have 90 00:05:31,880 --> 00:05:34,400 Speaker 1: the latest technology. I should know. I used to be 91 00:05:34,440 --> 00:05:36,640 Speaker 1: one of them. Time to talk about AI for a 92 00:05:36,640 --> 00:05:39,320 Speaker 1: good long while, because I mean, it's twenty twenty four, y'all.
93 00:05:39,480 --> 00:05:42,279 Speaker 1: So first up, multiple news outlets have reported that Meta 94 00:05:42,320 --> 00:05:45,080 Speaker 1: has rolled out a couple of new web crawling bots 95 00:05:45,160 --> 00:05:48,880 Speaker 1: designed to gather data for the purposes of training AI models, 96 00:05:49,040 --> 00:05:52,960 Speaker 1: and further, that these bots ignore attempts to block them. 97 00:05:53,240 --> 00:05:56,760 Speaker 1: So web page builders, in case you've never done this before, 98 00:05:56,880 --> 00:05:59,479 Speaker 1: they have the option to include a little line of 99 00:05:59,520 --> 00:06:03,160 Speaker 1: text that essentially tells bots to buzz off. So you 100 00:06:03,240 --> 00:06:05,200 Speaker 1: might want to do that if you do not want 101 00:06:05,200 --> 00:06:09,000 Speaker 1: your web page indexed for like search purposes. And you 102 00:06:09,080 --> 00:06:11,720 Speaker 1: might want to do that if you only wanted authorized 103 00:06:11,720 --> 00:06:14,440 Speaker 1: individuals to even know about the page in the first place. 104 00:06:14,720 --> 00:06:19,320 Speaker 1: But these bots reportedly ignore these kinds of lines of text, 105 00:06:19,440 --> 00:06:22,760 Speaker 1: and they will crawl a site even if the web 106 00:06:22,800 --> 00:06:26,800 Speaker 1: administrator said, please don't index this site. And I find 107 00:06:26,839 --> 00:06:30,120 Speaker 1: that really interesting, because Meta very much takes the stance 108 00:06:30,160 --> 00:06:33,680 Speaker 1: that crawling sites like Facebook and the like is expressly 109 00:06:33,760 --> 00:06:36,760 Speaker 1: against its rules, that no one is supposed to treat 110 00:06:36,839 --> 00:06:40,920 Speaker 1: Meta that way, and yet here they are producing bots 111 00:06:41,400 --> 00:06:45,080 Speaker 1: that are apparently engaging in the very same behavior that 112 00:06:45,160 --> 00:06:48,719 Speaker 1: Meta prohibits on its owned and operated sites. How about 113 00:06:48,760 --> 00:06:51,640 Speaker 1: that? Anyway, this is really all part of the AI 114 00:06:51,960 --> 00:06:55,479 Speaker 1: arms race, where various AI companies are desperate to get 115 00:06:55,520 --> 00:06:58,520 Speaker 1: ever larger pools of data in order to train their 116 00:06:58,600 --> 00:07:02,520 Speaker 1: large language models and make the next AI tool guaranteed 117 00:07:02,560 --> 00:07:07,600 Speaker 1: to create massive ethical problems. Web administrators can end up 118 00:07:07,600 --> 00:07:10,280 Speaker 1: in a pickle when companies like Meta and Google engage 119 00:07:10,280 --> 00:07:12,760 Speaker 1: in this kind of activity, because often it means that 120 00:07:12,800 --> 00:07:16,680 Speaker 1: if you are successful in blocking the AI crawlers, it 121 00:07:16,720 --> 00:07:20,240 Speaker 1: means you're also having to block the indexing bots, which 122 00:07:20,320 --> 00:07:22,240 Speaker 1: means your site is not going to pop up in 123 00:07:22,400 --> 00:07:27,600 Speaker 1: like search results and such. So website operators, they're pressured 124 00:07:27,800 --> 00:07:32,280 Speaker 1: to allow this AI crawling activity or else potentially miss 125 00:07:32,320 --> 00:07:36,240 Speaker 1: out on being discoverable on these massive platforms.
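(A quick aside for the curious: that little line of text lives in a file called robots.txt at the root of a website. Here's a minimal sketch of what one looks like. The two crawler names are the ones reporting has attributed to Meta's new bots, so treat them as an assumption on my part rather than gospel, and remember the file is a request, not an enforcement mechanism.)

```
# robots.txt, served from the site root (e.g., example.com/robots.txt)
# These User-agent names are reportedly Meta's new AI-training crawlers.
User-agent: Meta-ExternalAgent
Disallow: /

User-agent: Meta-ExternalFetcher
Disallow: /

# The blunt alternative below would block every bot, including search
# indexers, which is exactly the discoverability trade-off described above.
# User-agent: *
# Disallow: /
```

A well-behaved crawler fetches that file first and stays out of anything marked Disallow; nothing technically stops a bot from reading the file and then ignoring it, which is what these bots reportedly do.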
And it 126 00:07:36,240 --> 00:07:38,360 Speaker 1: doesn't do you much good if you built something and 127 00:07:38,400 --> 00:07:41,680 Speaker 1: no one knows about it, right? So it becomes this 128 00:07:41,720 --> 00:07:44,320 Speaker 1: double edged sword. It could be like, well, yeah, I 129 00:07:44,360 --> 00:07:46,520 Speaker 1: want to be in search because I want people to 130 00:07:46,520 --> 00:07:48,280 Speaker 1: be able to find me, but I don't want it 131 00:07:48,320 --> 00:07:50,720 Speaker 1: to be crawled for the purposes of AI. Well, when 132 00:07:50,760 --> 00:07:54,360 Speaker 1: it's the same companies doing both, that becomes a problem, 133 00:07:54,640 --> 00:07:57,440 Speaker 1: and it's kind of like extortion if you think about it. 134 00:07:57,440 --> 00:07:59,440 Speaker 1: It's almost as if some of these big companies act 135 00:07:59,440 --> 00:08:02,280 Speaker 1: in ways that can be a bit anti competitive. Folks 136 00:08:02,320 --> 00:08:04,840 Speaker 1: over at the University of Texas have developed an earthquake 137 00:08:04,880 --> 00:08:08,120 Speaker 1: prediction tool that uses AI to look for signals that 138 00:08:08,240 --> 00:08:12,360 Speaker 1: could indicate an upcoming earthquake, and the results have been promising. 139 00:08:12,560 --> 00:08:16,400 Speaker 1: According to SciTechDaily, the researchers achieved a seventy percent 140 00:08:16,520 --> 00:08:20,640 Speaker 1: accuracy rating in predicting earthquakes a week before the earthquakes 141 00:08:20,760 --> 00:08:24,840 Speaker 1: actually happened. The research was conducted in China. I think 142 00:08:24,840 --> 00:08:27,320 Speaker 1: this is pretty darn cool, but we do need to 143 00:08:27,360 --> 00:08:31,200 Speaker 1: remember this is by no means a perfect tool. It's 144 00:08:31,240 --> 00:08:33,920 Speaker 1: got a long way to go. But according to the researchers, 145 00:08:34,320 --> 00:08:37,600 Speaker 1: there were eight false positives, meaning the tool predicted an 146 00:08:37,640 --> 00:08:41,800 Speaker 1: earthquake but nothing actually happened. That's an issue. Also, the 147 00:08:41,800 --> 00:08:45,520 Speaker 1: predictions weren't exactly laser precise. According to the article, the 148 00:08:45,600 --> 00:08:50,280 Speaker 1: fourteen earthquake predictions that the researchers counted as successes were 149 00:08:50,280 --> 00:08:54,880 Speaker 1: within two hundred miles of an actual earthquake. That does 150 00:08:55,000 --> 00:08:58,000 Speaker 1: make one wonder if the tool was successful at predicting 151 00:08:58,040 --> 00:09:02,160 Speaker 1: the earthquake, or maybe it was just a matter of coincidence. However, 152 00:09:02,240 --> 00:09:05,480 Speaker 1: according to the article, the predicted strength of the earthquakes 153 00:09:05,720 --> 00:09:08,440 Speaker 1: was very close to what actually happened. That makes me 154 00:09:08,559 --> 00:09:11,760 Speaker 1: more inclined to think that this is not just coincidence. 155 00:09:11,800 --> 00:09:13,480 Speaker 1: It's one thing to say an earthquake is going to 156 00:09:13,520 --> 00:09:16,679 Speaker 1: happen on this day, generally around this time, and at 157 00:09:16,720 --> 00:09:19,600 Speaker 1: about this strength.
If you were only getting like an 158 00:09:19,679 --> 00:09:22,280 Speaker 1: earthquake happening within two hundred miles of where you predicted, 159 00:09:22,600 --> 00:09:25,280 Speaker 1: but the strength wasn't anywhere close to what you predicted, 160 00:09:25,520 --> 00:09:28,199 Speaker 1: that to me would feel like coincidence. Getting the strength 161 00:09:28,440 --> 00:09:30,920 Speaker 1: just about right, that seems to whittle that down a bit, 162 00:09:31,120 --> 00:09:33,319 Speaker 1: but it does mean there's a limitation as to how 163 00:09:33,360 --> 00:09:35,880 Speaker 1: precise this tool can be when it comes to locating 164 00:09:35,960 --> 00:09:39,640 Speaker 1: where an earthquake is going to happen. The research obviously 165 00:09:39,679 --> 00:09:41,920 Speaker 1: needs to continue. A lot more work needs to be 166 00:09:41,920 --> 00:09:44,720 Speaker 1: done in order to turn this into a really useful technology. 167 00:09:45,040 --> 00:09:47,640 Speaker 1: As it stands, sending out a warning a week ahead 168 00:09:47,679 --> 00:09:50,800 Speaker 1: of time that someone might be within a two hundred 169 00:09:50,800 --> 00:09:53,840 Speaker 1: mile radius of a future earthquake, that seems limited in 170 00:09:53,880 --> 00:09:56,840 Speaker 1: its usefulness unless we're talking about like a real whopper 171 00:09:56,960 --> 00:09:59,120 Speaker 1: of an earthquake, in which case it could potentially help 172 00:09:59,200 --> 00:10:04,000 Speaker 1: save countless lives, assuming that people actually heeded the warning. Okay, 173 00:10:04,600 --> 00:10:06,960 Speaker 1: we've got a lot more to talk about in today's news, 174 00:10:07,000 --> 00:10:09,559 Speaker 1: but first let's take a quick break to thank our sponsors. 175 00:10:18,960 --> 00:10:22,560 Speaker 1: We're back, and we're headed back to the Verge. Last 176 00:10:22,600 --> 00:10:26,079 Speaker 1: week it was all Ars Technica. This week it's the Verge. Anyway, 177 00:10:26,080 --> 00:10:28,520 Speaker 1: there are a pair of articles covering the same general 178 00:10:28,559 --> 00:10:31,160 Speaker 1: topic over on the Verge. One is by Allison Johnson, 179 00:10:31,280 --> 00:10:34,400 Speaker 1: the other is by Sarah Jeong, but they're both about Google's 180 00:10:34,559 --> 00:10:38,920 Speaker 1: Reimagine function in the new Google Pixel nine smartphones. So 181 00:10:39,000 --> 00:10:41,800 Speaker 1: this feature allows you to make some pretty massive changes 182 00:10:41,840 --> 00:10:44,560 Speaker 1: to photos that you've already taken, and you can use 183 00:10:44,840 --> 00:10:49,120 Speaker 1: text based prompts to have AI alter those images in 184 00:10:49,160 --> 00:10:52,800 Speaker 1: various ways. So one example that the articles used: 185 00:10:52,840 --> 00:10:55,760 Speaker 1: they showed a photo of just a street, just a 186 00:10:55,800 --> 00:10:59,400 Speaker 1: normal street, and then a subsequent text prompt added 187 00:10:59,520 --> 00:11:03,480 Speaker 1: a massive pothole into that street, and sure enough, 188 00:11:03,800 --> 00:11:07,520 Speaker 1: the edited photo looked very convincing, like 189 00:11:07,520 --> 00:11:11,200 Speaker 1: the pothole was actually there. Both Johnson and, to 190 00:11:11,280 --> 00:11:14,160 Speaker 1: a greater extent, Jeong point out that the feature allows 191 00:11:14,160 --> 00:11:18,160 Speaker 1: for fakery and image manipulation on a grand scale.
Once 192 00:11:18,240 --> 00:11:21,079 Speaker 1: upon a time, you needed to be at least proficient 193 00:11:21,200 --> 00:11:25,040 Speaker 1: with tools like Photoshop to manipulate images convincingly, and even 194 00:11:25,040 --> 00:11:29,360 Speaker 1: then there were like telltale signs that some altering had happened. 195 00:11:29,600 --> 00:11:32,640 Speaker 1: But now AI takes care of all of this for you, 196 00:11:32,720 --> 00:11:36,680 Speaker 1: and it can be pretty darn good at fooling folks. Now, 197 00:11:36,679 --> 00:11:39,000 Speaker 1: the AI is supposed to have guardrails that are meant 198 00:11:39,040 --> 00:11:42,959 Speaker 1: to prevent users from doing really awful stuff. Like, you 199 00:11:43,000 --> 00:11:45,920 Speaker 1: wouldn't want someone to have a photo and then use 200 00:11:45,960 --> 00:11:49,080 Speaker 1: AI to just litter the ground with like dead puppies 201 00:11:49,200 --> 00:11:52,040 Speaker 1: or something. That would be horrifying. But the folks at 202 00:11:52,040 --> 00:11:54,079 Speaker 1: the Verge found they were able to insert a great 203 00:11:54,080 --> 00:11:58,080 Speaker 1: deal of troubling imagery into photographs just by adding some 204 00:11:58,280 --> 00:12:01,800 Speaker 1: creative thinking to get around the guardrails. While being direct 205 00:12:01,800 --> 00:12:04,720 Speaker 1: and blunt in your text directions might result in a 206 00:12:04,800 --> 00:12:07,679 Speaker 1: denial saying no, that's against the policy or whatever, if 207 00:12:07,679 --> 00:12:10,320 Speaker 1: you're a little more circumspect, you can often get the 208 00:12:10,360 --> 00:12:13,120 Speaker 1: same results. And so the Verge showed off images that 209 00:12:13,120 --> 00:12:16,280 Speaker 1: appear to portray such disturbing scenes as a collision that 210 00:12:16,320 --> 00:12:19,320 Speaker 1: happened between a car and a bicycle on a city street, 211 00:12:19,640 --> 00:12:22,560 Speaker 1: or images where there appears to be a body lying 212 00:12:22,600 --> 00:12:25,560 Speaker 1: underneath a bloody sheet on the ground. It's not exactly 213 00:12:25,600 --> 00:12:30,680 Speaker 1: the most positive showcase for an image manipulation tool using AI. Moreover, 214 00:12:30,760 --> 00:12:36,280 Speaker 1: even if we remove the obvious cases of like unintended consequences, 215 00:12:36,320 --> 00:12:39,360 Speaker 1: the end result is that this tool means seeing is 216 00:12:39,400 --> 00:12:43,840 Speaker 1: absolutely not believing. When image manipulation is so easy that 217 00:12:44,000 --> 00:12:46,880 Speaker 1: anyone with the right kind of smartphone can do it 218 00:12:47,120 --> 00:12:51,319 Speaker 1: with no training needed, what does that mean for information? 219 00:12:51,559 --> 00:12:54,360 Speaker 1: How do we know what to trust? How could such 220 00:12:54,360 --> 00:12:57,400 Speaker 1: a tool be used to deceive others, either just for 221 00:12:57,520 --> 00:13:01,160 Speaker 1: kicks or for personal gain or whatever? Are the benefits 222 00:13:01,200 --> 00:13:04,920 Speaker 1: of this technology such that they actually outweigh the risks? 223 00:13:05,320 --> 00:13:07,959 Speaker 1: Then there's also the element of the liar's dividend. This 224 00:13:08,000 --> 00:13:11,000 Speaker 1: is the defense that someone who is absolutely guilty of 225 00:13:11,040 --> 00:13:13,920 Speaker 1: something could use.
They could say, oh, sure, it looks 226 00:13:14,000 --> 00:13:17,560 Speaker 1: like that was me robbing that convenience store, but that's 227 00:13:17,600 --> 00:13:22,080 Speaker 1: clearly an AI altered image. I'm innocent. That's the liar's dividend. 228 00:13:22,360 --> 00:13:25,920 Speaker 1: Making matters worse is that Reimagine, at least currently, doesn't 229 00:13:25,920 --> 00:13:30,760 Speaker 1: apply a digital watermark to altered images. Purely AI 230 00:13:30,920 --> 00:13:34,400 Speaker 1: generated images often have a watermark, but not these. 231 00:13:35,080 --> 00:13:37,559 Speaker 1: Johnson points out that the metadata for the image 232 00:13:37,640 --> 00:13:41,400 Speaker 1: includes a record that it was edited through Reimagine. But 233 00:13:41,520 --> 00:13:43,480 Speaker 1: she also points out that you can get around that 234 00:13:43,640 --> 00:13:46,480 Speaker 1: just by taking a screenshot of the photo in question. 235 00:13:46,600 --> 00:13:49,240 Speaker 1: That just strips out all the metadata, because now you 236 00:13:49,440 --> 00:13:51,360 Speaker 1: just have a picture of a picture and there's no 237 00:13:51,480 --> 00:13:55,680 Speaker 1: record there that Reimagine was used to alter it. Blah. 238 00:13:56,600 --> 00:13:58,880 Speaker 1: You might remember that way back when the twenty twenty 239 00:13:58,920 --> 00:14:01,920 Speaker 1: four Democratic primaries were going on, which seems like it 240 00:14:02,000 --> 00:14:04,120 Speaker 1: happened in a different world at this point. But you 241 00:14:04,200 --> 00:14:07,520 Speaker 1: might remember there were reports of an AI generated voice 242 00:14:07,880 --> 00:14:10,800 Speaker 1: that was impersonating US President Joe Biden, and it was 243 00:14:10,840 --> 00:14:13,439 Speaker 1: going out to potential voters in New Hampshire, and the 244 00:14:13,520 --> 00:14:16,680 Speaker 1: voice was urging voters to just stay home and not 245 00:14:16,920 --> 00:14:19,560 Speaker 1: go and vote in the primaries. Well, now the Federal 246 00:14:19,560 --> 00:14:24,120 Speaker 1: Communications Commission, or FCC, has ordered the telecom company Lingo 247 00:14:24,240 --> 00:14:28,400 Speaker 1: Telecom to pay a one million dollar civil penalty for 248 00:14:28,840 --> 00:14:32,080 Speaker 1: allowing those calls to go out over its network. Now, 249 00:14:32,080 --> 00:14:36,880 Speaker 1: to be clear, Lingo Telecom wasn't responsible for creating those calls. 250 00:14:37,160 --> 00:14:40,480 Speaker 1: That honor falls to a political consultant named Steve Kramer, 251 00:14:40,520 --> 00:14:43,680 Speaker 1: who in turn was working on behalf of a candidate 252 00:14:43,800 --> 00:14:47,080 Speaker 1: named Dean Phillips, who was running in opposition to Joe Biden. 253 00:14:47,520 --> 00:14:51,880 Speaker 1: But Lingo Telecom allowed the calls to go over its network, 254 00:14:51,960 --> 00:14:54,720 Speaker 1: which the FCC deemed a violation of the Know 255 00:14:54,880 --> 00:14:58,680 Speaker 1: Your Customer and Know Your Provider sets of rules. Kramer 256 00:14:58,760 --> 00:15:00,960 Speaker 1: is also facing a fine that could be up to around 257 00:15:00,960 --> 00:15:03,760 Speaker 1: six million dollars. These penalties are meant to send a 258 00:15:03,760 --> 00:15:06,280 Speaker 1: message to folks who are considering a similar scheme that 259 00:15:06,320 --> 00:15:07,920 Speaker 1: if you do this kind of thing, it's going to 260 00:15:07,960 --> 00:15:11,040 Speaker 1: cost you.
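(One more aside before we move on, because Johnson's screenshot point is worth making concrete. Here is a minimal sketch in Python of inspecting an image's metadata, assuming the Pillow library and hypothetical file names; the coverage doesn't spell out exactly which tags Reimagine writes, so this just dumps whatever EXIF data a file happens to carry.)

```python
# pip install Pillow
from PIL import Image
from PIL.ExifTags import TAGS

def describe_metadata(path: str) -> None:
    """Print whatever EXIF tags an image file carries, if any."""
    exif = Image.open(path).getexif()
    if not exif:
        print(f"{path}: no EXIF metadata at all")
        return
    for tag_id, value in exif.items():
        # Map numeric EXIF tag IDs to human-readable names where known.
        print(f"{path}: {TAGS.get(tag_id, tag_id)} = {value!r}")

# Hypothetical files: the edited original keeps whatever edit records the
# phone wrote into it, while a screenshot of that photo is a brand-new
# image that never carried any of that history in the first place.
describe_metadata("edited_photo.jpg")
describe_metadata("screenshot.png")
```

The point being: the record of the edit travels inside the original file, and a screenshot is a new file. Anyway, back to those robocalls.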
As for Lingo, the company also agreed to 261 00:15:11,080 --> 00:15:14,160 Speaker 1: make some changes to how it operates, to wit: weeding 262 00:15:14,200 --> 00:15:17,840 Speaker 1: out spoofed phone numbers and only presenting a number when 263 00:15:17,920 --> 00:15:21,480 Speaker 1: Lingo can verify that it's exactly where a call is 264 00:15:21,560 --> 00:15:24,360 Speaker 1: really coming from. Presumably you would otherwise see something like 265 00:15:24,720 --> 00:15:27,360 Speaker 1: unknown caller or something like that on your caller ID. 266 00:15:28,040 --> 00:15:32,000 Speaker 1: Lingo must also verify the identities of customers and work 267 00:15:32,040 --> 00:15:36,440 Speaker 1: with upstream providers that have quote unquote robust robocall mitigation. 268 00:15:36,880 --> 00:15:39,240 Speaker 1: So yeah, this is really sending a message that 269 00:15:39,520 --> 00:15:42,200 Speaker 1: this kind of thing will not be tolerated. In twenty 270 00:15:42,240 --> 00:15:47,480 Speaker 1: twenty three, GM's Cruise business effectively shut down after one 271 00:15:47,480 --> 00:15:51,160 Speaker 1: of its autonomous robotaxis dragged a pedestrian for twenty feet 272 00:15:51,200 --> 00:15:53,680 Speaker 1: in San Francisco before it came to a stop. The 273 00:15:53,720 --> 00:15:56,520 Speaker 1: pedestrian had already been struck by another car; that one 274 00:15:56,760 --> 00:15:59,760 Speaker 1: was operated by a human. Pretty awful, like a really 275 00:15:59,760 --> 00:16:04,600 Speaker 1: horrible sequence of events, and Cruise faced a massive investigation. 276 00:16:05,160 --> 00:16:08,160 Speaker 1: The CEO of the division promptly jumped ship, as did 277 00:16:08,200 --> 00:16:11,240 Speaker 1: several other leaders, and GM laid off a significant number 278 00:16:11,240 --> 00:16:14,920 Speaker 1: of workers within the Cruise division. But now Cruise is 279 00:16:15,040 --> 00:16:18,360 Speaker 1: back in the news, having struck a partnership deal with Uber. 280 00:16:18,920 --> 00:16:23,160 Speaker 1: New CEO Marc Whitten said, quote, we are excited to 281 00:16:23,200 --> 00:16:26,520 Speaker 1: partner with Uber to bring the benefits of safe, reliable 282 00:16:26,640 --> 00:16:30,520 Speaker 1: autonomous driving to even more people, unlocking a new era 283 00:16:30,680 --> 00:16:33,920 Speaker 1: of urban mobility, end quote. Which is interesting to me. 284 00:16:34,360 --> 00:16:40,240 Speaker 1: Uber itself really was aggressively pursuing robotaxi strategies several years ago, 285 00:16:40,320 --> 00:16:43,040 Speaker 1: because, I mean, cutting human drivers out of the equation 286 00:16:43,240 --> 00:16:45,240 Speaker 1: means more money for the home office. Am I right? 287 00:16:45,600 --> 00:16:49,000 Speaker 1: But snarky comments aside, Uber pretty much pulled the plug 288 00:16:49,160 --> 00:16:52,720 Speaker 1: on its own efforts after a tragic incident in twenty 289 00:16:52,920 --> 00:16:57,840 Speaker 1: eighteen involving another pedestrian accident. Uber then switched to partnering 290 00:16:57,880 --> 00:17:00,840 Speaker 1: with companies in the autonomous vehicle space rather than pursuing 291 00:17:00,880 --> 00:17:03,600 Speaker 1: its own program. So can these two companies, each with 292 00:17:03,720 --> 00:17:07,119 Speaker 1: blemishes on their respective records, team up to create something 293 00:17:07,160 --> 00:17:11,680 Speaker 1: that's safe and reliable?
Cruise is currently conducting autonomous vehicle 294 00:17:11,720 --> 00:17:16,400 Speaker 1: testing with supervising safety drivers in cities like Houston and Phoenix. 295 00:17:16,600 --> 00:17:19,359 Speaker 1: No word yet on when those driverless Uber rides 296 00:17:19,400 --> 00:17:22,960 Speaker 1: will become a reality, or specifically which markets that might 297 00:17:23,040 --> 00:17:25,680 Speaker 1: happen in. I know what you're thinking: you know, Jonathan, 298 00:17:25,760 --> 00:17:27,720 Speaker 1: it's been a hot minute since I've learned about a 299 00:17:27,760 --> 00:17:31,399 Speaker 1: new streaming video service launching. Well, rumor has it that 300 00:17:31,480 --> 00:17:34,679 Speaker 1: we might be getting yet another one from a seemingly 301 00:17:34,880 --> 00:17:38,080 Speaker 1: unlikely source. That source is Chick fil A, the fast 302 00:17:38,119 --> 00:17:41,840 Speaker 1: food restaurant known for chicken, among other things. Deadline's Peter 303 00:17:41,920 --> 00:17:44,680 Speaker 1: White reports that Chick fil A is planning a service 304 00:17:44,760 --> 00:17:48,879 Speaker 1: centered primarily around reality television and unscripted content and game 305 00:17:48,880 --> 00:17:52,959 Speaker 1: show programming, all with like a family friendly focus. Presumably 306 00:17:53,000 --> 00:17:55,720 Speaker 1: the programming on this service would in some way advertise 307 00:17:55,800 --> 00:17:59,000 Speaker 1: or promote the company, perhaps through the production of branded content. 308 00:17:59,320 --> 00:18:01,479 Speaker 1: How many folks are out there itching to sign up 309 00:18:01,520 --> 00:18:04,359 Speaker 1: to yet another streaming service, let alone one spearheaded by 310 00:18:04,359 --> 00:18:06,679 Speaker 1: a restaurant company? I have no clue. I'm not going 311 00:18:06,680 --> 00:18:08,359 Speaker 1: to write it off just yet, as it could always 312 00:18:08,359 --> 00:18:10,600 Speaker 1: surprise me. But my first impression is this is going 313 00:18:10,640 --> 00:18:13,480 Speaker 1: to be a very tough sell, particularly during a time 314 00:18:13,480 --> 00:18:15,760 Speaker 1: when people are already taking a harder look at their 315 00:18:15,760 --> 00:18:19,639 Speaker 1: family budgets for stuff like entertainment. Jesse Kipf is in 316 00:18:19,680 --> 00:18:22,360 Speaker 1: hot water for hacking into a government registry in Hawaii. 317 00:18:22,520 --> 00:18:26,040 Speaker 1: For what purpose? Faking his own death. Kipf hacked into 318 00:18:26,080 --> 00:18:29,240 Speaker 1: the Hawaii death registry system and marked himself down as 319 00:18:29,320 --> 00:18:33,720 Speaker 1: previously alive, or no longer breathing, or dead as a doornail. 320 00:18:33,800 --> 00:18:36,280 Speaker 1: To be honest, I don't know what the checkboxes actually say, 321 00:18:36,480 --> 00:18:38,880 Speaker 1: but the point is, Kipf was faking his own death. 322 00:18:38,960 --> 00:18:41,359 Speaker 1: He was also using fake credentials in an effort to 323 00:18:41,440 --> 00:18:44,800 Speaker 1: secure a credit card or debit card account. It's 324 00:18:44,920 --> 00:18:47,439 Speaker 1: awfully hard to navigate the modern world unless you've got 325 00:18:47,440 --> 00:18:50,600 Speaker 1: access to that cheddar. So why was Kipf doing all this? Well, 326 00:18:50,640 --> 00:18:52,719 Speaker 1: apparently it was in order to avoid having to make 327 00:18:52,800 --> 00:18:55,879 Speaker 1: child support payments.
He's already been tried and found guilty. 328 00:18:55,880 --> 00:18:59,040 Speaker 1: He faces a prison term of sixty nine months. Nice. 329 00:18:59,359 --> 00:19:01,720 Speaker 1: He'll have to serve eighty five percent of that sentence, 330 00:19:01,800 --> 00:19:03,880 Speaker 1: after which he will be released, but will remain under 331 00:19:03,880 --> 00:19:08,280 Speaker 1: supervision for three years. Now for some recommended reading. So 332 00:19:08,640 --> 00:19:11,120 Speaker 1: first, I recommend checking out Eric Berger's piece for Ars 333 00:19:11,160 --> 00:19:15,240 Speaker 1: Technica titled Against All Odds, an Asteroid Mining Company Appears 334 00:19:15,280 --> 00:19:17,640 Speaker 1: to Be Making Headway, which is a cool story. We've 335 00:19:17,640 --> 00:19:21,280 Speaker 1: been hearing about the potential for asteroid mining for several years now, 336 00:19:21,359 --> 00:19:24,639 Speaker 1: so it's neat getting an update. Next up, Patrick George 337 00:19:24,640 --> 00:19:26,800 Speaker 1: has a piece in the Atlantic titled The Hardest Sell 338 00:19:26,880 --> 00:19:29,720 Speaker 1: in American Car Culture, and it's about how the Ford 339 00:19:29,800 --> 00:19:32,800 Speaker 1: Motor Company wants to encourage American car shoppers to think 340 00:19:32,840 --> 00:19:36,080 Speaker 1: about smaller vehicles rather than the trucks and SUVs the 341 00:19:36,119 --> 00:19:38,560 Speaker 1: industry has kind of migrated to in the US over 342 00:19:38,600 --> 00:19:42,080 Speaker 1: recent years. And that's because smaller cars are lighter. 343 00:19:42,240 --> 00:19:44,679 Speaker 1: Lighter cars are easier to move, and that means the 344 00:19:44,720 --> 00:19:49,080 Speaker 1: battery requirements for EVs that are smaller are more manageable 345 00:19:49,240 --> 00:19:51,679 Speaker 1: than for those big old chonkers that are currently favored 346 00:19:51,680 --> 00:19:53,920 Speaker 1: in the US. So if the US is to move 347 00:19:54,000 --> 00:19:56,560 Speaker 1: to more EVs, part of the picture may also mean 348 00:19:56,640 --> 00:20:00,440 Speaker 1: driving smaller vehicles. That's it for this week. I hope 349 00:20:00,480 --> 00:20:02,960 Speaker 1: you are all well, and I'll talk to you again 350 00:20:03,560 --> 00:20:13,520 Speaker 1: really soon. Tech Stuff is an iHeartRadio production. For more 351 00:20:13,600 --> 00:20:18,320 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 352 00:20:18,359 --> 00:20:20,280 Speaker 1: wherever you listen to your favorite shows.