1 00:00:04,480 --> 00:00:12,479 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,520 --> 00:00:15,920 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,960 --> 00:00:19,479 Speaker 1: I'm an executive producer with iHeart Podcasts and How the 4 00:00:19,560 --> 00:00:21,919 Speaker 1: Tech are You. It's time for the tech news for 5 00:00:21,960 --> 00:00:26,639 Speaker 1: the week ending on Friday, September twenty seventh, twenty twenty four. 6 00:00:27,120 --> 00:00:31,560 Speaker 1: And as the great professional wrestler Shane Helms used to say, 7 00:00:32,240 --> 00:00:37,000 Speaker 1: stand back, there's a hurricane coming through. Because that's right, 8 00:00:37,040 --> 00:00:40,000 Speaker 1: folks, here in Atlanta, Georgia, where I am, we're in 9 00:00:40,040 --> 00:00:43,040 Speaker 1: the path of a hurricane, and out of an over 10 00:00:43,120 --> 00:00:47,240 Speaker 1: abundance of caution, and also from some gentle prodding from 11 00:00:47,360 --> 00:00:51,000 Speaker 1: super producer Tari, we've decided to write and record the 12 00:00:51,040 --> 00:00:55,240 Speaker 1: tech news episode on Thursday. Now, I usually put these 13 00:00:55,280 --> 00:00:59,240 Speaker 1: things together on Friday morning and then record them after that, 14 00:00:59,360 --> 00:01:02,120 Speaker 1: so this means there could be some fresh tech news 15 00:01:02,160 --> 00:01:04,920 Speaker 1: that doesn't make it into this lineup because Tari and 16 00:01:04,959 --> 00:01:09,040 Speaker 1: I were busy, you know, laying out sandbags or clearing 17 00:01:09,120 --> 00:01:11,679 Speaker 1: trees off the road or something along those lines. I 18 00:01:11,720 --> 00:01:13,640 Speaker 1: hope anyone else out there in the path of the 19 00:01:13,720 --> 00:01:16,800 Speaker 1: hurricane is safe and sound. I really hope you're all 20 00:01:16,880 --> 00:01:18,880 Speaker 1: safe and sound, whether you're in the path of the 21 00:01:18,959 --> 00:01:22,720 Speaker 1: hurricane or not. It's just particularly stressful for those of 22 00:01:22,800 --> 00:01:26,000 Speaker 1: us who are looking at a big, old, massive storm 23 00:01:26,080 --> 00:01:30,399 Speaker 1: system rolling in. Anyway, let's get to the tech news. First up: 24 00:01:30,640 --> 00:01:35,880 Speaker 1: Meta held its Connect conference this week and CEO Mark 25 00:01:36,000 --> 00:01:38,440 Speaker 1: Zuckerberg got up in front of the crowd, and he 26 00:01:38,640 --> 00:01:42,199 Speaker 1: was his usual humble self. He wore a T shirt 27 00:01:42,400 --> 00:01:47,920 Speaker 1: that read aut Zuck aut nihil, which I have totally mispronounced, 28 00:01:47,920 --> 00:01:50,640 Speaker 1: but that's because I didn't take Latin in school. But 29 00:01:50,680 --> 00:01:55,280 Speaker 1: it is Latin for all Zuck or all nothing. Typically, 30 00:01:55,520 --> 00:02:00,320 Speaker 1: you would actually see this phrase associated with Caesar, not Zuck, 31 00:02:00,880 --> 00:02:05,720 Speaker 1: and it often was associated with a particularly humble Renaissance 32 00:02:05,920 --> 00:02:12,040 Speaker 1: prince named Cesare Borgia, known for being a little bit 33 00:02:12,120 --> 00:02:15,359 Speaker 1: full of himself. I honestly think that Zuckerberg should sit 34 00:02:15,400 --> 00:02:17,960 Speaker 1: down with Francis Ford Coppola and the two of them 35 00:02:18,040 --> 00:02:21,160 Speaker 1: could have a nice long conversation about hubris.
Also, they 36 00:02:21,160 --> 00:02:23,359 Speaker 1: could talk about the Roman Empire, because the two of 37 00:02:23,400 --> 00:02:26,720 Speaker 1: them both seem to be obsessed with it. I mean, 38 00:02:26,760 --> 00:02:31,400 Speaker 1: Megalopolis is proof of that. Yeah. Anyway, enough commentary and 39 00:02:31,480 --> 00:02:35,680 Speaker 1: shade from yours truly. So what went down at the 40 00:02:35,760 --> 00:02:41,399 Speaker 1: Connect conference? Well, perhaps the buzziest bit was about Meta's 41 00:02:41,480 --> 00:02:47,040 Speaker 1: prototype augmented reality glasses called Orion. Zuckerberg showed them off 42 00:02:47,040 --> 00:02:51,400 Speaker 1: and good golly, they are bulky. Now, The Verge gave 43 00:02:51,440 --> 00:02:55,400 Speaker 1: them some praise. The Verge said, quote, they look almost 44 00:02:55,440 --> 00:02:57,799 Speaker 1: like a trendy pair of frames you could pick up 45 00:02:58,000 --> 00:03:02,760 Speaker 1: without all the tech inside, end quote. Personally, however, I think 46 00:03:02,800 --> 00:03:05,640 Speaker 1: they look way too chonky for that. They look to 47 00:03:05,639 --> 00:03:07,600 Speaker 1: me like the kind of glasses that you might see 48 00:03:07,600 --> 00:03:10,720 Speaker 1: in a comic strip where the artist drawing the comic 49 00:03:10,720 --> 00:03:14,520 Speaker 1: strip uses really heavy lines for stuff like that. The 50 00:03:14,520 --> 00:03:18,680 Speaker 1: glasses house LED projectors, which can beam images to display 51 00:03:18,680 --> 00:03:21,239 Speaker 1: in front of your eyes on the lenses, and that's 52 00:03:21,280 --> 00:03:25,040 Speaker 1: what gives you the augmented experience. But they're not intended 53 00:03:25,200 --> 00:03:29,320 Speaker 1: to be consumer products just yet. For one thing, even 54 00:03:29,400 --> 00:03:32,280 Speaker 1: with the bulky frame, you still have to carry a 55 00:03:32,360 --> 00:03:36,040 Speaker 1: couple of peripherals for this tech to actually work. That 56 00:03:36,080 --> 00:03:40,560 Speaker 1: includes what is called a neural wristband, which apparently is 57 00:03:40,640 --> 00:03:44,440 Speaker 1: needed so that you can do some gesture controls. And 58 00:03:44,720 --> 00:03:49,320 Speaker 1: there's also a compute puck, which I'm guessing offloads some 59 00:03:49,480 --> 00:03:52,840 Speaker 1: of the processing requirements needed for the glasses to work. 60 00:03:53,120 --> 00:03:55,280 Speaker 1: I think that makes sense. We are not at a 61 00:03:55,320 --> 00:03:58,200 Speaker 1: point where you can miniaturize all these elements so that 62 00:03:58,240 --> 00:04:02,720 Speaker 1: they fit solely within the frames of even some chunky glasses. 63 00:04:03,120 --> 00:04:05,120 Speaker 1: It isn't quite slimmed down to a point where the 64 00:04:05,120 --> 00:04:08,200 Speaker 1: average person would vibe with it, I think. I don't 65 00:04:08,240 --> 00:04:10,680 Speaker 1: think people would want to think, oh, here are my 66 00:04:10,760 --> 00:04:14,880 Speaker 1: cool glasses, where's my wristband and my compute puck so 67 00:04:14,920 --> 00:04:18,839 Speaker 1: that I can use them? Plus, according to Casey Newton's newsletter, 68 00:04:19,160 --> 00:04:23,080 Speaker 1: each unit costs somewhere in the neighborhood of ten thousand dollars, 69 00:04:23,480 --> 00:04:26,720 Speaker 1: which means it is certainly a non-starter as far 70 00:04:26,760 --> 00:04:30,400 Speaker 1: as consumer tech is concerned.
Meta did show off the 71 00:04:30,400 --> 00:04:34,240 Speaker 1: next generation of its Ray-Ban smart glasses. These don't 72 00:04:34,279 --> 00:04:37,880 Speaker 1: have the versatility of Orion, but they still incorporate some 73 00:04:38,080 --> 00:04:42,400 Speaker 1: smart technology and AR-related features in them. Meta pointed 74 00:04:42,400 --> 00:04:45,920 Speaker 1: out how AI functionality means you can use these glasses 75 00:04:45,960 --> 00:04:47,560 Speaker 1: to do stuff like stay on top of your to 76 00:04:47,600 --> 00:04:50,200 Speaker 1: do list and that kind of thing. Meta also showed 77 00:04:50,240 --> 00:04:53,560 Speaker 1: off new VR headsets in the Quest line. Or the 78 00:04:53,880 --> 00:04:58,000 Speaker 1: one headset anyway. It's the Quest three S, and it's 79 00:04:58,000 --> 00:05:00,680 Speaker 1: intended to be kind of an entry point headset for 80 00:05:00,760 --> 00:05:03,880 Speaker 1: the VR space. It will cost two hundred ninety nine 81 00:05:03,920 --> 00:05:07,279 Speaker 1: dollars and ninety nine cents, so three hundred bucks. Meta 82 00:05:07,360 --> 00:05:11,400 Speaker 1: is ditching the Quest two and the Quest Pro lines 83 00:05:11,440 --> 00:05:14,679 Speaker 1: of products, so now the choices are simplified to either 84 00:05:14,760 --> 00:05:19,320 Speaker 1: the entry-level Quest three S or the more expensive Quest three, 85 00:05:19,760 --> 00:05:23,080 Speaker 1: which now moves from being six hundred and fifty bucks 86 00:05:23,200 --> 00:05:26,280 Speaker 1: to five hundred bucks, or technically four hundred ninety nine 87 00:05:26,480 --> 00:05:30,200 Speaker 1: dollars and ninety nine cents. Zuckerberg also revealed that Meta is 88 00:05:30,240 --> 00:05:34,840 Speaker 1: introducing some AI generated content on Facebook and Instagram, which 89 00:05:34,880 --> 00:05:39,279 Speaker 1: sounds absolutely horrible to me. Why would I want to 90 00:05:39,320 --> 00:05:42,600 Speaker 1: go to these things and see stuff posted not even 91 00:05:42,600 --> 00:05:45,760 Speaker 1: by people I've heard of, let alone people I know, right? 92 00:05:46,240 --> 00:05:48,600 Speaker 1: Why do I want to see AI generated stuff? Maybe 93 00:05:48,640 --> 00:05:51,279 Speaker 1: I'm just missing the point. I wasn't at the conference. 94 00:05:51,320 --> 00:05:53,000 Speaker 1: Maybe if I were, I'd be like, oh, I get 95 00:05:53,040 --> 00:05:54,880 Speaker 1: it now. But right now it just strikes me as 96 00:05:54,960 --> 00:05:58,800 Speaker 1: kind of cold and pointless. Anyway, it's supposed to be 97 00:05:58,800 --> 00:06:03,120 Speaker 1: in a new section called Imagined for You, and supposedly 98 00:06:03,160 --> 00:06:06,200 Speaker 1: it will end up containing content that Meta thinks you 99 00:06:06,320 --> 00:06:10,520 Speaker 1: want or need based upon your behaviors. So it's 100 00:06:10,520 --> 00:06:13,960 Speaker 1: supposed to be really curated and invented essentially for you, 101 00:06:14,520 --> 00:06:17,880 Speaker 1: which, yay, I don't know. This doesn't strike me as 102 00:06:18,160 --> 00:06:24,480 Speaker 1: particularly good. Meta also introduced some AI impersonations of celebrity 103 00:06:24,560 --> 00:06:27,479 Speaker 1: voices in Meta AI, so if you wanted to chat 104 00:06:27,880 --> 00:06:32,480 Speaker 1: with a robot version of, say, John Cena or Awkwafina 105 00:06:32,760 --> 00:06:35,320 Speaker 1: or Kristen Bell, then you could do that.
The Kristen 105 00:06:35,360 --> 00:06:38,800 Speaker 1: Bell thing is particularly weird because not very long ago 106 00:06:38,920 --> 00:06:42,760 Speaker 1: Bell spoke out against Meta mining user data for the 107 00:06:42,760 --> 00:06:46,800 Speaker 1: purposes of training artificial intelligence. But I guess she must 108 00:06:46,800 --> 00:06:49,320 Speaker 1: have hashed all that out because her voice has become 109 00:06:49,360 --> 00:06:51,360 Speaker 1: one of the options that people can choose when they're 110 00:06:51,400 --> 00:06:54,279 Speaker 1: interacting with Meta AI. There was some other stuff 111 00:06:54,320 --> 00:06:57,320 Speaker 1: going on at the Connect keynote, including a revamp of 112 00:06:57,360 --> 00:07:01,720 Speaker 1: avatar design for Meta's various platforms, which includes the Horizon 113 00:07:01,800 --> 00:07:05,160 Speaker 1: operating system, which is what they use for VR. Meta 114 00:07:05,200 --> 00:07:08,960 Speaker 1: still seems to be pushing to make the metaverse happen. 115 00:07:09,320 --> 00:07:12,920 Speaker 1: I think we need to tell Gretchen that, like fetch, 116 00:07:13,280 --> 00:07:15,720 Speaker 1: the metaverse just isn't going to happen, or at least 117 00:07:15,720 --> 00:07:18,680 Speaker 1: I hope it doesn't. All right, that's enough about the 118 00:07:18,680 --> 00:07:21,320 Speaker 1: Connect conference, but we still have a little bit more 119 00:07:21,360 --> 00:07:24,720 Speaker 1: about Meta. So Mark Zuckerberg sat down for an interview 120 00:07:24,720 --> 00:07:27,720 Speaker 1: with Alex Heath of The Verge, and among other things, 121 00:07:28,040 --> 00:07:31,480 Speaker 1: he stressed that research has failed to establish a causal 122 00:07:31,600 --> 00:07:35,720 Speaker 1: relationship between social media use and a negative impact on 123 00:07:36,160 --> 00:07:38,760 Speaker 1: mental health. Now, if you've been listening to my recent 124 00:07:38,840 --> 00:07:42,080 Speaker 1: episodes on Tech Stuff, you probably have heard me say 125 00:07:42,120 --> 00:07:45,160 Speaker 1: something similar to this. And that doesn't mean that there 126 00:07:45,400 --> 00:07:49,400 Speaker 1: is no causal relationship between the two, but we have 127 00:07:49,480 --> 00:07:53,679 Speaker 1: a lack of high-quality studies that establish such a connection. 128 00:07:54,040 --> 00:07:57,480 Speaker 1: There may be correlation between poor mental health and social 129 00:07:57,520 --> 00:08:00,560 Speaker 1: media use, but that's not the same thing as causation. 130 00:08:00,960 --> 00:08:02,920 Speaker 1: I still think it's a big leap to go from 131 00:08:03,160 --> 00:08:07,520 Speaker 1: studies haven't really shown a causal connection to stating outright 132 00:08:07,560 --> 00:08:11,240 Speaker 1: that there is no causal connection. Those two things are 133 00:08:11,280 --> 00:08:13,840 Speaker 1: not synonymous. I don't think we can go so far 134 00:08:13,920 --> 00:08:18,520 Speaker 1: as to deny any causal connection, particularly since studies that 135 00:08:18,640 --> 00:08:23,360 Speaker 1: involve mental health are always tricky, because isolating all the 136 00:08:23,440 --> 00:08:28,200 Speaker 1: variables in order to establish relationships between even, like, disturbing 137 00:08:28,280 --> 00:08:31,920 Speaker 1: behaviors and mental health is not always straightforward or easy. 138 00:08:32,240 --> 00:08:36,040 Speaker 1: Sometimes that stuff is really complicated.
Now, Zuckerberg did say 139 00:08:36,120 --> 00:08:38,400 Speaker 1: that he felt Meta can play a part in protecting 140 00:08:38,440 --> 00:08:41,600 Speaker 1: mental health by giving parents more controls relating to how 141 00:08:41,640 --> 00:08:44,160 Speaker 1: their kids are able to access social media, which is 142 00:08:44,240 --> 00:08:47,679 Speaker 1: at least a start. I recommend watching the whole interview, 143 00:08:47,960 --> 00:08:50,600 Speaker 1: which is both on The Verge and also The Verge's 144 00:08:50,640 --> 00:08:54,679 Speaker 1: YouTube channel. The interview itself is titled Why Mark Zuckerberg 145 00:08:54,720 --> 00:08:57,840 Speaker 1: thinks AR glasses will replace your phone, but it's about way 146 00:08:57,920 --> 00:09:02,200 Speaker 1: more than just the AR glasses. All right, and back 147 00:09:02,240 --> 00:09:04,360 Speaker 1: to that whole Kristen Bell thing, because it relates to 148 00:09:04,400 --> 00:09:08,520 Speaker 1: another story. She posted a message on Instagram which I 149 00:09:08,640 --> 00:09:11,679 Speaker 1: have seen going around a little bit now, and it's 150 00:09:11,720 --> 00:09:15,120 Speaker 1: one in which folks attempt to stave off Meta crawling 151 00:09:15,160 --> 00:09:18,760 Speaker 1: their feeds for the purposes of training AI models. So 152 00:09:19,280 --> 00:09:23,240 Speaker 1: it's essentially just a variation of other viral messages that 153 00:09:23,520 --> 00:09:26,600 Speaker 1: people were posting to their feeds, either on Instagram or 154 00:09:26,600 --> 00:09:29,400 Speaker 1: on Facebook or both, and they were meant to prevent 155 00:09:29,480 --> 00:09:33,440 Speaker 1: Meta from doing stuff like selling their personal information for 156 00:09:33,480 --> 00:09:37,400 Speaker 1: the purposes of advertising. But like those cases, posting this 157 00:09:37,520 --> 00:09:41,640 Speaker 1: makes no difference whatsoever. It does not prevent Meta from 158 00:09:41,760 --> 00:09:44,760 Speaker 1: using your information to train AI. It doesn't prevent them 159 00:09:44,760 --> 00:09:48,000 Speaker 1: from selling your data for the purposes of ads or whatever, 160 00:09:48,200 --> 00:09:52,040 Speaker 1: or exploiting your data in any other way, because posting 161 00:09:52,240 --> 00:09:54,840 Speaker 1: that you're not giving Meta any permission to use your 162 00:09:54,920 --> 00:09:58,520 Speaker 1: data means nothing, and that's because the terms of service 163 00:09:58,880 --> 00:10:03,040 Speaker 1: already grant Meta all those permissions. The fact is that you're 164 00:10:03,200 --> 00:10:05,599 Speaker 1: using the service, and in order to sign up for 165 00:10:05,640 --> 00:10:07,960 Speaker 1: the service, you had to agree to their terms of service, 166 00:10:08,000 --> 00:10:11,200 Speaker 1: which you're kind of grandfathered into. Like, whenever they change it, 167 00:10:11,240 --> 00:10:15,040 Speaker 1: your agreement tends to be carried over into the new version, 168 00:10:15,080 --> 00:10:18,600 Speaker 1: which seems odd, right? Like, it's the whole Darth Vader, 169 00:10:18,640 --> 00:10:20,880 Speaker 1: I have altered the deal, pray I do not alter 170 00:10:21,000 --> 00:10:24,120 Speaker 1: it further. But yeah, the fact that you have agreed 171 00:10:24,160 --> 00:10:26,480 Speaker 1: to the terms of service means that Meta has that 172 00:10:26,520 --> 00:10:32,040 Speaker 1: permission already, so denying it on your feed does nothing.
174 00:10:32,120 --> 00:10:35,599 Speaker 1: In fact, the only way to prevent Meta from exploiting 175 00:10:35,600 --> 00:10:40,840 Speaker 1: your data is to either totally delete your profile and 176 00:10:40,960 --> 00:10:44,240 Speaker 1: just leave the platform entirely, or set everything to private, 177 00:10:44,720 --> 00:10:48,360 Speaker 1: at least for the purposes of crawling for AI training. 178 00:10:48,600 --> 00:10:51,920 Speaker 1: Meta won't touch stuff that's set to private as opposed 179 00:10:51,960 --> 00:10:54,560 Speaker 1: to public, so your private messages to your friends and 180 00:10:54,600 --> 00:10:57,880 Speaker 1: such are not available for Meta to use, but 181 00:10:57,960 --> 00:11:01,160 Speaker 1: everything else is fair game. All right, we've got more 182 00:11:01,240 --> 00:11:04,360 Speaker 1: tech news to cover. Before that, let's take a quick break. 183 00:11:13,720 --> 00:11:18,120 Speaker 1: We're back. OpenAI is planning to restructure the company 184 00:11:18,400 --> 00:11:21,880 Speaker 1: and essentially ditch all pretense that it bears any resemblance 185 00:11:21,920 --> 00:11:25,080 Speaker 1: to the original concept for OpenAI. All right, some 186 00:11:25,160 --> 00:11:29,200 Speaker 1: explanation is needed here. So originally OpenAI was intended 187 00:11:29,240 --> 00:11:32,600 Speaker 1: to be a not for profit organization that would maintain 188 00:11:32,640 --> 00:11:36,080 Speaker 1: a high level of transparency while developing AI in a 189 00:11:36,120 --> 00:11:40,080 Speaker 1: responsible manner, and that it would openly share this research, 190 00:11:40,200 --> 00:11:42,960 Speaker 1: thus the name OpenAI. They said there was a 191 00:11:43,000 --> 00:11:47,720 Speaker 1: need for AI advancements to be helpful, to cause 192 00:11:47,760 --> 00:11:50,360 Speaker 1: as little harm as possible, and to be developed in 193 00:11:50,559 --> 00:11:53,560 Speaker 1: a safe way. But it turns out being a not 194 00:11:53,679 --> 00:11:57,480 Speaker 1: for profit organization means you don't make a lot of money. 195 00:11:57,559 --> 00:12:02,400 Speaker 1: You're constantly scrambling for investment into the organization so that 196 00:12:02,440 --> 00:12:05,280 Speaker 1: you can conduct your research, because AI R and D 197 00:12:05,600 --> 00:12:10,160 Speaker 1: is really expensive. So this restructuring would mean that the 198 00:12:10,360 --> 00:12:15,200 Speaker 1: for-profit arm of OpenAI, which launched several years ago, 199 00:12:15,640 --> 00:12:20,560 Speaker 1: would essentially cut ties with the nonprofit board of directors, 200 00:12:20,960 --> 00:12:24,960 Speaker 1: and previously that board of directors oversaw the operations of 201 00:12:25,000 --> 00:12:28,920 Speaker 1: the company. You might even recall that an earlier incarnation 202 00:12:29,200 --> 00:12:33,160 Speaker 1: of this very same nonprofit board of directors voted to 203 00:12:33,320 --> 00:12:37,760 Speaker 1: remove OpenAI's CEO Sam Altman from the company, though 204 00:12:37,880 --> 00:12:41,000 Speaker 1: the board would later reverse that decision, and then essentially 205 00:12:41,040 --> 00:12:44,240 Speaker 1: that board dissolved and a new one took its place.
206 00:12:44,600 --> 00:12:48,680 Speaker 1: But this restructuring would remove that particular entity from Open 207 00:12:48,720 --> 00:12:53,800 Speaker 1: AI's governance entirely. Coincidentally, according to Altman, a few high 208 00:12:53,920 --> 00:12:57,240 Speaker 1: level executives have left OpenAI. Among them are a 209 00:12:57,320 --> 00:13:00,600 Speaker 1: pair of senior research executives as well as the company's 210 00:13:00,760 --> 00:13:04,400 Speaker 1: chief technology officer. She also left, and Altman says that 211 00:13:04,480 --> 00:13:07,599 Speaker 1: their resignations have nothing to do with the planned restructuring. 212 00:13:07,720 --> 00:13:13,240 Speaker 1: It's just coincidence. Maybe that's true. Iulian Dnistran, and I 213 00:13:13,320 --> 00:13:18,560 Speaker 1: apologize for butchering your name, of InsideEVs writes that 214 00:13:18,679 --> 00:13:23,320 Speaker 1: a research company called AMCI Testing looked into Tesla's Full 215 00:13:23,320 --> 00:13:26,520 Speaker 1: Self Driving feature to determine how reliable and safe it 216 00:13:26,559 --> 00:13:30,320 Speaker 1: actually is. Now, AMCI reportedly tested the feature in a 217 00:13:30,360 --> 00:13:34,240 Speaker 1: variety of different settings, from city streets to mountain roads, 218 00:13:34,400 --> 00:13:38,319 Speaker 1: and apparently these tests were at least fairly extensive, comprising 219 00:13:38,400 --> 00:13:41,920 Speaker 1: more than one thousand miles traveled in total. According to 220 00:13:41,960 --> 00:13:46,520 Speaker 1: the report, AMCI drivers had to intervene approximately once every 221 00:13:46,600 --> 00:13:50,880 Speaker 1: thirteen miles driven or else risk getting into an accident. Now, 222 00:13:50,920 --> 00:13:55,840 Speaker 1: to be clear, despite the name Full Self Driving, the 223 00:13:56,160 --> 00:13:59,720 Speaker 1: feature actually requires Tesla drivers to maintain their attention on 224 00:13:59,720 --> 00:14:01,640 Speaker 1: the road and to keep their hands on the steering 225 00:14:01,640 --> 00:14:06,040 Speaker 1: wheel at all times. So, despite some arguably misleading naming 226 00:14:06,080 --> 00:14:10,320 Speaker 1: conventions here, Tesla has stated that FSD is not intended 227 00:14:10,320 --> 00:14:14,360 Speaker 1: to be a foolproof autonomous driverless system. But AMCI 228 00:14:14,520 --> 00:14:17,520 Speaker 1: says FSD can still give drivers a false sense of 229 00:14:17,520 --> 00:14:20,440 Speaker 1: security, and that the system can perform well enough to 230 00:14:20,520 --> 00:14:24,360 Speaker 1: make people believe it is consistently safe. However, the director 231 00:14:24,400 --> 00:14:29,000 Speaker 1: of AMCI's testing, a guy named Guy, or Gui, Mangiamele, 232 00:14:29,640 --> 00:14:35,240 Speaker 1: said, quote, you may watch FSD successfully negotiate a specific 233 00:14:35,280 --> 00:14:38,280 Speaker 1: scenario many times, often on the same stretch of road 234 00:14:38,440 --> 00:14:42,640 Speaker 1: or intersection, only to have it inexplicably fail the next 235 00:14:42,720 --> 00:14:45,680 Speaker 1: time, end quote.
Obviously, that would be a big problem, 236 00:14:45,760 --> 00:14:47,840 Speaker 1: since it might mean a Tesla owner could feel that, 237 00:14:47,920 --> 00:14:51,120 Speaker 1: at least along certain stretches of road, the FSD feature 238 00:14:51,200 --> 00:14:54,040 Speaker 1: is fully capable of handling the car's operation, and they 239 00:14:54,080 --> 00:14:57,040 Speaker 1: might let their attention wander once again. I think my 240 00:14:57,080 --> 00:15:00,520 Speaker 1: biggest problem with Tesla's approach is that the marketing conflicts 241 00:15:00,560 --> 00:15:04,360 Speaker 1: with the actual capabilities of the advanced driving assist systems, 242 00:15:04,640 --> 00:15:08,560 Speaker 1: and unfortunately, when that involves vehicles, that can lead to catastrophe. 243 00:15:09,280 --> 00:15:12,080 Speaker 1: Longtime Tech Stuff listeners might remember me talking about a 244 00:15:12,120 --> 00:15:15,800 Speaker 1: company called Do Not Pay ages ago. This company is 245 00:15:15,840 --> 00:15:18,040 Speaker 1: known for doing a few different things, but one big 246 00:15:18,080 --> 00:15:22,960 Speaker 1: one is helping subscribers do stuff like fight parking tickets. Now, 247 00:15:22,960 --> 00:15:26,040 Speaker 1: some of those parking tickets are probably unfair, and fighting 248 00:15:26,040 --> 00:15:29,520 Speaker 1: them can be pretty intimidating, and some folks often just 249 00:15:29,640 --> 00:15:32,360 Speaker 1: resign themselves to paying off a fine that they don't 250 00:15:32,360 --> 00:15:36,200 Speaker 1: actually deserve. So Do Not Pay offers to help people out. 251 00:15:36,480 --> 00:15:38,920 Speaker 1: And one way they do this, or at least one way 252 00:15:39,000 --> 00:15:41,200 Speaker 1: they did do this, was by offering the 253 00:15:41,280 --> 00:15:45,960 Speaker 1: services of, quote, the world's first robot lawyer, end quote. 254 00:15:46,080 --> 00:15:49,240 Speaker 1: This was really a generative AI tool meant to help people 255 00:15:49,440 --> 00:15:53,320 Speaker 1: draft legal language in order to do things like fight 256 00:15:53,840 --> 00:15:58,680 Speaker 1: traffic tickets or to perform other relatively uncomplicated legal tasks 257 00:15:58,720 --> 00:16:02,160 Speaker 1: such as drafting a cease and desist letter. But according 258 00:16:02,200 --> 00:16:05,680 Speaker 1: to the US Federal Trade Commission or FTC, Do Not 259 00:16:05,840 --> 00:16:09,800 Speaker 1: Pay failed to test the generative AI tool and didn't 260 00:16:09,880 --> 00:16:13,200 Speaker 1: even bring on human lawyers to work on the tool's 261 00:16:13,240 --> 00:16:17,520 Speaker 1: design or to ensure that it was functioning properly. Do 262 00:16:17,640 --> 00:16:20,720 Speaker 1: Not Pay has since agreed to pay a fine of 263 00:16:20,760 --> 00:16:23,120 Speaker 1: its own, a grand total of one hundred and ninety 264 00:16:23,160 --> 00:16:26,960 Speaker 1: three thousand dollars. I assume they're not going to follow 265 00:16:27,000 --> 00:16:31,200 Speaker 1: the instructions that their company name states. That would likely 266 00:16:31,320 --> 00:16:34,160 Speaker 1: not go over so well with the FTC.
Do Not 267 00:16:34,240 --> 00:16:37,280 Speaker 1: Pay must also send messages out to customers alerting them 268 00:16:37,280 --> 00:16:40,880 Speaker 1: to the limitations of their services, essentially to say, hey, 269 00:16:41,280 --> 00:16:45,560 Speaker 1: turns out we can't guarantee that the stuff we produced 270 00:16:45,600 --> 00:16:49,600 Speaker 1: for you in return for your subscription actually holds 271 00:16:49,680 --> 00:16:53,960 Speaker 1: legal water. That's a rough one. According to Russian news outlets, 272 00:16:54,040 --> 00:16:57,200 Speaker 1: Google has shut down the ability for people within Russia 273 00:16:57,240 --> 00:17:01,080 Speaker 1: to create new Google accounts, and further, there has been 274 00:17:01,200 --> 00:17:05,280 Speaker 1: a sharp reduction, quote, in the number of SMS messages 275 00:17:05,320 --> 00:17:08,840 Speaker 1: sent by the company to Russian users, end quote. That's 276 00:17:08,840 --> 00:17:12,439 Speaker 1: according to The Register. This implies that Google is not 277 00:17:12,640 --> 00:17:16,919 Speaker 1: sending out two factor authentication messages to people in Russia. 278 00:17:17,080 --> 00:17:19,280 Speaker 1: So if you're signed out of your Google account and 279 00:17:19,320 --> 00:17:21,360 Speaker 1: you're in Russia, you might not be able to get 280 00:17:21,400 --> 00:17:23,919 Speaker 1: back into that account, because you won't get access to 281 00:17:23,960 --> 00:17:26,760 Speaker 1: all the information you need in order to get into 282 00:17:26,880 --> 00:17:30,160 Speaker 1: your account. You won't get the multi factor authentication message. 283 00:17:30,280 --> 00:17:33,240 Speaker 1: As I record this, Google and its parent company Alphabet 284 00:17:33,320 --> 00:17:36,360 Speaker 1: have yet to comment on this story out of Russia, 285 00:17:36,480 --> 00:17:38,560 Speaker 1: but from what I can gather, the general belief is 286 00:17:38,560 --> 00:17:42,600 Speaker 1: that Google has escalated its withdrawal from Russia after having 287 00:17:42,680 --> 00:17:46,720 Speaker 1: to shut down numerous accounts within Russia that were spreading disinformation, 288 00:17:46,880 --> 00:17:50,480 Speaker 1: primarily about the war in Ukraine, while also refusing to 289 00:17:50,520 --> 00:17:54,239 Speaker 1: comply with Moscow's demands to remove certain other accounts that 290 00:17:54,480 --> 00:17:58,360 Speaker 1: the government has declared illegal, and those are typically 291 00:17:58,400 --> 00:18:02,320 Speaker 1: accounts that frequently contradict the narrative coming out of the Kremlin. 292 00:18:02,680 --> 00:18:07,320 Speaker 1: So maybe this is Google making a conscious decision to 293 00:18:07,920 --> 00:18:10,440 Speaker 1: pull out of Russia entirely, which would be a real 294 00:18:10,520 --> 00:18:13,840 Speaker 1: blow to people in Russia who are dependent upon Google 295 00:18:14,000 --> 00:18:17,960 Speaker 1: services but have no control over what their government does 296 00:18:18,080 --> 00:18:21,920 Speaker 1: or doesn't do. California now has a law that requires 297 00:18:21,960 --> 00:18:27,040 Speaker 1: companies that offer access to digital media to use accurate 298 00:18:27,119 --> 00:18:30,200 Speaker 1: language for those digital goods, stuff like streaming 299 00:18:30,240 --> 00:18:32,440 Speaker 1: media, for example.
So in other words, if I wanted 300 00:18:32,440 --> 00:18:35,440 Speaker 1: to pay so that I could have the UK comedy 301 00:18:35,480 --> 00:18:40,840 Speaker 1: series Spaced in my online library with a service like Amazon, well, 302 00:18:40,880 --> 00:18:43,919 Speaker 1: then Amazon would not be allowed to use language like 303 00:18:44,280 --> 00:18:49,640 Speaker 1: buy or sell regarding that transaction. Instead, the language has 304 00:18:49,680 --> 00:18:51,840 Speaker 1: to make it clear that what I'm actually doing is 305 00:18:51,840 --> 00:18:56,000 Speaker 1: I'm purchasing a license so that I can access Spaced, 306 00:18:56,320 --> 00:18:59,879 Speaker 1: but I don't actually own a copy of Spaced. And 307 00:19:00,119 --> 00:19:03,040 Speaker 1: that's to make it clear to consumers that ownership isn't 308 00:19:03,080 --> 00:19:05,880 Speaker 1: really that big of a thing in the digital marketplace 309 00:19:05,960 --> 00:19:08,320 Speaker 1: for a lot of companies, and it doesn't matter if 310 00:19:08,359 --> 00:19:11,119 Speaker 1: you're talking about music or films, or video games or 311 00:19:11,119 --> 00:19:14,200 Speaker 1: other digital products and services. Companies that fail to comply 312 00:19:14,320 --> 00:19:16,800 Speaker 1: with this law could be found guilty of false advertising 313 00:19:16,840 --> 00:19:20,000 Speaker 1: and face some pretty stiff penalties. Now, all that being said, 314 00:19:20,240 --> 00:19:23,800 Speaker 1: if a company does allow you to actually download a 315 00:19:23,880 --> 00:19:26,840 Speaker 1: digital file and there's no requirement for it to be 316 00:19:26,840 --> 00:19:30,000 Speaker 1: connected to some sort of online server or anything like that, 317 00:19:30,280 --> 00:19:33,200 Speaker 1: like once you purchase it, it is yours and you can 318 00:19:33,440 --> 00:19:35,679 Speaker 1: move it to whatever drive you want and access it 319 00:19:35,720 --> 00:19:38,400 Speaker 1: whenever you want, they can still use terms like buy 320 00:19:38,400 --> 00:19:42,720 Speaker 1: and sell, because you are buying a digital file. That's fine, 321 00:19:43,160 --> 00:19:46,840 Speaker 1: but the law is for the ones that license the access to you. 322 00:19:47,359 --> 00:19:49,520 Speaker 1: And this was kind of brought up because of some 323 00:19:49,640 --> 00:19:53,639 Speaker 1: recent developments. There was a case where Ubisoft deleted the 324 00:19:53,720 --> 00:19:58,040 Speaker 1: video game The Crew from player accounts. Ubisoft had already 325 00:19:58,080 --> 00:20:00,840 Speaker 1: shut down the online service for that game, and it 326 00:20:00,880 --> 00:20:04,240 Speaker 1: was an online only game, so you could argue that 327 00:20:04,320 --> 00:20:07,719 Speaker 1: once they shut down the servers, the game was effectively obsolete. 328 00:20:07,800 --> 00:20:10,520 Speaker 1: But gamers were still shocked to see the title getting 329 00:20:10,560 --> 00:20:14,320 Speaker 1: deleted from their library entirely, because that seems to fly 330 00:20:14,400 --> 00:20:17,800 Speaker 1: in the face of the whole concept of buying something, 331 00:20:18,040 --> 00:20:19,760 Speaker 1: and that's how we got to where we are now. 332 00:20:20,240 --> 00:20:22,760 Speaker 1: This law is only going to apply in California for now. 333 00:20:22,800 --> 00:20:26,520 Speaker 1: It takes effect next year.
And finally, speaking of Ubisoft, 334 00:20:26,560 --> 00:20:29,040 Speaker 1: the company is facing a potential strike as workers have 335 00:20:29,119 --> 00:20:32,000 Speaker 1: objected to the company issuing a return to office mandate 336 00:20:32,040 --> 00:20:34,280 Speaker 1: that would require staff to come into the office three 337 00:20:34,359 --> 00:20:37,399 Speaker 1: days a week. The union says it had been negotiating 338 00:20:37,400 --> 00:20:41,080 Speaker 1: for profit sharing for employees, and after those talks broke down, 339 00:20:41,119 --> 00:20:44,280 Speaker 1: that's when Ubisoft issued the return to office order. So 340 00:20:44,480 --> 00:20:47,840 Speaker 1: things have now escalated into a potential strike situation, which 341 00:20:47,880 --> 00:20:50,640 Speaker 1: sounds similar to what's going on over at Amazon. Amazon 342 00:20:50,720 --> 00:20:53,560 Speaker 1: issued a five-days-a-week return to office mandate 343 00:20:53,640 --> 00:20:56,639 Speaker 1: starting next year, and now like ninety one percent of 344 00:20:56,720 --> 00:21:00,359 Speaker 1: Amazon's staff who were surveyed about this say they don't 345 00:21:00,480 --> 00:21:03,240 Speaker 1: like it, and I can understand that. But that's it 346 00:21:03,520 --> 00:21:05,640 Speaker 1: for the tech news for this week. I hope you're 347 00:21:05,680 --> 00:21:08,200 Speaker 1: all well. If you are in the path of a 348 00:21:08,240 --> 00:21:12,000 Speaker 1: devastating hurricane, please please please be careful, be safe, be 349 00:21:12,200 --> 00:21:14,800 Speaker 1: happy and healthy, and I will talk to you again 350 00:21:15,640 --> 00:21:25,880 Speaker 1: really soon. Tech Stuff is an iHeartRadio production. For more 351 00:21:25,920 --> 00:21:30,680 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 352 00:21:30,680 --> 00:21:36,040 Speaker 1: wherever you listen to your favorite shows.