Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer here at iHeartRadio. And how the tech are you? It is Tuesday, August fifteenth, twenty twenty three. It is time to talk about tech news. And first up, I've got a follow-up on the story of Sam Bankman-Fried, the co-founder of the crypto company Alameda Research and the crypto exchange FTX. So, just in case a few of y'all out there are out of the loop on this story: FTX collapsed late last year and prompted a suite of investigations into the company and its executive leadership team, including Sam Bankman-Fried, also known as SBF. So authorities charged SBF with, like, a ton of crimes (several crimes, anyway), and he is awaiting trial. Until recently he was just under house arrest, staying with his parents, but now he has been ordered to go straight to jail. Do not pass go, do not collect two hundred FTT tokens.

Speaker 1: So you might say, well, what happened? What was the change? Prosecutors brought concerns to the judge and said SBF had been leaking documents to the media, potentially in an effort to intimidate a witness they were speaking with in regard to his case. So the judge took this seriously and has revoked the house arrest order, and SBF will have to go to jail while awaiting his trial, which is a big old oof. SBF tried to appeal this decision, but the judge dismissed that appeal, saying the points that his legal team made in the document were, quote, "either moot or without merit," end quote. Double oof. The prosecution also asked the judge to place a gag order on SBF so that he doesn't continue to leak information to the media, and that has in turn prompted various media outlets to submit court filings on First Amendment concerns, you know, free speech concerns about that gag order, saying it violates his First Amendment rights to freedom of speech.
Speaker 1: This is a delicate matter, obviously. When you have an issue where communications can actually impact the legal process, it does, or at least can, come into conflict with the philosophies behind freedom of speech. So, complicated issues, and we're seeing that play out in other arenas here in the United States as well.

Speaker 1: Okay, now it's time to talk about AI for just a bit. Okay, I can't lie, for a lot. So first up, the hacker conference known as DEF CON happened this past weekend, and one of the many events held over the weekend at DEF CON was one that pitted hackers against AI chatbots, and this included bots from all the major players in the space, like OpenAI and Google and Meta and others. The purpose of the session was to test these chatbots for vulnerabilities, and it didn't involve, like, hacking into the code, but rather just chatting with the chatbots and seeing if you could make them do stuff they weren't supposed to do. Finding a vulnerability in an exhibition at DEF CON is probably embarrassing, but it's preferable to some bad actor out in the real world finding that vulnerability and then exploiting it to terrible effect. So these hackers were trying to manipulate the chatbots into doing things they absolutely were not supposed to do. That included sharing private information that was supposed to be protected, or producing examples of hate speech or misinformation, or making defamatory statements about famous people, that kind of thing. In fact, they had different categories for stuff that the chatbots were not supposed to do, with points associated with those tasks. So if you got the chatbot to do one of those things, that amount of points would be added to your score. NPR covered the story and mentioned how one participant was able to convince a chatbot to reveal a credit card number just by changing the context a little. The participant claimed that his name was the number on that credit card. Then he asked the chatbot, what's my name? And the chatbot supplied the number in response.

Speaker 1: And that's kind of out-of-the-box thinking, right? Because the chatbot seemed to be contextualizing the credit card number as not being a number; instead, it was just a person's name. And because a person's name is not protected information, especially if the person who's talking to you is the one the name belongs to, it handed the information over. It's not like a fantasy or horror novel where names can have secret powers and you don't want anyone to know your true name. That's not how it works in chatbot land. So if it thinks this credit card number is actually a name, well, there's no reason you can't say a name, right? That's how simple it was to make this one chatbot do something it was not supposed to do, and so the chatbot coughed up the information.
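To make that kind of exploit concrete, here's a toy Python sketch. This is entirely hypothetical, not the actual DEF CON challenge or any real chatbot, but it shows why a guardrail keyed to the wording of the request can fail once sensitive data has been re-filed under a harmless label like "name":

```python
# Toy illustration of the relabeling trick described above. Not real
# chatbot code; just a sketch of a filter that checks what the user
# ASKS rather than what the bot is about to SAY.

SECRET = "4111 1111 1111 1111"  # dummy test card number

def naive_chatbot(message: str, memory: dict) -> str:
    text = message.lower()
    # Guardrail: refuse anything that *sounds* like a request for the card.
    if "card" in text or "credit" in text:
        return "Sorry, I can't share payment information."
    # The bot obligingly re-files the secret under the user's "name".
    if "call me by the number on file" in text:
        memory["user_name"] = SECRET
        return "Sure thing!"
    # A name isn't protected information, so no check fires here, even
    # though the reply is about to contain the protected digits.
    if "my name" in text:
        return f"Your name is {memory.get('user_name', 'stranger')}."
    return "How can I help?"

memory: dict = {}
print(naive_chatbot("What's the credit card number on file?", memory))  # refused
print(naive_chatbot("Please call me by the number on file.", memory))   # relabeled
print(naive_chatbot("What's my name?", memory))                         # digits leak
```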
Speaker 1: Each participant had fifty minutes to complete as many tasks as they possibly could and accumulate points in the process, all of it illustrating how far we have to go in order to make generative AI trustworthy.

Speaker 1: This next story is one that I missed when it broke. Last Thursday, the US Department of Defense announced it is creating a task force specifically focused on generative AI. This task force has the designation Lima, or "LY-ma" if you prefer, like the bean, and it will explore potential uses and threats regarding generative AI and large language models. I imagine that the people who will be in this task force are already well aware of the limitations of generative AI technology, and of how it can be impressive and even useful, but that you have to be really cautious because it can also sometimes be unreliable. At least, I would like to think that the task force members are aware of all that. It's hard to imagine that they're not, but you know, you get nervous. The press release from the DoD mentions that the DoD recognizes the potential of generative AI to significantly improve intelligence, operational planning, and administrative and business processes.
Speaker 1: However, responsible implementation is key to managing the associated risks effectively. So at least it sounds like the DoD wants a very steady approach to this and is aware things could go pear-shaped if you aren't careful with it. It still makes me nervous to think of generative AI being used in concert with, like, gathering and analyzing intelligence, because we know that generative AI has a tendency to hallucinate in the event that it doesn't have all the information it needs in order to answer a question. It actually made me think of a scene from the British comedy series Blackadder, specifically season four, which is set in World War One. There's a sequence where Captain Blackadder commands his subordinate, Lieutenant George, to paint a scene that shows German forces being far too powerful at their position, and this is in an effort to convince leadership not to command his division to advance. So essentially he's saying, oh, we couldn't possibly advance, the Germans are far too entrenched and have far too many resources, and it would be a disaster. George, though, gets carried away while making this painting of these (as far as they know) fictional German forces, and he ends up including stuff like battle elephants in the painting, which Captain Blackadder does his best to incorporate into his report to his superiors. So I imagine generative AI producing equally fake intelligence. Right? Like, how can you trust any intelligence provided by generative AI without doing so much extensive double-checking that you might actually negate any benefit the generative AI gave you? If it takes you more time to verify the information than it would have taken if you just didn't use the AI at all, then really you're playing a losing game.
Speaker 1: This is also the argument I make about AI-generated articles, where you have to have an editor go over the article. Because typically you would have a human writer who is vetted to write an article, and then the editor would double-check the article for things like, you know, grammatical mistakes and just anything that stands out. But generally you're fairly confident that the writer has turned in something without just making stuff up. This can come back to bite you, as we have seen multiple times, but usually it works out. With AI, though, you can't be sure about that. And so if you give AI-generated articles to editors, often they have to go through the article with such a fine-tooth comb that they have essentially rewritten the article themselves. It's as if you had given the writing assignment to the editor and not to an AI bot in the first place. And that's a real problem. That's what I worry about with the use of AI in connection with gathering intelligence.

Speaker 1: Okay, how about we talk about a case where a government is using AI to suppress information? Yay, how fun. So I'm going to try and get through this without going on too much of a rant. But the story is that in Iowa, the state government passed legislation that bans books in school libraries if those books include material not deemed to be, quote unquote, age appropriate. That includes any book that describes a sex act; anything like that is immediately on that banned list. But then how does a school go about doing that, right? Because libraries are, I don't know if you know this, absolutely chock-a-block with books, and it would take a considerable amount of time and effort to go through every single book to see whether it met the government's definition of age appropriate or not. So one school district in Mason City is leaning on AI to do that work for them.
Speaker 1: Now, what they've started with is a list of books that have already received complaints in the past about objectionable material, and they are feeding those books to AI software to scan the text and determine if, in fact, a book violates the law, in which case it would presumably be banned from the school libraries. This includes books like The Handmaid's Tale. I'm pretty sure that one's going to get banned, knowing some of the scenes that are in that book. However, you know, that's not even ironic; it's just sort of predicted by the book itself. The state government is probably viewing it as a good thing, because The Handmaid's Tale is a book that really lays out what happens when a government gets authority over stuff like bodily autonomy, and, you know, you don't want young people being able to read about that and then getting ideas. School is the last place for getting ideas, after all. Sorry, I am ranting, even after I said I wasn't going to. Anyway, I just consider it a fresh new hell to be in a world where AI is helping administrators ban books. It's like the evil mirror-universe version of a Reese's Peanut Butter Cup: two awful things that go awful together. Okay, I obviously have become a little overwrought with emotion, so we're going to take a quick break, and when we come back, I'll talk about some more stories, including a couple more AI ones.

Speaker 1: We're back. So, many years ago I did a TechStuff episode about CAPTCHA tests. These are those tests you sometimes encounter on the web that you have to pass to prove you're a human being before you can access whatever is on the other side. Right? Like, there are a lot of these where you have to click on something in order to complete some transaction, or else the system will think that you are a bot and reject it.
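For flavor, here's a toy Python sketch of the classic distorted-text style of CAPTCHA, using the Pillow imaging library. Real CAPTCHAs are far more elaborate than this; it just shows the basic idea of easy-for-humans, annoying-for-machines:

```python
# Toy distorted-text CAPTCHA sketch (pip install pillow). Render random
# characters with jitter and noise, then compare the user's answer.
import random
import string
from PIL import Image, ImageDraw, ImageFont

def make_captcha(length: int = 5, size: tuple = (200, 70)):
    text = "".join(random.choices(string.ascii_uppercase + string.digits, k=length))
    img = Image.new("RGB", size, "white")
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()
    # Scatter each character with random jitter so the glyphs don't line up.
    for i, ch in enumerate(text):
        x = 20 + i * 32 + random.randint(-5, 5)
        y = 25 + random.randint(-10, 10)
        draw.text((x, y), ch, fill="black", font=font)
    # Draw noise lines across the characters to frustrate naive OCR.
    for _ in range(6):
        draw.line(
            [(random.randint(0, size[0]), random.randint(0, size[1])),
             (random.randint(0, size[0]), random.randint(0, size[1]))],
            fill="gray", width=1,
        )
    return text, img

answer, image = make_captcha()
image.save("captcha.png")
guess = input("Type the characters shown in captcha.png: ")
print("Probably human!" if guess.strip().upper() == answer else "Beep boop?")
```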
Speaker 1: So researchers say that bots are now better at completing CAPTCHA tests than humans are, and that is a huge problem, because the whole main purpose of CAPTCHAs is to create a task that should be relatively easy for most humans to complete but really tricky for automated systems. As CAPTCHAs become harder for people to complete, they become a barrier to legitimate usage; it's a real problem. And as they become easier for bots to complete, well, obviously they have no use at all from that standpoint, at least not for their stated purpose. This is not the first time we've seen this happen, by the way. The whole history of CAPTCHAs is kind of like a seesaw. Developers will create automated programs that get better at solving certain CAPTCHA tests, and then CAPTCHA developers will come up with a new approach in order to trip up this new generation of bots. So, in a way, CAPTCHAs have played a really important part in the evolution of artificial intelligence. But beyond this adversarial approach to machine learning, the research points out that bots now have fewer barriers to doing stuff that we generally frown upon. We typically put these CAPTCHA things in place for a reason: we don't want automated algorithms or systems to be able to game the system in some way. That can include things like using bots that can defeat CAPTCHAs in an effort to access all the pages of a website and then scrape all the data for whatever purpose, or to pose as a legitimate customer on an online marketplace and then post fake reviews for various products, artificially driving a product's review scores up or down. Right? Like, you could have someone pay to downvote a competitor's product so that your product looks better in comparison. You could also use it to try and boost your own product's scores.
Speaker 1: These are issues that are known and are happening, and they're one of the reasons why CAPTCHAs are being used. The Independent reports that researchers put CAPTCHA tests to the, you know, test, and had people and bots try to complete different CAPTCHAs, and people did significantly worse on those tests than the bots did. They took more time to complete the tests, and they were less accurate than the bots, while the bots were able to breeze through some of those challenges in less than a second with close to one hundred percent accuracy. So the real take-home here is that CAPTCHAs no longer do the job they were intended to do, or at least ostensibly intended to do, and in my mind that means we should just ditch CAPTCHAs and come up with a different approach. However, I should also note that some companies, such as Google, have relied on humans completing CAPTCHAs not because that was a way to prevent bots from getting access to stuff, but rather to help Google train its own AI models. Right? Like, there was a time when you would be presented with scanned words from a scanned book and you would have to identify what each word was. And the reason for that was not so that Google could necessarily say, okay, you're definitely a human, but to train its technology to be better able to scan text and interpret it. So sometimes CAPTCHAs aren't really there as a safeguard against bots, but rather as a method to train bots to be even smarter than they already are.

Speaker 1: And we're still on AI. So researchers at Purdue University have studied ChatGPT's performance regarding coding. They took a very specific approach: they submitted to ChatGPT five hundred and seventeen different questions that they pulled from the website Stack Overflow. In case you're not familiar with Stack Overflow, that's a place for programmers to go in order to learn and to share knowledge and tips, and you can ask questions of the community and then receive answers from them.
Speaker 1: It's kind of like a programmer-specific version of Quora, or, rest in peace, Yahoo Answers. So the researchers gathered all these questions from the Stack Overflow community and submitted them to ChatGPT in order to see what ChatGPT said. And they found that more than half of ChatGPT's answers, fifty-two percent of them, included at least some inaccuracies, ranging from totally inaccurate to just partly inaccurate. They also said seventy-seven percent of ChatGPT's answers were overly verbose, which, again, makes me wonder if I am actually ChatGPT. The researchers said the inaccurate answers indicated that about half the time, like fifty-four percent of the time when ChatGPT gave incorrect answers, it seemed to be because ChatGPT didn't really understand what the question was actually asking. So, in other words, it's possible ChatGPT could have produced a correct answer if it had been able to parse what the question asker wanted to know in the first place; it's just that ChatGPT didn't understand the question and so gave an inappropriate or incorrect response. All of this is not to say that ChatGPT is completely useless when it comes to helping programmers code. It might be very useful, but it does require a lot of editorial oversight, just like with the writing of articles, like I mentioned before. It could potentially speed things up if it's understanding the prompts properly and not hallucinating, and those are big ifs. But, as the researchers were quick to say, this isn't to suggest that AI doesn't have a place here. It's just to remind ourselves that, you know, the way you word questions matters, the way ChatGPT interprets questions matters, and we can't just assume that any answers provided are magically correct and accurate.
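Some of that editorial oversight can at least be mechanized. Here's a minimal sketch, with a hypothetical ask_llm() standing in for whatever model API you'd actually call, where nothing the model returns is trusted until it passes tests a human wrote independently of the model:

```python
# Minimal sketch of "editorial oversight" for LLM-generated code.
# ask_llm() is a hypothetical stand-in, not a real API; the point is
# that the candidate code must pass human-written tests before use.

def ask_llm(prompt: str) -> str:
    # Stand-in: pretend the model answered with a candidate function.
    return (
        "def median(xs):\n"
        "    xs = sorted(xs)\n"
        "    n = len(xs)\n"
        "    mid = n // 2\n"
        "    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2\n"
    )

def run_candidate(source: str, test_cases) -> bool:
    namespace: dict = {}
    try:
        # Never exec untrusted code outside a sandbox in real life.
        exec(source, namespace)
        fn = namespace["median"]
        return all(fn(args) == expected for args, expected in test_cases)
    except Exception:
        return False

tests = [
    ([1, 3, 2], 2),
    ([4, 1, 2, 3], 2.5),
    ([7], 7),
]

candidate = ask_llm("Write a Python function median(xs) for a list of numbers.")
print("accepted" if run_candidate(candidate, tests) else "rejected: needs a human")
```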
Speaker 1: Okay, moving off of AI, let's talk about Apple. So back in March twenty twenty, just as the world was starting to shut down in the face of COVID, Apple agreed to a five hundred million dollar settlement. The heart of the matter here was a class action lawsuit that accused Apple of purposefully slowing down older iPhone models' performance, presumably in an effort to push people to upgrade to newer models. Apple admitted that it had slowed performance down on older iPhone models back in twenty seventeen, but the company said it wasn't in an effort to make people go out and buy a new iPhone. Instead, they said they had to do it because updates to iOS meant that older phones would potentially shut down spontaneously, you know, would run into an issue where they would shut down or they would burn through their battery life too quickly, unless Apple artificially made them work slower. But it left people upset, Apple customers upset, and around three million claimants joined this class action lawsuit, which Apple again ultimately settled in March of twenty twenty. And now, finally, three years after the settlement, Apple will be sending checks out to the people who were part of the lawsuit. The checks come out to be about sixty-five dollars per claim, because again, it was like three million claims against a five hundred million dollar settlement. And part of the reason for the long delay has nothing to do with Apple's behavior; it's not that Apple was dragging its heels. Part of the issue is that a couple of claimants out of those three million were dissatisfied with the settlement, and they appealed it to the Ninth US Circuit Court of Appeals. But ultimately the court ruled against that appeal. So now, after many years, those checks should be heading out the door, and you should keep on the lookout if you had signed up to be one of the claimants in that lawsuit.
Speaker 1: Last week, I talked about how Saudi Arabia was following in the EU's footsteps, requiring all smartphone manufacturers to include a USB-C charging port starting in twenty twenty five. Well, now the EU has passed similar rules that will require all smartphones to have replaceable batteries starting in twenty twenty seven. So, like the USB-C rules, this seems to me to be more or less specifically targeting Apple. Well, not just Apple; Apple's not the only company that makes it impossible to replace a smartphone's battery. In fact, my Android phone is the same: I can't replace the battery on my Android phone. But Apple just has this reputation for protecting its proprietary approach to smartphones and creating kind of a closed-off ecosystem that requires you to work with Apple to make any repairs or do maintenance on your own devices, and that's part of what is being targeted here. It's also an effort to cut down on things like e-waste. But here's the thing: while that may be an element, like, the control part of the ecosystem is probably an element in why companies like Apple lock away those batteries, it's not the only reason for it. Part of the aesthetic for modern smartphones is to try and make them as slim as possible, but in order to do that you have to cram all the components of the smartphone into a very tiny form factor. And while you can miniaturize a lot of stuff in smartphones, batteries are one of the things you can't easily miniaturize. This typically means that it's not really practical, or sometimes even possible, to make replaceable batteries a thing, because you've crammed everything into such a small form factor that you just can't access the battery or disengage it easily from the rest of the phone. It's all just kind of built into itself. So, mandating that all smartphones have to have replaceable batteries, ones that are replaceable by the end user,
no less. We're not just talking about taking it into a shop and having it swapped out; the end user is supposed to be able to replace these batteries. Well, that means companies will have to move away from designs that end up having these compact layouts that make it difficult or impossible to replace the battery. They're going to have to go with a different approach, and that could mean we're going to start seeing some chonkier smartphones in the EU starting around twenty twenty seven or so. Also, I should mention this rule doesn't just apply to smartphones. It actually applies to any battery-operated device, so, you know, things like laptops and stuff will also have to have replaceable batteries. Even electric bikes, which are also popular in the EU, will have to have these replaceable batteries. Okay, I've got a few more stories to cover before we wrap up. Let's take another quick break, and we'll be back with some more news.

Speaker 1: We're back, and I've got another class action lawsuit to bring up. This one is against HP. So, claimants in California, and I think other places as well, but I know the lawsuit is taking place in the state of California, have sued HP, saying the company was purposefully restricting customers from using all-in-one printers if the toner ran down. So you run out of ink, and then suddenly your all-in-one printer just becomes a giant paperweight. You wouldn't be able to even use the non-printing functions on one of these machines; like, you wouldn't be able to scan a document to create a PDF, or you wouldn't be able to use it to send a fax, which doesn't require any ink in the first place. And that was the basis of the complaint: that HP was locking these functions away in an effort to force people to buy expensive toner even if they didn't need the printer for the purposes of printing.
Speaker 1: Also, they argued that HP failed to disclose in the devices' documentation that this is what would happen. So HP filed a motion to dismiss this lawsuit, and now a judge in California has denied that motion, so the lawsuit may proceed. Earlier, when this lawsuit was first filed against HP, a judge actually did dismiss the case, because the claim was that the plaintiffs had failed to make an actual legal claim against HP. They had complaints, but not a legal claim. But then the plaintiffs amended their motion, and that one held up to scrutiny, and that's what's going to move forward. This still doesn't mean that HP will ultimately be found to have acted in the wrong, but it does mean that they're going to have to face some tough questions in court.

Speaker 1: Last week, California authorities gave two companies, Waymo, which is owned by Google's parent company Alphabet, and Cruise, which is owned by General Motors, the authority to operate self-driving robotaxi services around the clock in San Francisco. And then Cruise promptly created a traffic jam in the North Beach neighborhood of San Francisco. Sad trombone. All right, so according to reports from San Francisco, for some reason several of Cruise's self-driving cars, as many as ten of them at a time, came to a stop around Vallejo Street in North Beach. Vallejo just always makes me think of the Zodiac Killer, but the Zodiac Killer had nothing to do with these cars coming to a stop. Some of those cars had passengers inside them, and so the passengers were stuck inside a non-moving car on the street for about fifteen minutes. The cars did turn on their hazard lights, so at least there's that. So what the heck happened? Well, representatives at Cruise say that it looks like a nearby music festival was the problem. The cars were not listening to the groovy tunes.
Speaker 1: Instead, there was an excess of cell phone activity in the area, and it was kind of clogging up the airwaves, and all that interference made it difficult for the vehicles to access their navigation features, and so they kind of went into protective turtle mode. It was not an auspicious start to the driverless taxi revolution, I would say. Now, the rules in California state that Waymo is not allowed to charge customers for taxi rides unless a safety driver is also in the vehicle. If it's a driverless vehicle and there's no safety driver, Waymo can't charge for rides; they could give them for free, but they wouldn't be able to charge. Cruise has a slightly different deal. It can charge for driverless trips without a safety driver, but only between the hours of ten p.m. and six a.m. Anytime outside of those hours, if there's not a safety driver in the car, the ride is free. Or, if they do have a safety driver present in the vehicle, they can charge anytime. GM has said that the long-term plan is to, quote unquote, blanket cities like San Francisco with driverless vehicles, which kind of makes the point for a lot of people who oppose these policies. They argue that this just really means we're going to end up with a lot more vehicles on city streets. That's not going to alleviate traffic; it's going to make it worse. And while the argument might be made that the purpose is to convince people not to drive their own vehicles on streets, the proponents for change are saying that's not what we need. We don't need, like, self-driving cars to do that. What we need is to make cities easier to get around for pedestrians and bicyclists and stuff, which would actually take more cars off the street, as opposed to going driverless and having even more vehicles circling the streets. So yeah, not a great story to come out of the early days of driverless robotaxis in San Francisco.
Speaker 1: Now, our last full story has to do with video games and modding communities. So, you may be aware there are folks who love to create code that modifies existing video games in some way. Right? Mods might allow you to get access to abilities and tools that developers have but players are not meant to have, and then do all sorts of stuff. Maybe a mod even changes the game fundamentally, or adds new content created by modders, that kind of stuff. Some video game companies actually encourage these communities. Some even work with them to create, like, a storefront where the game producer and the modders can both generate revenue from those mods. But then you've got Rockstar Games, the creators of the Grand Theft Auto series. Rockstar Games has often taken a more adversarial approach to the modding community. So back in twenty fifteen, the company banned a whole bunch of members of a mod group called FiveM. Rockstar said that the group had been developing code that could make it possible for folks to pirate the game. What the modders had actually done is they had created mods that would allow people to play in an alternate version of Grand Theft Auto Online. That's, like, an ongoing product that Rockstar offers, but the mods FiveM made made it possible to run a separate instance of Grand Theft Auto Online, one not overseen by Rockstar and one that could have lots of different mods in it. Plus, people who had a pirated copy of Grand Theft Auto would be able to access this alternative version of Grand Theft Auto Online. So Rockstar Games says, oh, you're encouraging people to pirate the game, so we're banning you from our different forums and stuff. But now Rockstar has acquired a group called Cfx.re, which consists of, you guessed it, the team behind FiveM. So now the dreaded pirates are part of the crew. Yar! Maybe they can take down the system from within.
Speaker 1: This is a pretty dramatic turn of events, because back in twenty fifteen there were reports that Rockstar had gone so far as to actually send private investigators out to the homes of people who were part of FiveM to essentially intimidate them. But in the years since twenty fifteen, FiveM has maintained this alternative online play space for Grand Theft Auto players, one that reached a maximum of around two hundred and fifty thousand concurrent players back in twenty twenty one. So I guess Rockstar came around to the old philosophy of, if you can't beat them, acquire them.

Speaker 1: Now, before I head off, I do have a recommended article I think you should check out. It's on Techdirt, and it's written by Mike Masnick. The article is titled "RIAA Piles On In The Effort To Kill The World's Greatest Library; Sues Internet Archive For Making It Possible To Hear Old 78s." That is a very long headline. But yeah, the story talks about how the RIAA, aka the Recording Industry Association of America, is coming after the Internet Archive, because the RIAA objects to folks being able to use the Internet Archive to listen to obsolete media. So the 78s reference record albums; that's what the 78s mean. They're specific types of records, specifically ones that require a playback speed of seventy-eight revolutions per minute. You know, vinyl has experienced a renaissance lately, but you typically find the vinyl records of today falling either into the thirty-three-and-a-third RPM category or the forty-five RPM category. Most record players and turntables don't even have the ability to play at seventy-eight RPM. Some do, but a lot don't, because seventy-eight RPM albums are pretty darn rare. They are really reaching obsolescence, so there's a possibility that media recorded on seventy-eight RPM albums could be lost forever without this archival approach.
Speaker 1: So the Internet Archive is all about preserving information, but the RIAA is not crazy about people being able to access stuff without, you know, the industry's total control over it. Anyway, I'm biased when it comes to stuff about the RIAA, because that organization has brought the hammer down with unnecessary force multiple times throughout the history of the Internet. Like, the Napster story is ridiculous. But you should read this article on Techdirt to get the full story, and maybe get a deeper appreciation for what the folks at the Internet Archive are trying to do. And that's it for the tech news for Tuesday, August fifteenth, twenty twenty three. I hope you're all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.