1 00:00:04,360 --> 00:00:12,240 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,320 --> 00:00:15,640 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:15,640 --> 00:00:18,640 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:18,720 --> 00:00:22,080 Speaker 1: are you? It's time for the tech news for February 5 00:00:22,239 --> 00:00:26,560 Speaker 1: twenty eighth, twenty twenty three. For such a beastly month 6 00:00:26,600 --> 00:00:30,200 Speaker 1: as February, twenty-eight days as a rule are plenty. 7 00:00:31,280 --> 00:00:34,760 Speaker 1: Shout out if you recognize that reference. It is another 8 00:00:34,920 --> 00:00:38,479 Speaker 1: AI-dominated news day, though I promise there are a 9 00:00:38,520 --> 00:00:41,519 Speaker 1: few stories I'll be covering that do not have AI 10 00:00:41,560 --> 00:00:45,640 Speaker 1: as the central topic. But starting off, Elon Musk reportedly 11 00:00:45,720 --> 00:00:49,280 Speaker 1: wants to found a new AI lab to compete against 12 00:00:49,320 --> 00:00:51,879 Speaker 1: OpenAI. Now, for those of y'all who remember my 13 00:00:51,920 --> 00:00:55,680 Speaker 1: episode about OpenAI, maybe this comes as a surprise, 14 00:00:55,800 --> 00:00:58,840 Speaker 1: because Musk was actually one of the original co-founders 15 00:00:58,960 --> 00:01:02,080 Speaker 1: of OpenAI in the first place. But back then, 16 00:01:02,600 --> 00:01:06,039 Speaker 1: OpenAI was a not-for-profit organization, and it 17 00:01:06,080 --> 00:01:09,280 Speaker 1: had the goal of using an open-source approach to 18 00:01:09,360 --> 00:01:13,000 Speaker 1: developing and evolving artificial intelligence in a way that ideally 19 00:01:13,280 --> 00:01:17,920 Speaker 1: would be universally beneficial. You know, none of this AI 20 00:01:17,959 --> 00:01:21,160 Speaker 1: that benefits one group at the expense of everyone else 21 00:01:21,319 --> 00:01:25,039 Speaker 1: kind of nonsense. But then Musk stepped down from the 22 00:01:25,080 --> 00:01:29,160 Speaker 1: board of OpenAI, ostensibly because Tesla was also 23 00:01:29,400 --> 00:01:31,600 Speaker 1: developing AI of its own and there was a potential 24 00:01:31,640 --> 00:01:34,800 Speaker 1: conflict of interest, though Musk later said that he 25 00:01:34,920 --> 00:01:39,400 Speaker 1: was critical of OpenAI's direction. And since that day, 26 00:01:39,520 --> 00:01:43,479 Speaker 1: Musk has gone on to criticize OpenAI, particularly once 27 00:01:43,480 --> 00:01:48,440 Speaker 1: the organization founded a for-profit arm, ostensibly to help 28 00:01:48,560 --> 00:01:53,320 Speaker 1: fund the nonprofit part, and Musk has also criticized ChatGPT, 29 00:01:53,640 --> 00:01:57,960 Speaker 1: saying OpenAI is quote, training AI to be woke, 30 00:01:58,600 --> 00:02:02,760 Speaker 1: end quote. Yeah, Musk, I get it. You're a billionaire 31 00:02:02,960 --> 00:02:06,480 Speaker 1: white guy. Why not punch down, because there's nowhere else 32 00:02:06,520 --> 00:02:09,600 Speaker 1: but to punch, right? What a jerk. I'm, of course, 33 00:02:09,600 --> 00:02:12,040 Speaker 1: being a little facetious. My opinion of Musk is obviously 34 00:02:12,080 --> 00:02:15,240 Speaker 1: pretty low, but that's beside the point. I know no 35 00:02:15,280 --> 00:02:17,480 Speaker 1: one really wants to hear that, so I'll drop it.
36 00:02:17,919 --> 00:02:21,079 Speaker 1: Musk wants to create this AI lab and pursue AI 37 00:02:21,160 --> 00:02:25,080 Speaker 1: chatbots that are unfettered by the chains of wokeness. 38 00:02:25,600 --> 00:02:29,440 Speaker 1: Considering how Musk has shown that his free speech absolutist 39 00:02:29,560 --> 00:02:33,440 Speaker 1: stance isn't actually in alignment with his behavior (you can 40 00:02:33,440 --> 00:02:37,520 Speaker 1: look at how Twitter had banned mention of competing services 41 00:02:37,520 --> 00:02:41,720 Speaker 1: like Mastodon, Instagram, and others on its platform to see 42 00:02:41,760 --> 00:02:44,600 Speaker 1: evidence of this), I suspect all of this is going 43 00:02:44,639 --> 00:02:47,160 Speaker 1: to come back to haunt him should he actually achieve 44 00:02:47,240 --> 00:02:51,600 Speaker 1: this goal. Meanwhile, Tesla investors are likely further aggravated to 45 00:02:51,639 --> 00:02:55,320 Speaker 1: see that the company's CEO continues to direct his attention 46 00:02:55,360 --> 00:02:59,959 Speaker 1: to yet another endeavor rather than address problems with Tesla. 47 00:03:00,200 --> 00:03:03,919 Speaker 1: And then over at OpenAI, the company is introducing 48 00:03:03,919 --> 00:03:06,840 Speaker 1: a platform for developers that will give them access to 49 00:03:07,080 --> 00:03:12,000 Speaker 1: OpenAI's tools, namely the company's machine learning models. So 50 00:03:12,200 --> 00:03:17,799 Speaker 1: these are the very powerful AI compute systems that, to 51 00:03:17,840 --> 00:03:21,040 Speaker 1: build out yourself, would require millions and millions of dollars. 52 00:03:21,720 --> 00:03:26,560 Speaker 1: OpenAI is calling this offering Foundry, and it means 53 00:03:26,600 --> 00:03:29,120 Speaker 1: that people who have an idea for apps or services 54 00:03:29,160 --> 00:03:32,680 Speaker 1: that would feature AI in some way could have access 55 00:03:32,760 --> 00:03:36,000 Speaker 1: to compute assets without having to build them all themselves. 56 00:03:36,320 --> 00:03:40,160 Speaker 1: That is an enticing offer for developers who might otherwise 57 00:03:40,160 --> 00:03:43,960 Speaker 1: have a great idea but lack the funds to 58 00:03:44,040 --> 00:03:47,640 Speaker 1: be able to execute upon it. Details are somewhat scarce. 59 00:03:47,720 --> 00:03:50,360 Speaker 1: OpenAI has not announced when we might expect Foundry 60 00:03:50,400 --> 00:03:52,760 Speaker 1: to launch, but we do know that it's going to 61 00:03:52,880 --> 00:03:55,960 Speaker 1: set developers back a pretty penny, actually a whole bunch 62 00:03:56,000 --> 00:03:59,840 Speaker 1: of pretty pennies, to access these services. According to TechCrunch, 63 00:04:00,320 --> 00:04:04,640 Speaker 1: three months of access to the lightweight version of GPT 64 00:04:04,640 --> 00:04:08,080 Speaker 1: three point five would set you back seventy-eight grand. 65 00:04:08,480 --> 00:04:13,760 Speaker 1: That's seventy-eight thousand dollars for three months of access. Wowsers. 66 00:04:13,880 --> 00:04:16,320 Speaker 1: So this is well beyond the reach of your average 67 00:04:16,440 --> 00:04:20,360 Speaker 1: home developer. We're really talking more about startups and companies 68 00:04:20,360 --> 00:04:22,840 Speaker 1: that have a real shot at seeing a return on investment, 69 00:04:23,240 --> 00:04:25,600 Speaker 1: but they lack the infrastructure or money to build out 70 00:04:25,640 --> 00:04:30,080 Speaker 1: their own machine learning systems.
Over at Meta, Mark Zuckerberg 71 00:04:30,080 --> 00:04:33,560 Speaker 1: announced that the company is pursuing its own AI strategy. 72 00:04:34,000 --> 00:04:37,279 Speaker 1: I'm just gonna read out his Facebook post because 73 00:04:37,520 --> 00:04:40,760 Speaker 1: it tells you everything you need to know. Quote. We're 74 00:04:40,800 --> 00:04:44,080 Speaker 1: creating a new top-level product group at Meta focused 75 00:04:44,080 --> 00:04:48,760 Speaker 1: on generative AI to turbocharge our work in this area. 76 00:04:48,880 --> 00:04:51,880 Speaker 1: We're starting by pulling together a lot of teams working 77 00:04:51,880 --> 00:04:55,880 Speaker 1: on generative AI across the company into one group focused 78 00:04:55,920 --> 00:05:00,880 Speaker 1: on building delightful experiences around this technology into all of 79 00:05:00,880 --> 00:05:04,520 Speaker 1: our different products. In the short term, we'll focus on 80 00:05:04,600 --> 00:05:08,720 Speaker 1: building creative and expressive tools. Over the longer term, we'll 81 00:05:08,720 --> 00:05:12,000 Speaker 1: focus on developing AI personas that can help people in 82 00:05:12,040 --> 00:05:16,279 Speaker 1: a variety of ways. We're exploring experiences with text, like 83 00:05:16,480 --> 00:05:21,000 Speaker 1: chat in WhatsApp and Messenger, with images, like creative Instagram 84 00:05:21,040 --> 00:05:25,760 Speaker 1: filters and ad formats, and with video and multimodal experiences. 85 00:05:26,120 --> 00:05:28,400 Speaker 1: We have a lot of foundational work to do before 86 00:05:28,440 --> 00:05:32,200 Speaker 1: getting to the really futuristic experiences, but I'm excited about 87 00:05:32,240 --> 00:05:35,640 Speaker 1: all the new things we'll build along the way, end quote. 88 00:05:36,000 --> 00:05:38,680 Speaker 1: So it sounds like Meta, like Musk, is on the 89 00:05:38,760 --> 00:05:42,440 Speaker 1: path to create its own AI approach, or perhaps Meta 90 00:05:42,480 --> 00:05:44,719 Speaker 1: will turn to OpenAI to tap into the power 91 00:05:44,720 --> 00:05:48,640 Speaker 1: of ChatGPT. It's still early days and we're not 92 00:05:48,920 --> 00:05:53,359 Speaker 1: done with AI yet. Snapchat is also jumping into the 93 00:05:53,400 --> 00:05:58,200 Speaker 1: AI game with a product it calls My AI. Only 94 00:05:58,240 --> 00:06:02,600 Speaker 1: subscribers to Snapchat Plus will have access to this. According 95 00:06:02,640 --> 00:06:05,640 Speaker 1: to Snapchat, the AI will do stuff like, if you 96 00:06:05,680 --> 00:06:08,320 Speaker 1: ask it, it will give you recommendations for presents that 97 00:06:08,400 --> 00:06:11,840 Speaker 1: you could buy friends and family. I mean, presumably Snapchat 98 00:06:11,880 --> 00:06:14,440 Speaker 1: would scan everything you've ever said to these people and 99 00:06:14,520 --> 00:06:17,640 Speaker 1: start to pull suggestions out of that, or it might 100 00:06:17,680 --> 00:06:21,120 Speaker 1: give suggestions about things you could do with somebody, like, hey, 101 00:06:21,279 --> 00:06:23,680 Speaker 1: I want to hang out with so-and-so, what's 102 00:06:23,720 --> 00:06:25,800 Speaker 1: a good activity? And it might say, well, they really 103 00:06:25,800 --> 00:06:28,080 Speaker 1: like the outdoors, how about you go hiking, that kind 104 00:06:28,120 --> 00:06:32,400 Speaker 1: of thing. You can also apparently name this AI.
However, 105 00:06:32,440 --> 00:06:35,280 Speaker 1: the company also owns up to the fact that ChatGPT, 106 00:06:35,920 --> 00:06:41,320 Speaker 1: which is the system powering My AI, isn't always reliable. 107 00:06:41,600 --> 00:06:43,960 Speaker 1: We've talked about that a lot over the last couple 108 00:06:43,960 --> 00:06:47,840 Speaker 1: of months, or as Snapchat actually put it, quote, as 109 00:06:47,920 --> 00:06:52,359 Speaker 1: with all AI-powered chatbots, My AI is prone to 110 00:06:52,400 --> 00:06:55,880 Speaker 1: hallucination and can be tricked into saying just about anything. 111 00:06:56,279 --> 00:07:01,159 Speaker 1: End quote. Oh. Also, all that communication with AI is 112 00:07:01,200 --> 00:07:05,440 Speaker 1: logged for the purposes of review and development, so anything 113 00:07:05,480 --> 00:07:09,840 Speaker 1: you do say to My AI is being recorded. So 114 00:07:09,920 --> 00:07:12,680 Speaker 1: that means it's best not to confide in My AI 115 00:07:12,920 --> 00:07:16,720 Speaker 1: all your secrets, like Grandma's chocolate chip cookie recipe or 116 00:07:16,760 --> 00:07:20,720 Speaker 1: where you hid the bodies, because someone somewhere could be 117 00:07:20,760 --> 00:07:26,440 Speaker 1: reading over that log. The whole announcement dedicates a surprising 118 00:07:26,560 --> 00:07:29,320 Speaker 1: amount of space to warning users that the tool might 119 00:07:29,400 --> 00:07:33,360 Speaker 1: not work as intended, and that almost raises the question 120 00:07:33,400 --> 00:07:36,520 Speaker 1: about why they're deploying this tool so early. If they're 121 00:07:36,560 --> 00:07:40,760 Speaker 1: taking this amount of effort to say, hey, y'all, this 122 00:07:40,800 --> 00:07:44,640 Speaker 1: thing might go haywire, and they really point out all 123 00:07:44,680 --> 00:07:47,520 Speaker 1: the different ways that you can flag stuff so that 124 00:07:47,560 --> 00:07:51,160 Speaker 1: people can review it and thus address any issues that 125 00:07:51,200 --> 00:07:55,360 Speaker 1: pop up. Like, it's a significant amount of the 126 00:07:55,400 --> 00:07:59,840 Speaker 1: announcement that is all about covering their butts, so to speak. 127 00:08:00,160 --> 00:08:03,040 Speaker 1: So I suppose one answer as to why they're deploying 128 00:08:03,040 --> 00:08:07,240 Speaker 1: it so early is that this turns Snapchat Plus subscribers 129 00:08:07,600 --> 00:08:11,280 Speaker 1: into QA testers that they don't have to pay. Right? 130 00:08:11,400 --> 00:08:15,240 Speaker 1: These aren't employees. They could turn the community into the 131 00:08:15,320 --> 00:08:20,680 Speaker 1: QA team. It's the basic concept behind open beta programs. Right? 132 00:08:21,320 --> 00:08:25,640 Speaker 1: You find out, by using a wide deployment, where the 133 00:08:25,680 --> 00:08:28,400 Speaker 1: problems are, and then you fix them before, you know, 134 00:08:28,440 --> 00:08:31,040 Speaker 1: you deploy it to an even larger audience in the future. 135 00:08:31,520 --> 00:08:35,440 Speaker 1: Jordan Parker Erb wrote a piece for Insider titled I 136 00:08:35,480 --> 00:08:38,880 Speaker 1: asked ChatGPT to write messages to my Tinder matches. 137 00:08:39,280 --> 00:08:43,559 Speaker 1: A dating coach said they gave off a creepy vibe. Now, 138 00:08:43,559 --> 00:08:46,439 Speaker 1: I don't think anyone's really surprised by that.
Heck, if 139 00:08:46,440 --> 00:08:50,640 Speaker 1: you, again, turned on Nothing, Forever, the AI-powered endless 140 00:08:50,720 --> 00:08:53,600 Speaker 1: Seinfeld episode, you would probably guess this would be the 141 00:08:53,640 --> 00:08:57,000 Speaker 1: inevitable outcome, because those episodes can get a little unsettling 142 00:08:57,440 --> 00:09:01,160 Speaker 1: as well. And this piece in Insider indicates that 143 00:09:01,240 --> 00:09:05,920 Speaker 1: ChatGPT's responses fell into some pretty common traps when 144 00:09:06,000 --> 00:09:10,040 Speaker 1: one attempts to navigate the complicated world of modern dating, 145 00:09:10,640 --> 00:09:16,160 Speaker 1: namely that ChatGPT wrote responses that are way too long. This, 146 00:09:16,320 --> 00:09:19,199 Speaker 1: by the way, tells me that if I were single, 147 00:09:19,920 --> 00:09:22,920 Speaker 1: I would probably be single forever at this point, because, 148 00:09:23,120 --> 00:09:25,080 Speaker 1: come on, y'all, there's no denying I will use a 149 00:09:25,160 --> 00:09:28,040 Speaker 1: thousand words when ten would do just fine. So I 150 00:09:28,080 --> 00:09:31,640 Speaker 1: would never do well on these kinds of apps. Also, 151 00:09:31,760 --> 00:09:36,400 Speaker 1: ChatGPT leaned heavily on its emoji game, and as 152 00:09:36,480 --> 00:09:38,960 Speaker 1: the title of the piece points out, some of the 153 00:09:39,000 --> 00:09:43,240 Speaker 1: responses came across as creepy. Also, the coach pointed out 154 00:09:43,280 --> 00:09:46,559 Speaker 1: that it's best for folks to just be themselves when 155 00:09:46,640 --> 00:09:49,719 Speaker 1: using dating apps, because otherwise your prospective date will get 156 00:09:49,720 --> 00:09:52,400 Speaker 1: the wrong impression, and that pretty much guarantees things aren't 157 00:09:52,440 --> 00:09:56,440 Speaker 1: going to go well. Like if a Tinder match thinks 158 00:09:56,480 --> 00:09:58,880 Speaker 1: that the response was really cool, but the really cool 159 00:09:58,880 --> 00:10:02,719 Speaker 1: response was written by AI, and then they meet you 160 00:10:02,880 --> 00:10:06,520 Speaker 1: and you do not have the same vibe. That's a problem. 161 00:10:06,600 --> 00:10:11,880 Speaker 1: I feel like I'm describing almost every romantic comedy that 162 00:10:12,000 --> 00:10:15,840 Speaker 1: was written in the eighties and centered on teenagers, except 163 00:10:15,920 --> 00:10:18,120 Speaker 1: that instead of it being AI, it's typically, you know, 164 00:10:18,160 --> 00:10:22,719 Speaker 1: the well-meaning popular kids who are attempting to transform 165 00:10:22,760 --> 00:10:26,120 Speaker 1: a person so that they become popular. It feels like that, 166 00:10:26,240 --> 00:10:30,040 Speaker 1: except I guess I'm describing the next generation of teen 167 00:10:30,200 --> 00:10:33,240 Speaker 1: centered comedies. I would not be surprised if we find 168 00:10:33,240 --> 00:10:36,920 Speaker 1: a movie like that. Anyway, I highly recommend reading the 169 00:10:37,000 --> 00:10:41,040 Speaker 1: actual article. Some of the AI-generated examples that Jordan 170 00:10:41,040 --> 00:10:45,920 Speaker 1: shares are absolutely hilarious in that awkward, cringey sitcom 171 00:10:46,040 --> 00:10:49,400 Speaker 1: kind of way. Again, it's a piece in Insider, and 172 00:10:49,520 --> 00:10:52,199 Speaker 1: it is called I asked ChatGPT to write messages 173 00:10:52,240 --> 00:10:54,400 Speaker 1: to my Tinder matches.
You could just search for that. 174 00:10:54,840 --> 00:10:57,720 Speaker 1: I recommend reading it. It's good for, you know, a laugh. 175 00:10:58,320 --> 00:11:02,080 Speaker 1: Of course, AI goes well beyond chatbots and machine learning. 176 00:11:02,240 --> 00:11:05,080 Speaker 1: We've talked about other uses of AI and the dangers 177 00:11:05,160 --> 00:11:08,840 Speaker 1: that they can present. One example that springs to mind 178 00:11:08,880 --> 00:11:13,000 Speaker 1: because it comes up time and again is facial recognition technology. 179 00:11:13,520 --> 00:11:18,360 Speaker 1: Even if the application of this technology is benign, there 180 00:11:18,360 --> 00:11:22,760 Speaker 1: are frequently problems with the underlying tech. Unintended bias has 181 00:11:22,760 --> 00:11:26,800 Speaker 1: been a huge issue with facial recognition technology for years, 182 00:11:26,920 --> 00:11:30,600 Speaker 1: ranging from some services being unable to detect a person 183 00:11:30,679 --> 00:11:34,760 Speaker 1: of color's face properly to misidentification, which can lead to 184 00:11:34,840 --> 00:11:38,960 Speaker 1: traumatic experiences such as being targeted by law enforcement simply 185 00:11:39,000 --> 00:11:42,880 Speaker 1: because a computer can't tell the difference between different people. 186 00:11:43,400 --> 00:11:48,480 Speaker 1: Frequently, again, it's people of color, and they are disproportionately targeted 187 00:11:48,520 --> 00:11:52,319 Speaker 1: and affected by such technology. Well, last week New Scientist 188 00:11:52,559 --> 00:11:58,160 Speaker 1: presented another example, one with truly grim and terrifying implications. 189 00:11:58,559 --> 00:12:01,560 Speaker 1: The magazine found a contract between a tech company called 190 00:12:01,679 --> 00:12:06,120 Speaker 1: RealNetworks and the United States Air Force. RealNetworks 191 00:12:06,160 --> 00:12:10,400 Speaker 1: offers a facial recognition platform that they call Secure Accurate 192 00:12:10,600 --> 00:12:14,440 Speaker 1: Facial Recognition, or SAFR, and the implication is that the 193 00:12:14,480 --> 00:12:17,960 Speaker 1: Air Force is incorporating this technology into its unmanned aerial 194 00:12:18,120 --> 00:12:22,680 Speaker 1: vehicle, or UAV, program, you know, drones. A lack of 195 00:12:22,720 --> 00:12:25,320 Speaker 1: information has led to some speculation, some of which I 196 00:12:25,320 --> 00:12:30,199 Speaker 1: think is definitely understandable and believable. After all, Special Forces 197 00:12:30,280 --> 00:12:34,199 Speaker 1: units have been involved in clandestine operations that are at 198 00:12:34,280 --> 00:12:39,199 Speaker 1: least difficult to separate from stuff like assassinations, and sometimes impossible. 199 00:12:39,280 --> 00:12:43,280 Speaker 1: Sometimes it's just outright an assassination. So it is not 200 00:12:43,320 --> 00:12:47,080 Speaker 1: a stretch to imagine a unit like a Special Forces 201 00:12:47,160 --> 00:12:51,160 Speaker 1: unit making use of a drone with this technology in 202 00:12:51,240 --> 00:12:55,600 Speaker 1: an effort to identify and acquire targets. But the potential 203 00:12:55,760 --> 00:12:59,200 Speaker 1: for misuse of such technology, let alone the chance that 204 00:12:59,280 --> 00:13:02,640 Speaker 1: the tech could make a mistake, has led critics to 205 00:13:02,720 --> 00:13:05,160 Speaker 1: raise the alarm about this approach, and I think that 206 00:13:05,280 --> 00:13:10,160 Speaker 1: is a wise reaction.
Even if the tech works perfectly, 207 00:13:10,840 --> 00:13:12,640 Speaker 1: you still have to wrestle with the fact that people 208 00:13:12,720 --> 00:13:17,319 Speaker 1: can sometimes be the absolute worst. They can abuse technology 209 00:13:17,400 --> 00:13:19,800 Speaker 1: for their own purposes, and when it comes to something 210 00:13:19,840 --> 00:13:25,600 Speaker 1: as potentially lethal as this, that is a major problem. Okay, 211 00:13:25,640 --> 00:13:27,960 Speaker 1: we're going to take a quick break. When we come back, 212 00:13:28,000 --> 00:13:39,960 Speaker 1: we'll have a lot more tech news to cover. We're back, 213 00:13:40,000 --> 00:13:43,240 Speaker 1: and we're not done with AI yet. I do promise 214 00:13:43,280 --> 00:13:45,800 Speaker 1: we have other stories besides AI, but we've got a 215 00:13:45,840 --> 00:13:49,800 Speaker 1: couple more to get through, and one of the stories 216 00:13:49,800 --> 00:13:52,280 Speaker 1: we have is about how AI is complicated, not just 217 00:13:52,360 --> 00:13:56,600 Speaker 1: because of the technology, but because of people and the 218 00:13:56,640 --> 00:14:02,720 Speaker 1: way we react to AI and interact with AI. I 219 00:14:02,760 --> 00:14:07,000 Speaker 1: think that this is a truly fascinating topic that relates 220 00:14:07,000 --> 00:14:11,040 Speaker 1: heavily to both psychology and technology. So I want to 221 00:14:11,040 --> 00:14:13,880 Speaker 1: talk about a recent study out of my alma mater, 222 00:14:14,280 --> 00:14:18,480 Speaker 1: the University of Georgia. Go Bulldogs. Nicole Davis, who is 223 00:14:18,480 --> 00:14:22,400 Speaker 1: a graduate student at UGA, participated in a research project 224 00:14:22,800 --> 00:14:26,360 Speaker 1: that I think is both interesting and has some upsetting 225 00:14:26,400 --> 00:14:31,400 Speaker 1: but not really surprising conclusions. The project brought together a 226 00:14:31,480 --> 00:14:34,680 Speaker 1: bunch of people and then asked them questions about stereotypes 227 00:14:34,720 --> 00:14:39,560 Speaker 1: that relate to white, Black, and Asian ethnic groups, and generally, 228 00:14:39,600 --> 00:14:43,760 Speaker 1: the responses indicated that people saw Asians as being 229 00:14:43,800 --> 00:14:48,800 Speaker 1: the most competent people and Black people the least competent 230 00:14:49,160 --> 00:14:54,120 Speaker 1: people for any given task. I guess it's a really 231 00:14:54,200 --> 00:14:58,480 Speaker 1: ugly stereotype, but it's also undeniably a pretty common one. 232 00:14:59,160 --> 00:15:01,720 Speaker 1: Then the users were given a task, and it was 233 00:15:01,760 --> 00:15:03,840 Speaker 1: to try and find a way to reduce the expense 234 00:15:03,880 --> 00:15:07,280 Speaker 1: of a vacation rental, and they were going to make 235 00:15:07,440 --> 00:15:11,400 Speaker 1: use of an AI-powered bot, a chatbot. They had 236 00:15:11,440 --> 00:15:15,840 Speaker 1: a little avatar representing the AI. So these were cartoonish 237 00:15:15,840 --> 00:15:18,680 Speaker 1: avatars, and there were some that were white, some that 238 00:15:18,720 --> 00:15:21,880 Speaker 1: were Black, and some that were Asian in design, and 239 00:15:22,000 --> 00:15:25,840 Speaker 1: the users were later asked to comment on the bot's performance, 240 00:15:25,960 --> 00:15:30,600 Speaker 1: specifically how human and warm it was and how competent 241 00:15:30,680 --> 00:15:34,480 Speaker 1: it was at helping the user reduce the vacation rental cost.
242 00:15:35,080 --> 00:15:38,640 Speaker 1: Davis said, quote, when we asked about the bot, we 243 00:15:38,760 --> 00:15:43,200 Speaker 1: saw perceptions change. Even if they said yes, I feel 244 00:15:43,200 --> 00:15:47,000 Speaker 1: like Black people are less competent, they also said yes, 245 00:15:47,440 --> 00:15:51,680 Speaker 1: I feel like the Black AIs were more competent, end quote, Davis said. 246 00:15:51,720 --> 00:15:56,400 Speaker 1: This is an example of expectation violation theory, which posits 247 00:15:56,440 --> 00:15:59,080 Speaker 1: that if someone enters into a situation and they have 248 00:15:59,200 --> 00:16:02,960 Speaker 1: low expectations and then their experience is a positive one, 249 00:16:03,280 --> 00:16:08,080 Speaker 1: they walk away feeling that it was an overwhelmingly positive experience, 250 00:16:08,760 --> 00:16:12,360 Speaker 1: not just that it was good, but, because it exceeded their expectations, 251 00:16:12,640 --> 00:16:15,720 Speaker 1: it was even better than that. Davis goes on to 252 00:16:15,760 --> 00:16:17,800 Speaker 1: say that more research is needed to find ways in 253 00:16:17,800 --> 00:16:21,840 Speaker 1: which bot representation can help to impact consumer perception in 254 00:16:21,880 --> 00:16:26,560 Speaker 1: positive ways, like perhaps breaking down barriers they might otherwise 255 00:16:26,640 --> 00:16:30,920 Speaker 1: have because of these stereotypes that they maybe unconsciously have 256 00:16:31,240 --> 00:16:34,920 Speaker 1: of different people. But this is obviously a complicated and 257 00:16:35,080 --> 00:16:39,400 Speaker 1: sensitive challenge. Amazon has been using AI to help monitor 258 00:16:39,440 --> 00:16:42,440 Speaker 1: delivery drivers for a while now, but this recently got 259 00:16:42,480 --> 00:16:45,920 Speaker 1: more attention when a TikTok user with the handle Amber 260 00:16:46,040 --> 00:16:50,720 Speaker 1: Gertz gave an explanation of how the delivery truck's camera 261 00:16:50,760 --> 00:16:54,720 Speaker 1: systems monitor driver behavior. She is an Amazon delivery driver. 262 00:16:55,160 --> 00:16:58,640 Speaker 1: She created this TikTok that explains the whole thing, and 263 00:16:59,480 --> 00:17:03,480 Speaker 1: she says that the system logs violations if a driver 264 00:17:03,800 --> 00:17:06,879 Speaker 1: breaks protocol in any way. This can include stuff like 265 00:17:07,040 --> 00:17:10,760 Speaker 1: failing to come to a complete stop at a stop sign, which, hey, 266 00:17:10,760 --> 00:17:13,720 Speaker 1: that makes sense. This is like one of the biggest 267 00:17:13,800 --> 00:17:18,320 Speaker 1: violations a driver can commit, and you definitely need something 268 00:17:18,400 --> 00:17:21,479 Speaker 1: to help ensure drivers follow this process because, I mean, 269 00:17:21,520 --> 00:17:24,600 Speaker 1: they're on the road all day, so they have the 270 00:17:24,640 --> 00:17:28,439 Speaker 1: potential for getting involved in collisions more than the average 271 00:17:28,480 --> 00:17:30,480 Speaker 1: person does. You know, the average person is not on 272 00:17:30,520 --> 00:17:33,960 Speaker 1: the road all day. And it also tracks whether or 273 00:17:34,000 --> 00:17:36,440 Speaker 1: not the driver has buckled their seat belt at the 274 00:17:36,480 --> 00:17:38,879 Speaker 1: conclusion of each stop, or whether or not they've gotten 275 00:17:38,880 --> 00:17:41,800 Speaker 1: out of their seat.
However, the system will also trigger 276 00:17:41,800 --> 00:17:45,120 Speaker 1: if a driver takes a drink without first pulling over 277 00:17:45,200 --> 00:17:46,800 Speaker 1: to the side of the road to come to a 278 00:17:46,840 --> 00:17:50,040 Speaker 1: complete stop. So if you've got your morning coffee with 279 00:17:50,119 --> 00:17:52,080 Speaker 1: you and you're an Amazon delivery driver, you have to 280 00:17:52,119 --> 00:17:53,760 Speaker 1: come to a complete stop before you can take a 281 00:17:53,760 --> 00:17:58,000 Speaker 1: sip of coffee, or you will get your image 282 00:17:58,040 --> 00:18:03,879 Speaker 1: captured and a violation will be put on your profile. Also, 283 00:18:04,480 --> 00:18:07,400 Speaker 1: drivers aren't allowed to touch the center console without first 284 00:18:07,400 --> 00:18:11,040 Speaker 1: stopping because that's considered a distraction. And the cameras are 285 00:18:11,080 --> 00:18:13,760 Speaker 1: not providing a live feed for the whole day. It's 286 00:18:13,760 --> 00:18:17,720 Speaker 1: not like there's some security office within Amazon where there's 287 00:18:17,760 --> 00:18:22,280 Speaker 1: this one person looking at a wall of monitors trying 288 00:18:22,320 --> 00:18:24,280 Speaker 1: to keep up with all these different drivers. It would 289 00:18:24,320 --> 00:18:28,800 Speaker 1: be impossible to do that. Instead, AI incorporated into the 290 00:18:28,840 --> 00:18:32,920 Speaker 1: system monitors the camera view and captures video should a 291 00:18:33,040 --> 00:18:36,320 Speaker 1: driver do anything that violates these policies, and it's all 292 00:18:36,359 --> 00:18:38,720 Speaker 1: in the name of safety. As Amber Gertz says in 293 00:18:38,800 --> 00:18:42,200 Speaker 1: the TikTok, pretty much every Amazon driver hates this system, 294 00:18:42,640 --> 00:18:47,080 Speaker 1: which includes multiple cameras set up within the vehicle and 295 00:18:47,160 --> 00:18:50,760 Speaker 1: also forward-facing cameras to keep track of things like how far 296 00:18:50,800 --> 00:18:53,159 Speaker 1: away you are from the traffic in front of you. 297 00:18:53,720 --> 00:18:57,639 Speaker 1: But she also generously says this is all in an 298 00:18:57,680 --> 00:19:02,240 Speaker 1: effort to keep drivers and others safe. Also, Amazon drivers can 299 00:19:02,320 --> 00:19:05,440 Speaker 1: dispute violation reports, and Gertz even mentions a case where 300 00:19:05,680 --> 00:19:08,480 Speaker 1: a driver scratched his beard while he was driving and 301 00:19:08,520 --> 00:19:12,160 Speaker 1: the system mistakenly believed he was talking on a cell 302 00:19:12,200 --> 00:19:15,200 Speaker 1: phone and so dinged him with a violation, and so 303 00:19:15,240 --> 00:19:17,560 Speaker 1: he was able to dispute that and get it reversed. 304 00:19:17,600 --> 00:19:20,520 Speaker 1: Now I can honestly say I feel really conflicted about 305 00:19:20,640 --> 00:19:23,399 Speaker 1: this whole approach. On the one hand, this is taking 306 00:19:23,440 --> 00:19:26,800 Speaker 1: employee surveillance to the extreme, there's no doubt about it. 307 00:19:27,240 --> 00:19:29,400 Speaker 1: But on the other hand, the system has also allegedly 308 00:19:29,440 --> 00:19:32,879 Speaker 1: contributed to a reduction in collision rates of thirty-five percent, 309 00:19:33,320 --> 00:19:37,320 Speaker 1: and considering that collisions often result in injuries and property damage, 310 00:19:37,920 --> 00:19:41,040 Speaker 1: that's significant.
And I kind of wonder what Ben Franklin 311 00:19:41,080 --> 00:19:43,520 Speaker 1: would have to say about all this, with his views 312 00:19:43,520 --> 00:19:46,560 Speaker 1: on liberty versus safety and all. By the way, that 313 00:19:46,720 --> 00:19:51,520 Speaker 1: famous quote is more complicated than the quote itself would indicate. 314 00:19:51,840 --> 00:19:55,120 Speaker 1: I recommend looking into what he was talking about when 315 00:19:55,119 --> 00:19:58,560 Speaker 1: he was chatting about liberty and safety. And hey, I 316 00:19:58,600 --> 00:20:02,280 Speaker 1: mentioned TikTok. Let's talk about that really quickly. Canada has 317 00:20:02,320 --> 00:20:06,120 Speaker 1: now banned TikTok from federal government devices. The White House 318 00:20:06,119 --> 00:20:07,919 Speaker 1: here in the United States has done the same and 319 00:20:07,960 --> 00:20:12,160 Speaker 1: has given federal employees thirty days to wipe TikTok off 320 00:20:12,200 --> 00:20:15,600 Speaker 1: any government-owned devices. There are a few special cases 321 00:20:15,640 --> 00:20:19,119 Speaker 1: where there are exceptions for things like security research or 322 00:20:19,240 --> 00:20:22,879 Speaker 1: law enforcement, but for the most part this is a 323 00:20:23,960 --> 00:20:27,400 Speaker 1: federal government-wide ban. Several state governments in the US 324 00:20:27,480 --> 00:20:29,960 Speaker 1: have done the same sort of thing. The EU has 325 00:20:30,000 --> 00:20:33,280 Speaker 1: started to take action as well. For the US and Canada, 326 00:20:33,359 --> 00:20:36,399 Speaker 1: the main concern here is that TikTok's parent company, ByteDance, 327 00:20:36,520 --> 00:20:39,679 Speaker 1: is a Chinese company, and as such could potentially be 328 00:20:39,760 --> 00:20:42,240 Speaker 1: scouring the app for data in an effort to gather 329 00:20:42,320 --> 00:20:46,760 Speaker 1: intelligence on behalf of the Chinese government, specifically the Communist Party. 330 00:20:47,119 --> 00:20:50,119 Speaker 1: For the EU, it gets a little more complicated because 331 00:20:50,160 --> 00:20:53,480 Speaker 1: even if you ignore the connection to China, TikTok itself 332 00:20:54,000 --> 00:20:56,080 Speaker 1: is based in the United States, and the EU is 333 00:20:56,119 --> 00:20:59,160 Speaker 1: a real stickler when it comes to protecting EU citizen 334 00:20:59,240 --> 00:21:04,639 Speaker 1: data from being collected and exploited, and that includes keeping 335 00:21:04,760 --> 00:21:08,120 Speaker 1: the information of the government safe, so they don't want 336 00:21:08,160 --> 00:21:12,320 Speaker 1: the US to just get access to that. Meanwhile, China's 337 00:21:12,359 --> 00:21:15,840 Speaker 1: Foreign Ministry Office issued a statement saying the US quote 338 00:21:16,280 --> 00:21:19,960 Speaker 1: has been overstretching the concept of national security and abusing 339 00:21:20,000 --> 00:21:24,560 Speaker 1: state power to suppress other countries' companies. How unsure of itself 340 00:21:24,640 --> 00:21:28,800 Speaker 1: can the US, the world's top superpower, be to fear 341 00:21:29,040 --> 00:21:34,760 Speaker 1: a favorite young person's favorite app to such a degree, 342 00:21:34,960 --> 00:21:37,760 Speaker 1: end quote. First of all, I don't think that favorite 343 00:21:37,800 --> 00:21:41,200 Speaker 1: thing was meant to be repeated, but secondly, shots fired, 344 00:21:41,280 --> 00:21:45,040 Speaker 1: China. Sick burn.
Of course, I should also point out 345 00:21:45,119 --> 00:21:48,760 Speaker 1: that there are literal laws in China that compel citizens 346 00:21:48,800 --> 00:21:51,320 Speaker 1: and companies to act as agents gathering 347 00:21:51,320 --> 00:21:55,960 Speaker 1: intelligence on behalf of the Communist Party, so there's not a healthy 348 00:21:56,040 --> 00:22:01,080 Speaker 1: leg to stand on there. Also, China, oddly enough, has 349 00:22:01,240 --> 00:22:05,960 Speaker 1: famously blocked tons of apps and services originating in the 350 00:22:05,960 --> 00:22:09,680 Speaker 1: West in an effort to prevent their citizens from accessing them. 351 00:22:09,680 --> 00:22:13,080 Speaker 1: So again, not exactly taking the high ground on that 352 00:22:13,119 --> 00:22:19,639 Speaker 1: front either, but yeah, you sure zinged us, China. LastPass, 353 00:22:19,840 --> 00:22:23,240 Speaker 1: the password vault company, revealed that hackers were able to 354 00:22:23,280 --> 00:22:27,320 Speaker 1: access an employee's home computer, and in the process they 355 00:22:27,400 --> 00:22:31,919 Speaker 1: got access to a decrypted vault, a corporate vault, not 356 00:22:32,000 --> 00:22:35,120 Speaker 1: a user vault. This is on the corporate side. Now, 357 00:22:35,160 --> 00:22:38,320 Speaker 1: you might recall that the same service revealed last year 358 00:22:38,440 --> 00:22:42,440 Speaker 1: that hackers had penetrated some customer vaults through other means. Currently, 359 00:22:42,520 --> 00:22:45,000 Speaker 1: LastPass says it does not look like this attack 360 00:22:45,080 --> 00:22:48,960 Speaker 1: and those previous attacks were connected at all. Whatever the case, 361 00:22:49,320 --> 00:22:53,000 Speaker 1: LastPass users should change not only all the passwords 362 00:22:53,000 --> 00:22:56,200 Speaker 1: that they had stored in LastPass's vault, but also 363 00:22:56,240 --> 00:22:59,879 Speaker 1: their master password for their LastPass account. This is 364 00:23:00,119 --> 00:23:03,960 Speaker 1: a worst-case scenario, and while Ars Technica points out 365 00:23:04,280 --> 00:23:07,240 Speaker 1: that we do not know yet if hackers have access 366 00:23:07,280 --> 00:23:11,480 Speaker 1: to individual users' vaults and their passwords, you have to 367 00:23:11,520 --> 00:23:15,000 Speaker 1: operate under the assumption that they do, and that further, 368 00:23:15,520 --> 00:23:18,080 Speaker 1: this data could end up being sold on the dark web, 369 00:23:18,160 --> 00:23:20,800 Speaker 1: so you definitely want to get out there and start 370 00:23:20,880 --> 00:23:24,800 Speaker 1: changing this stuff now. I've long advocated for password vaults 371 00:23:24,920 --> 00:23:27,560 Speaker 1: as they make the worst parts about passwords a little 372 00:23:27,600 --> 00:23:30,680 Speaker 1: more user-friendly. That is, by using a password vault, 373 00:23:31,080 --> 00:23:35,680 Speaker 1: it's easier to create unique, strong passwords for every service 374 00:23:35,760 --> 00:23:39,520 Speaker 1: that you access. These passwords are difficult to crack, but 375 00:23:39,600 --> 00:23:43,879 Speaker 1: they're also hard to remember, and because they're all unique, 376 00:23:44,240 --> 00:23:48,320 Speaker 1: you've got this ton of different passwords that are hard 377 00:23:48,359 --> 00:23:52,280 Speaker 1: for you to just keep in your memory.
So it 378 00:23:52,320 --> 00:23:54,639 Speaker 1: gets to the point where it can be impossible to 379 00:23:54,760 --> 00:23:57,159 Speaker 1: remember all of your unique passwords. So a vault's a 380 00:23:57,160 --> 00:24:01,360 Speaker 1: great solution unless something like this happens. And while these 381 00:24:01,359 --> 00:24:05,119 Speaker 1: security events are rare, we've seen they're not impossible and 382 00:24:05,240 --> 00:24:08,159 Speaker 1: that it then falls to us to take action to 383 00:24:08,200 --> 00:24:11,320 Speaker 1: make sure we keep our data and our services as 384 00:24:11,320 --> 00:24:14,439 Speaker 1: safe as possible. LastPass is not the only target 385 00:24:14,480 --> 00:24:17,440 Speaker 1: to have a catastrophic breach. Another is the United States 386 00:24:17,560 --> 00:24:21,720 Speaker 1: Marshals Service, which announced last week that attackers were able 387 00:24:21,760 --> 00:24:26,440 Speaker 1: to gain access to secure systems, or assumed secure systems, 388 00:24:26,440 --> 00:24:29,960 Speaker 1: and potentially retrieve sensitive information. The service did say that 389 00:24:30,000 --> 00:24:33,359 Speaker 1: the information may include data about subjects who are currently 390 00:24:33,480 --> 00:24:39,440 Speaker 1: under investigation. It might include administrative information and also personal 391 00:24:39,520 --> 00:24:42,600 Speaker 1: data regarding some of the staff of the agency, among 392 00:24:42,680 --> 00:24:47,000 Speaker 1: other things. However, one system that they said was unaffected 393 00:24:47,200 --> 00:24:50,520 Speaker 1: was the Witness Security Program, which is more commonly known 394 00:24:50,560 --> 00:24:54,120 Speaker 1: here in the US as the Witness Protection Program. This 395 00:24:54,160 --> 00:24:57,359 Speaker 1: is the famous program that aims to create new identities 396 00:24:57,359 --> 00:25:01,360 Speaker 1: for witnesses who are involved in cases for major crimes, 397 00:25:01,880 --> 00:25:03,920 Speaker 1: and this is all in an effort to keep those 398 00:25:03,960 --> 00:25:08,000 Speaker 1: witnesses safe from retaliation. It's pretty much a key ingredient 399 00:25:08,440 --> 00:25:11,120 Speaker 1: in a ton of movies and TV shows that are 400 00:25:11,200 --> 00:25:15,480 Speaker 1: about the mafia. It's frequent that someone gets put into 401 00:25:15,520 --> 00:25:18,640 Speaker 1: witness protection so that the mafia is unable to track 402 00:25:18,720 --> 00:25:22,320 Speaker 1: them down and target them. According to the agency's representatives, 403 00:25:22,480 --> 00:25:25,439 Speaker 1: the hackers were unable to breach that particular database, so 404 00:25:25,520 --> 00:25:29,160 Speaker 1: that is some good news. Okay, we're gonna take another 405 00:25:29,200 --> 00:25:31,520 Speaker 1: quick break. When we come back, I've got a few 406 00:25:31,560 --> 00:25:44,040 Speaker 1: more news stories that we need to talk about. We're back, 407 00:25:44,400 --> 00:25:48,120 Speaker 1: all right. So last year, News Corp, that's Rupert Murdoch's 408 00:25:48,160 --> 00:25:51,720 Speaker 1: company that owns multiple newspapers and some other media outlets, 409 00:25:52,200 --> 00:25:55,480 Speaker 1: announced that hackers had gained access to corporate systems. We 410 00:25:55,520 --> 00:26:00,119 Speaker 1: found out about this in February twenty twenty two.
However, now 411 00:26:00,200 --> 00:26:02,679 Speaker 1: we have a little extra information, and it's that the 412 00:26:02,760 --> 00:26:08,359 Speaker 1: hackers had essentially embedded themselves inside News Corp's systems for 413 00:26:08,560 --> 00:26:12,360 Speaker 1: nearly two years. In a recent letter to at least 414 00:26:12,400 --> 00:26:16,760 Speaker 1: some of the company's employees, the corporation revealed, quote, based 415 00:26:16,760 --> 00:26:20,800 Speaker 1: on the investigation, News Corp understands that between February twenty 416 00:26:20,880 --> 00:26:25,200 Speaker 1: twenty and January twenty twenty two, an unauthorized party gained 417 00:26:25,280 --> 00:26:29,040 Speaker 1: access to certain business documents and emails from a limited 418 00:26:29,200 --> 00:26:32,680 Speaker 1: number of its personnel's accounts in the affected system, some 419 00:26:32,760 --> 00:26:36,840 Speaker 1: of which contained personal information, end quote. The letter also 420 00:26:36,840 --> 00:26:40,000 Speaker 1: says that News Corp doesn't believe the intrusion was focused on 421 00:26:40,080 --> 00:26:44,080 Speaker 1: stealing personal data, and that identity theft is likely not 422 00:26:44,480 --> 00:26:47,400 Speaker 1: the purpose of this attack, but rather that the intruders 423 00:26:47,400 --> 00:26:50,720 Speaker 1: were gathering intelligence. When you look into the information that 424 00:26:50,760 --> 00:26:55,240 Speaker 1: the hackers were able to access, it gets pretty gross 425 00:26:55,280 --> 00:26:58,360 Speaker 1: for the employees who are affected, because it includes not 426 00:26:58,440 --> 00:27:02,920 Speaker 1: just stuff like their names and addresses and birth dates, 427 00:27:02,920 --> 00:27:06,440 Speaker 1: but also things like their Social Security number, their driver's 428 00:27:06,440 --> 00:27:10,320 Speaker 1: license number, their passport number, that kind of thing. It's 429 00:27:10,400 --> 00:27:14,080 Speaker 1: understandable that employees who are affected would be very much 430 00:27:14,119 --> 00:27:17,560 Speaker 1: concerned about this, so the company is providing affected employees 431 00:27:17,560 --> 00:27:21,840 Speaker 1: the option of Experian services to protect against identity theft 432 00:27:21,840 --> 00:27:25,320 Speaker 1: and that kind of thing. The identity of the attackers 433 00:27:25,760 --> 00:27:29,080 Speaker 1: remains unknown, so it's not really possible to say definitively 434 00:27:29,600 --> 00:27:32,280 Speaker 1: what they were up to or how they intend to 435 00:27:32,359 --> 00:27:35,919 Speaker 1: use the information they accessed. The leading hypothesis is that 436 00:27:36,000 --> 00:27:39,480 Speaker 1: the attackers were aligned with the Chinese government, so this 437 00:27:39,560 --> 00:27:42,680 Speaker 1: could be an example of a state-sponsored attack, but 438 00:27:42,880 --> 00:27:46,520 Speaker 1: from what I've seen, there's nothing that definitively shows that, 439 00:27:46,600 --> 00:27:50,760 Speaker 1: or at least nothing that anyone has publicly acknowledged, and 440 00:27:50,880 --> 00:27:54,520 Speaker 1: my guess is the investigation is probably ongoing.
The website 441 00:27:54,560 --> 00:27:56,800 Speaker 1: The Drive has an article that brings up a potential 442 00:27:56,840 --> 00:28:00,720 Speaker 1: hazard with autonomous vehicles that I hadn't really considered before, 443 00:28:00,920 --> 00:28:03,920 Speaker 1: which is silly because it's such an obvious use case 444 00:28:04,119 --> 00:28:06,960 Speaker 1: that I'm sure most of y'all are way ahead of me. 445 00:28:07,840 --> 00:28:10,040 Speaker 1: So this is really an oversight on my part, but 446 00:28:10,200 --> 00:28:13,560 Speaker 1: it's that the Ford Motor Company has recently been awarded 447 00:28:13,600 --> 00:28:19,160 Speaker 1: a patent regarding vehicle repossessions. So instead of sending Emilio 448 00:28:19,320 --> 00:28:22,359 Speaker 1: Estevez to repossess a car after the owner falls behind 449 00:28:22,480 --> 00:28:26,200 Speaker 1: on their payments, shout out to anyone who recognizes that reference, 450 00:28:26,800 --> 00:28:30,840 Speaker 1: Ford is suggesting that future vehicles that are outfitted with 451 00:28:30,960 --> 00:28:35,320 Speaker 1: autonomous operation features would just drive themselves to a location 452 00:28:35,480 --> 00:28:37,760 Speaker 1: where a tow truck could meet up with it, or 453 00:28:37,760 --> 00:28:41,080 Speaker 1: it would go straight to the repossession agency or maybe 454 00:28:41,080 --> 00:28:43,680 Speaker 1: even a junkyard. This would save the people who are 455 00:28:43,800 --> 00:28:47,200 Speaker 1: driving tow trucks the potentially dangerous job of going to 456 00:28:47,240 --> 00:28:50,880 Speaker 1: an owner's property to repossess a vehicle. So, in other words, 457 00:28:50,880 --> 00:28:55,800 Speaker 1: a car would effectively repossess itself. Ford's patent also describes 458 00:28:55,840 --> 00:28:59,680 Speaker 1: features for cars that would not necessarily have autonomous capabilities: 459 00:29:00,160 --> 00:29:02,800 Speaker 1: that Ford would be able to shut down certain options 460 00:29:02,800 --> 00:29:05,600 Speaker 1: within the car remotely, some of them not even being options, 461 00:29:05,600 --> 00:29:08,040 Speaker 1: some of them just being outright features. So things like 462 00:29:08,600 --> 00:29:14,800 Speaker 1: power locks or cruise control or air conditioning, or even 463 00:29:14,880 --> 00:29:19,400 Speaker 1: disabling the engine itself, rendering the car inert. The patent 464 00:29:19,440 --> 00:29:21,800 Speaker 1: describes the process by which an owner would be alerted 465 00:29:21,800 --> 00:29:24,560 Speaker 1: in advance, which would give the owner the opportunity to 466 00:29:24,600 --> 00:29:28,440 Speaker 1: make good on payments.
Otherwise, well, a car might start 467 00:29:28,480 --> 00:29:31,760 Speaker 1: to lose all those features, or eventually even drive itself 468 00:29:31,840 --> 00:29:35,440 Speaker 1: to the repossession agency, or, like I said, in cases 469 00:29:35,480 --> 00:29:39,280 Speaker 1: where repossession would be viewed as being too expensive, like 470 00:29:39,320 --> 00:29:42,440 Speaker 1: a bank would say, oh, it doesn't make sense financially 471 00:29:42,840 --> 00:29:45,479 Speaker 1: for us to repossess this vehicle, they might just have 472 00:29:45,520 --> 00:29:48,320 Speaker 1: the car drive itself to the junkyard, which gets kind 473 00:29:48,320 --> 00:29:52,120 Speaker 1: of sinister when you think about it, right, because a 474 00:29:52,280 --> 00:29:56,680 Speaker 1: car autonomously driving itself to a junkyard for it to 475 00:29:57,440 --> 00:30:02,760 Speaker 1: presumably be junked. That's grim stuff. Pixar could have a 476 00:30:02,840 --> 00:30:07,680 Speaker 1: field day with that concept. And now for a horrifying 477 00:30:07,760 --> 00:30:12,960 Speaker 1: and infuriating story involving a car company fully embracing a 478 00:30:13,080 --> 00:30:18,160 Speaker 1: dystopic philosophy, or rather a third party that works with 479 00:30:18,240 --> 00:30:21,240 Speaker 1: a famous car company doing so. A sheriff's office in 480 00:30:21,360 --> 00:30:28,280 Speaker 1: Illinois encountered unthinkable resistance from Volkswagen's Car-Net service while 481 00:30:28,320 --> 00:30:31,920 Speaker 1: trying to track down a stolen VW vehicle that had 482 00:30:31,960 --> 00:30:36,720 Speaker 1: a two-year-old boy inside it. So the story goes, 483 00:30:36,760 --> 00:30:40,560 Speaker 1: this mom drives home with her two kids and she 484 00:30:40,720 --> 00:30:44,520 Speaker 1: pulls up in her driveway, gets out, takes one kid inside, 485 00:30:44,640 --> 00:30:48,640 Speaker 1: comes back out to retrieve her second kid. But meanwhile, a 486 00:30:48,680 --> 00:30:52,840 Speaker 1: group of car thieves had driven up behind her vehicle. 487 00:30:53,600 --> 00:30:57,920 Speaker 1: They ended up beating up the woman, stealing her car, 488 00:30:58,000 --> 00:31:01,360 Speaker 1: running her over, and driving off with her two year 489 00:31:01,360 --> 00:31:04,360 Speaker 1: old son in the car. She was able to call 490 00:31:04,600 --> 00:31:08,640 Speaker 1: nine one one and report the car and her child 491 00:31:09,160 --> 00:31:13,520 Speaker 1: having been taken by these thieves and get medical attention 492 00:31:13,560 --> 00:31:19,000 Speaker 1: as well. So anyway, the sheriff calls Car-Net, because Car-Net 493 00:31:19,080 --> 00:31:24,480 Speaker 1: is a service that allows Volkswagen, really Car-Net itself, to 494 00:31:24,640 --> 00:31:28,800 Speaker 1: remotely track and even control vehicles to an extent. So 495 00:31:29,240 --> 00:31:32,360 Speaker 1: the detectives are like, we need to know the location 496 00:31:32,400 --> 00:31:38,840 Speaker 1: of this vehicle right away, and the representative from Car-Net says, well, 497 00:31:39,320 --> 00:31:42,000 Speaker 1: she let that subscription lapse, so I'm going to need 498 00:31:42,040 --> 00:31:44,920 Speaker 1: a one hundred and fifty dollar reactivation fee before I can 499 00:31:45,000 --> 00:31:50,800 Speaker 1: give you that information.
A boy's life was hanging in 500 00:31:50,880 --> 00:31:55,720 Speaker 1: the balance, and this representative for Car-Net is like, can't 501 00:31:55,760 --> 00:31:57,680 Speaker 1: give you that info till you cough up the fee. 502 00:31:58,280 --> 00:32:02,000 Speaker 1: The detective actually did pay the one hundred fifty dollars 503 00:32:02,040 --> 00:32:05,840 Speaker 1: because the detective was aware that a boy's life is 504 00:32:05,840 --> 00:32:08,800 Speaker 1: worth more than one hundred and fifty dollars. It is 505 00:32:08,840 --> 00:32:14,000 Speaker 1: taking everything in me not to swear during this news item. 506 00:32:14,840 --> 00:32:20,320 Speaker 1: It is so unthinkably awful. The detective then, of course, 507 00:32:20,400 --> 00:32:25,080 Speaker 1: posted about this incident on Facebook. Volkswagen has responded by 508 00:32:25,120 --> 00:32:29,520 Speaker 1: calling it a quote unquote serious breach of its process 509 00:32:29,560 --> 00:32:32,720 Speaker 1: for how it works with law enforcement. And again, to 510 00:32:32,760 --> 00:32:36,000 Speaker 1: be clear, Car-Net is a third-party service that partners 511 00:32:36,000 --> 00:32:43,600 Speaker 1: with Volkswagen, so ultimately Car-Net is responsible for this horrible incident, 512 00:32:43,840 --> 00:32:47,000 Speaker 1: but Volkswagen shares some of the blame as well. As 513 00:32:47,000 --> 00:32:50,240 Speaker 1: for the child, I am happy to report that the 514 00:32:50,320 --> 00:32:53,840 Speaker 1: child was found safe. A witness saw thieves pull into 515 00:32:53,880 --> 00:32:56,720 Speaker 1: a parking lot. They took the kid out of the car, 516 00:32:57,040 --> 00:33:00,160 Speaker 1: then they drove off, and this witness was able to 517 00:33:00,240 --> 00:33:03,600 Speaker 1: rescue the kid before he could wander into traffic. The 518 00:33:03,680 --> 00:33:07,960 Speaker 1: police subsequently found the woman's Volkswagen. The woman, who was 519 00:33:08,040 --> 00:33:10,800 Speaker 1: seriously injured as she tried to rescue her son during 520 00:33:10,840 --> 00:33:14,040 Speaker 1: the theft, is currently recovering in the hospital, and I 521 00:33:14,080 --> 00:33:17,320 Speaker 1: think Car-Net has a really long way to go to 522 00:33:17,480 --> 00:33:26,480 Speaker 1: atone for this. This was unspeakably inhumane. Finally, on a 523 00:33:26,760 --> 00:33:31,680 Speaker 1: brighter side, the Competition and Markets Authority, or the CMA, this 524 00:33:31,760 --> 00:33:34,680 Speaker 1: is an antitrust kind of organization in the UK. This 525 00:33:34,800 --> 00:33:37,520 Speaker 1: is one of those organizations that looks to make sure 526 00:33:37,600 --> 00:33:42,600 Speaker 1: that the marketplace remains fair and competitive.
CMA has said 527 00:33:42,640 --> 00:33:48,560 Speaker 1: that third parties indicate consoles could be moving away from 528 00:33:48,880 --> 00:33:52,400 Speaker 1: the eight-to-ten-year cycle that we're familiar with, right, 529 00:33:52,440 --> 00:33:56,320 Speaker 1: that typically there's around eight to ten years between generations 530 00:33:56,720 --> 00:33:59,880 Speaker 1: of consoles, and that we might see them move to 531 00:34:00,160 --> 00:34:04,120 Speaker 1: three-to-four-year cycles instead. So every three to 532 00:34:04,160 --> 00:34:06,920 Speaker 1: four years you would have a new version of, say, 533 00:34:07,360 --> 00:34:11,399 Speaker 1: Xbox or PlayStation, and that concerns me a little bit, 534 00:34:11,880 --> 00:34:15,000 Speaker 1: simply because of the economic side of things. Like, it 535 00:34:15,000 --> 00:34:18,320 Speaker 1: could be really exciting to people who are thinking, oh, 536 00:34:18,360 --> 00:34:20,759 Speaker 1: every three or four years, I'm going to get a 537 00:34:20,880 --> 00:34:27,120 Speaker 1: chance to buy a new console with new components and better 538 00:34:27,200 --> 00:34:29,360 Speaker 1: features and that sort of thing, and that is exciting. 539 00:34:30,239 --> 00:34:34,359 Speaker 1: The thing that concerns me, however, is that currently the 540 00:34:34,400 --> 00:34:39,239 Speaker 1: way companies like Microsoft and Sony typically market their consoles 541 00:34:39,840 --> 00:34:43,000 Speaker 1: is they sell them at cost or sometimes even at 542 00:34:43,040 --> 00:34:46,319 Speaker 1: a loss. And the reasoning behind it is that you 543 00:34:46,400 --> 00:34:49,560 Speaker 1: go out, you buy your console, and then you end 544 00:34:49,640 --> 00:34:53,200 Speaker 1: up spending money on games and services, and that's how 545 00:34:53,200 --> 00:34:56,040 Speaker 1: companies like Microsoft and Sony end up seeing a profit 546 00:34:56,120 --> 00:34:58,759 Speaker 1: from those sales. It's not from the hardware, where they're 547 00:34:58,760 --> 00:35:02,560 Speaker 1: taking a loss, but it's from the use of that hardware. Well, 548 00:35:02,600 --> 00:35:07,680 Speaker 1: that use is stretched over a decade essentially, or eight years. 549 00:35:07,680 --> 00:35:10,439 Speaker 1: That's a long tail for you to be able 550 00:35:10,480 --> 00:35:14,120 Speaker 1: to make your profit off of these pieces of hardware. 551 00:35:14,600 --> 00:35:16,840 Speaker 1: If that gets reduced down to three to four years, 552 00:35:17,239 --> 00:35:19,680 Speaker 1: then we're probably also looking at a future where these 553 00:35:19,719 --> 00:35:24,480 Speaker 1: consoles are going to cost more, because they're not going 554 00:35:24,520 --> 00:35:26,600 Speaker 1: to be as willing to take a big loss on 555 00:35:26,640 --> 00:35:29,000 Speaker 1: the hardware sales because there won't be as much time 556 00:35:29,680 --> 00:35:33,040 Speaker 1: to recapture those costs over the lifespan of the consoles. 557 00:35:33,040 --> 00:35:36,719 Speaker 1: If people are changing every three to four years, then 558 00:35:37,239 --> 00:35:41,520 Speaker 1: they're not necessarily, you know, realizing the profits 559 00:35:41,719 --> 00:35:44,880 Speaker 1: off a single console generation that they would with a 560 00:35:44,960 --> 00:35:49,000 Speaker 1: longer development cycle.
So my guess is that such a 561 00:35:49,040 --> 00:35:53,440 Speaker 1: future would see consoles being more expensive, or that at least 562 00:35:53,440 --> 00:35:56,880 Speaker 1: you'd be looking at the companies moving away from selling 563 00:35:56,880 --> 00:35:59,719 Speaker 1: them at a loss. Maybe they would continue to sell 564 00:35:59,760 --> 00:36:02,560 Speaker 1: them at cost, but I would guess they would choose 565 00:36:02,560 --> 00:36:05,920 Speaker 1: a way to see better profit margins off the hardware sales, 566 00:36:06,480 --> 00:36:09,680 Speaker 1: because otherwise they're leaving money on the table. It doesn't 567 00:36:09,840 --> 00:36:13,920 Speaker 1: make sense to me otherwise. This is just my assumption. 568 00:36:14,640 --> 00:36:17,880 Speaker 1: I don't know that for sure. Also, this is based 569 00:36:17,920 --> 00:36:22,280 Speaker 1: off the CMA citing third-party reports. This isn't coming 570 00:36:22,320 --> 00:36:26,719 Speaker 1: from Microsoft or Sony. So until we start seeing those 571 00:36:26,719 --> 00:36:30,279 Speaker 1: announcements come from the actual companies, we could say that 572 00:36:30,320 --> 00:36:32,680 Speaker 1: this is just a rumor, but it's one that makes 573 00:36:32,719 --> 00:36:34,960 Speaker 1: me think that, if that were to come to pass, 574 00:36:35,080 --> 00:36:38,120 Speaker 1: we would see more expensive consoles in the future. 575 00:36:38,960 --> 00:36:41,160 Speaker 1: That's just my feeling, my gut feeling on the matter. 576 00:36:41,360 --> 00:36:43,440 Speaker 1: I don't have anything to base that off of, except, 577 00:36:44,000 --> 00:36:46,640 Speaker 1: you know, thinking it through, and I could be totally 578 00:36:46,680 --> 00:36:49,839 Speaker 1: wrong on this. All right, that's it for the tech 579 00:36:49,920 --> 00:36:53,120 Speaker 1: news for Tuesday, February twenty eighth, twenty twenty three. I 580 00:36:53,160 --> 00:36:55,520 Speaker 1: hope you are all well. If you have suggestions for 581 00:36:55,600 --> 00:36:59,680 Speaker 1: stuff for me to talk about on this show, well, 582 00:37:00,000 --> 00:37:01,279 Speaker 1: there are a couple of ways of reaching out to me. 583 00:37:01,360 --> 00:37:05,000 Speaker 1: One is to go to Twitter and to tweet at 584 00:37:05,160 --> 00:37:08,560 Speaker 1: TechStuff HSW. That's the Twitter handle for the show. 585 00:37:08,800 --> 00:37:10,560 Speaker 1: Let me know what you would like to hear, or 586 00:37:10,840 --> 00:37:13,240 Speaker 1: you can download the iHeartRadio app. It's free to download, 587 00:37:13,239 --> 00:37:15,160 Speaker 1: free to use. You can navigate over to TechStuff 588 00:37:15,160 --> 00:37:17,280 Speaker 1: by putting it in the search field. Hit that little 589 00:37:17,320 --> 00:37:19,440 Speaker 1: microphone button and that'll let you leave a voice message 590 00:37:19,480 --> 00:37:22,160 Speaker 1: up to thirty seconds in length, and you can talk 591 00:37:22,200 --> 00:37:27,920 Speaker 1: to me, Goose. Okay, that's enough references to the eighties 592 00:37:28,040 --> 00:37:31,160 Speaker 1: in this show. Golly gee willikers, you can tell I'm 593 00:37:31,200 --> 00:37:34,160 Speaker 1: getting old. All right. I hope you are all well, 594 00:37:34,520 --> 00:37:43,640 Speaker 1: and I'll talk to you again really soon. TechStuff 595 00:37:43,719 --> 00:37:48,279 Speaker 1: is an iHeartRadio production.
For more podcasts from iHeartRadio, visit 596 00:37:48,280 --> 00:37:51,840 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 597 00:37:51,880 --> 00:37:52,800 Speaker 1: your favorite shows.