Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and love all things tech. It's time for the tech news for Thursday, November fourth, twenty twenty-one, and we're gonna start off with kind of a funny story. So on Tuesday, we talked about how Facebook, the company, has adopted the name Meta as a rebrand for the overall corporation. So Facebook, the social media platform, is just one part of that business, and it's not changing its name. But now it turns out that another company disputes Facebook's right to the Meta name. Meta PC, a company that's been around for about a year, applied for a trademark for the name Meta back this past August, which was obviously a couple of months before rumors even popped up that Facebook was going to rebrand. However, the company has not yet actually received a trademark from the trademark office. Assuming there are no other conflicts, you would think that the PC company would get the trademark due to being what we call in the biz "firsties." If they do get the trademark, well, then Facebook would not be allowed to just adopt the name without getting permission first. Now see, Facebook wouldn't necessarily have known about this issue, because the trademark had not yet been registered. So I don't think the company was bullishly trying to take over the Meta name from Meta PC. I just don't think they even realized Meta PC was a thing, and they probably assumed there were no impediments in the way of getting a new name. I mean, if there was no registered trademark, then it looks like there's no foul, right? But the owners of Meta PC, Zack Shutt and Joe Darger, say they're willing to part with the rights to the name Meta for a cool twenty million dollars.
Speaker 1: And they also tweeted a Photoshopped picture of Mark Zuckerberg holding a Meta PC in his hands, and they said that if they do sell the name, they'll make sure to pick something new that suits their PC company, and maybe they'll just go with the name Facebook, seeing as that one seems like it's about to become available. Which is, you know, kind of a funny, tongue-in-cheek response. Whether this actually goes anywhere largely depends upon the trademark office and whether or not they allow Meta PC to have Meta registered as a trademark. If the trademark office denies that, well, then Meta PC really doesn't have any leverage. However, it would also presumably mean that Facebook slash Meta would have issues registering the trademark, too. Otherwise the office is really playing favorites. So we'll have to see how this all turns out. And now for less whimsical Facebook news, because of course there's gonna be some. First up, Facebook, the social network site, is giving up on facial recognition technologies, and that's a good thing. I mean, we've talked a lot about how facial recognition technology is often flawed in its implementation, and that frequently we see the tech tends to work pretty well for certain ethnicities and not so great for others, misidentifying people, you know, you get a lot of false positives, that kind of thing. And we also know that this technology can lead to serious problems, such as certain populations being disproportionately harmed by it. You know, if police forces are reliant upon a technology that routinely makes mistakes when it comes to identifying people of certain ethnicities, that's a real problem. So Facebook saying no to facial recognition algorithms, that's a good thing. What's less good is that Meta, the parent company of Facebook, is totally not making that same commitment.
Speaker 1: Meta plans to continue to rely on facial recognition technology as it builds out its metaverse products, and I get the feeling that Zuckerberg is really pushing for the metaverse stuff to be the future of the company. And considering the fact that Facebook is having trouble attracting young users, I think all signs kind of point that way. So while I'm glad to see Facebook move away from facial recognition, I worry that we're really just putting that problem on hold. Then again, I have no idea how popular this metaverse concept is going to be. If the metaverse is dependent upon users having access to expensive hardware, it may be that only a subset of people will ever really have access to it. Anyway, I think it's clear that Facebook slash Meta doesn't really view facial recognition tech as being a problem, or at least not a deal-breaking one. A Facebook investor named Roger McNamee addressed an audience at the Web Summit event and recommended that governments conduct criminal investigations into Facebook, even calling for executives to face jail time should they be found to be responsible for crimes. McNamee was an early investor in Facebook and held onto that stock through to two thousand nineteen, when he began selling it off, and he has criticized Facebook multiple times in the past. He outlined four areas he believes deserve criminal investigation, and he also suggested that he has two others on top of those four, bringing the total up to six. And they range from business issues, like Facebook allegedly failing to disclose information to the SEC even though, as a publicly traded company, it is obligated to do that, all the way up to charges that Facebook was potentially complicit in the political riot that happened on January sixth, which, holy cow, that's January sixth of this year. I mean, my ability to remember dates in my own lifetime is notoriously bad. Like, I might be like, well, that happened a year ago, and it really happened six years ago.
Speaker 1: But let me tell you, the pandemic has made it nearly impossible for me to keep things straight anyway. This is more a sign that there's growing opposition to Meta slash Facebook on multiple fronts, and we're not done yet. The Washington Post reports that Mark Zuckerberg himself approved a request, which was more like a demand, from the government of Vietnam, which wanted Facebook to take down posts that were critical of the Communist Party in charge in Vietnam. The government said that if Facebook didn't comply, then the government would forbid Facebook in Vietnam, and Zuckerberg reportedly caved in to the demands and authorized the removal of thousands of posts. And the Washington Post says that Facebook's response, when asked about this, was essentially, you know, we did it, but if we hadn't done it, then Facebook would have gone away in Vietnam, and that would have been even worse censorship. That would have meant people were cut off from a tool they were depending upon. And that's a difficult argument for me. I mean, I get what they're saying, but at the same time, this is predicated on the idea that Facebook is providing a service. But arguably you could say that service is to be a PR arm for the Vietnamese government, at least as far as silencing any dissenting voices goes, so... I don't know. That argument doesn't work so well for me. The Washington Post also says that the Vietnamese government was effectively using Facebook to track down activists and critics of the government, claiming that even a post that levied just the tiniest amount of criticism toward the Communist Party could result in jail time for the user, which is a big yikes. Let's shift over to Google. The company is once again throwing its corporate hat in the ring for defense contracts.
Speaker 1: Now, a few years ago, Google was participating in the development of Project Maven, in fact it essentially won out in a bidding war for it, and Project Maven had a lot of controversial stuff involved in it, including things that could potentially be used in drone programs. This in turn led Google employees to stage a walkout in protest at Google. They didn't really like the idea of their work being used to help war efforts. Now Google has bid on the Joint Warfighting Cloud Capability contract that the Defense Department has put up. This project is a successor to a previously abandoned program, which was the Joint Enterprise Defense Infrastructure, or JEDI. Now, you might remember that JEDI saw a bidding war, and it really came down to Microsoft and Amazon, and Microsoft ultimately won the bid and was selected by the Department of Defense. But then Amazon sued over this particular agreement. They challenged the contract and claimed that then-President Trump had interfered in the whole process, and that because Trump had beef with Amazon and Jeff Bezos, Trump effectively scuttled Amazon's bid and forced the Defense Department to go with Microsoft. Anyway, as a result of all that, that particular project fell through, and now Google is competing to win a place at the table for the new project. So JEDI was a kind of winner-takes-all approach, with one company winning the full bid, but the Joint Warfighting project will actually see the Defense Department work with multiple companies, so it's not, you know, a winner-take-all solution.
Speaker 1: As for what Google would do should it become part of this project, it sounds like the main components for Google would be to provide things like cloud storage and cloud computing services, mostly to be used in non-combat-oriented applications, such as, you know, monitoring the developing situation with the pandemic and making analyses of that, perhaps predicting where the pandemic might develop in the future, as well as things like climate change. So stuff that's not directly tied to combat activities, at least based upon the initial descriptions. No word yet on whether Google employees are fine with this. Earlier this week, I talked about a crypto scam that referenced Squid Game. Well, now The Verge reports on a different scam involving cryptocurrency. This scam plays on some old phishing tactics, that's phishing with a P-H, and here's how it works. When you come into possession of cryptocurrency, you know, whether you earn it or you buy some or you're gifted it or whatever, that cryptocurrency needs to, quote unquote, live somewhere. It has to be stored somewhere. It's code, but it has to have a storage space. So one way to store it is in a digital wallet that only the owner has access to. Effectively, this is just a piece of software, and that software will live on a computer or other device, and if you were to lose access to that device, you would also lose the digital wallet, because it's the software that's on that computer. It's on that physical device. This is why you will occasionally hear those rough stories about someone who's no longer able to access a hard drive that might have, like, a hundred bitcoin on it or whatever. Anyway, the digital wallet is a way for you to hold cryptocurrency so that you can use it in transactions.
Speaker 1: Well, the scam involves creating mock-ups of well-known digital wallet companies, so copying them, you know, and these are companies that have a pretty decent reputation, or at least not a bad reputation. The scam artists make a copy of those sites to look as realistic as possible, and then they buy up ads on Google so that they rank in Google search results. So when you search for a digital wallet, the ad results pop up above everything else. I mean, you know how Google searches work. If you search for something, the ad-supported results are the first things to pop up. So on a casual glance, you might just say, oh, I'll just pick the first one. I need a digital wallet, I'm going to pick the first one, that's probably the best one. And you wouldn't even know that this was a scam that was using the ads in order to get this sort of placement. So then you go to the scam site, and the scammers might try to just, you know, get as much of your personal information as possible, gleaning stuff like your bank account or your credit card number, you know, your typical phishing attack strategy. Or they might be even more sneaky. They might have you go through the process of creating a new digital wallet. But instead of actually creating a new wallet, what the scammers do is assign you an existing digital wallet that belongs to the scammers, so it looks like it's yours. They give you access to it, but they control the wallet. So when you spend some money and purchase cryptocurrency, or you otherwise transfer cryptocurrency that you own into this wallet, you are effectively stuffing the scam artists' wallet full of your own cash. The scheme works because Google Ads acts like a shortcut. It's cutting in line.
Speaker 1: So by creating a convincing fake and then buying out ad space, the scam artists have their bogus site appear above the real one. Security experts recommend that you make sure you scroll down below the ad results on Google Search for that very reason, because you can't be sure that the advertised sites are legit, and Google seems unable or unwilling to put in the work to protect consumers, which is pretty ugly stuff. We've got more stories to cover after we take this quick break.

Speaker 1: We're back. All right. At the COP26 Climate Summit, more than forty countries have committed to transitioning away from coal-fired power plants. The countries that have large, developed economies plan on making that transition much earlier, which is fairly aggressive. Developing economies are looking more at the twenty-forties, which, you know, makes sense, because they're not in a position to transition away as easily as those that have really big economies. The countries that agreed included ones like Vietnam, Indonesia, Ukraine, Canada, Poland, and several more. But there were also some really notable absences from that list, like the United States, for example, or India or China, you know, really big countries that still depend heavily on coal-fired power plants. So there's still a lot of work that needs to be done, and unfortunately a lot of that work depends on countries that are, you know, responsible for a lot of coal consumption but haven't yet committed to changing that. On a related note, many countries and financial institutions agreed to end overseas financing for fossil fuel projects, and the United States was among those. So the US is saying, yeah, we won't help fund overseas fossil fuel projects, but, um, back off of us on our own home turf, I guess. The Australian government has issued a demand to Clearview AI, famous for its facial recognition database services.
Speaker 1: The company is to destroy all images and facial templates related to Australia's citizens, because the government has determined that Clearview's business violates Australia's privacy laws. So, for a refresher, one way that Clearview has built out its massive facial recognition databases is that it scraped social networking sites, using programs to collect and analyze images that were publicly posted on platforms like Facebook. They built out databases using those images, and that lets Clearview train machine learning systems to match new images against those databases, and Clearview markets this to governments and police forces around the world. Clearview plans to appeal the decision of the Australian court system, saying that the images it uses were published in the United States, since Facebook is an American company, and therefore Australia doesn't have jurisdiction. And Clearview also claims that because people were posting to public profiles, they have no right to privacy, which is a big old oof, so we'll have to see how that goes. The startup company TuSimple, that's T-U Simple, which is designing self-driving transportation trucks, plans to test its autonomous vehicles on the roads without a human safety operator before twenty twenty-two, which is right around the corner, so any day now. The company plans to unleash driverless trucks, with no human safety operator in there, for the eighty-mile run between the cities of Phoenix and Tucson, Arizona. The trucks will travel down public roads to do this, so there will be people in regular old cars on the roads at the same time. The company has said it plans to conduct multiple runs over several weeks to test the technology, and it acknowledges that this is a challenging problem. You can design a system to handle known scenarios pretty well, but preparing for the unknown is a totally different matter.
Speaker 1: I'm sure a lot of you out there have been in a car, you know, traveling in a car when something unexpected happened, and that can be a really intense and scary thing for humans in many cases. But at least we are pretty good at assessing things quickly and making a decision. We don't always make the right decision, but, you know, we can extrapolate a lot of stuff based upon our experience and make judgments about what to do. For computer systems that encounter something new, there's no experience to draw upon, and they're not very good at associative thinking. They can't really say, well, I've never seen this, but I've seen something like this, and I think this is the right way to handle it. They can't really do that very well. The machine still has to make a decision, and it may not have anything solid to guide that decision. So let me give you a very simple example. Let's say the vehicle is traveling at night, and there's a puddle across the road, and the headlights of the vehicle hit the puddle and reflect off of it. Now, a human would recognize that as a reflection. They might slow down so they can go through the puddle without, like, hydroplaning or something, but otherwise they know what they're looking at. But a machine could theoretically misinterpret that reflection and see it as an obstacle in the road, so the machine might try to swerve out of the way or slam on the brakes. Now, that scenario I just gave is one that I'm sure all autonomous vehicle companies have anticipated and worked on. It's not like it's so out of the ordinary that no one would have thought of it. I'm sure that's factored in. But my point is that machines don't magically know a real risk from something that isn't a risk at all. That being said, TuSimple is taking this process seriously. The company has been limiting the human-free tests so far to a dedicated track.
Speaker 1: So, so far, any time they've run a test where there has been no human in the truck, they've only done it on a dedicated track that doesn't connect to public roads, and for the moment, on the public road tests, they still have operators riding the route between the two cities. And because TuSimple is using this established route, like, it's not open-ended, right? It's not saying you're going from here to, let's pick a city, Atlanta, and just find the best route. They're not doing that. They're saying go from here to here, and this is the path that you should take. That means the company has been limiting the variables, right? They have this established route that creates a more knowable course. You can still have unexpected things happen, but you've cut way back on those variables, and it allows us to continue to build toward a future where autonomous vehicles are a viable solution. So while I have some reservations about autonomous trucks, I do think that the process TuSimple has laid out is one with the appropriate amount of caution and accountability. Okay, geeky news alert. The following news item is extremely geeky. A startup out of Australia called Q-CTRL, that's Q, then C-T-R-L, has created an error suppression technique that improves quantum algorithms by an astonishing two thousand... and yeah, I get it, like, that alone is effectively gobbledygook. So what the heck do I even mean by this? Let's start off by talking about quantum computers. When you boil down computer science with classical computers, you're talking about processing information in the form of bits, and a bit is a single unit of information. It can either be a zero or a one, and a zero is always a zero and a one is always a one. So you can think of it like a light switch with an off and an on.
Speaker 1: But quantum computers rely on qubits, and these, under certain conditions, can technically be both a zero and a one at the same time, plus all values in between. And when you take that and combine it with a properly designed quantum algorithm, you can potentially solve a certain subset of computational problems much faster than you could if you were to use a classical computer. So, for example, let's say I give you a really, really big number, it's hundreds of digits long, and I tell you I created this number by multiplying two prime numbers together. Which two prime numbers did I use? Well, then you would need to start trying out different prime numbers to see if they divide evenly into the big number I gave you, and then to make sure that the other number it produced was itself a prime number. And you'd be going, nope, it's not that one. Nope, it's not that one. Nope. I mean, it would take you ages, potentially centuries, to get to the right pair, depending on how big the number I gave you was. And that's how classical computers kind of tackle these problems. They sequentially go through all the possible answers to find the one that fits, and even a fast computer would take a very long time to get to that answer. But quantum computers can effectively make all the guesses at the same time, assuming, one, that the computer has enough qubits to do this, and two, that the algorithm you've designed and that the computer is following actually works. So all the pieces need to be there. It's not just the power of the quantum computer. It's also the quality of the algorithm you're using to try and solve a problem. But when all the pieces are there, the quantum computer will give solutions to those problems. I guess I should say we typically get solutions with a certain percentage of confidence behind them, kind of like, I'm sure this is the right answer.
Speaker 1: So really, we get answers in the form of probabilities rather than certainties when we're talking about quantum computers. Anyway, the startup says it has created a means of suppressing errors with quantum algorithms, which theoretically should make it easier to design quantum algorithms that can take advantage of quantum computing. Now, I can't pretend to have even a partial understanding of how they achieve this. I mean, the bits that I've told you about quantum computers so far, that's pretty much right around my level of understanding of quantum computers. It goes a little deeper than that, but not a whole lot. Like, once you start really getting into things like entanglement and superposition, things get a little too wibbly wobbly for me to be able to follow properly. But this is really cool. It means that quantum computers could potentially be used for a larger range of applications as we continue to build stronger quantum computers. And I first wrote How Quantum Computers Work back when we were talking about qubits on the order of, like, ten qubits per computer, and we're seeing that grow every single year, to a point where we could potentially do some really cool things with quantum computers and tackle some very difficult problems. It also, by the way, means that some of the principles behind modern-day encryption will have to be completely rethought, because a good quantum computer with a solid algorithm could potentially crack encryption in a fraction of the time it would take us using classical methods, which means that essentially, at that stage, everyone who has access to a quantum computer and one of these algorithms effectively has a skeleton key to all encrypted information everywhere. Obviously, that will change things dramatically, but we're not there yet. But this is the sort of thing that kind of sets us on that pathway.
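To ground the classical side of that prime-number example, here is a minimal sketch in Python of the brute-force trial-division approach described above. It is purely illustrative, with tiny made-up numbers, and it is not anything from the episode or from Q-CTRL; real cryptographic numbers are hundreds of digits long, which is exactly why this kind of grind takes so long on a classical machine.

```python
# Illustrative only: classical brute-force factoring of a semiprime by trial division.

def is_prime(n: int) -> bool:
    """Check primality by trial division; fine for small demo numbers."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True


def factor_semiprime(n: int):
    """Sequentially try divisors until we find primes p and q with p * q == n."""
    p = 2
    while p * p <= n:
        if n % p == 0 and is_prime(p) and is_prime(n // p):
            return p, n // p
        p += 1
    return None  # n wasn't the product of two primes


print(factor_semiprime(10403))  # (101, 103): instant here, hopeless at hundreds of digits
```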
Speaker 1: While we're on the topic of kind-of science fiction, I want to talk about a story published in Vice, and the headline says an ethical AI trained on Reddit posts said genocide is okay if it makes people happy. And I get it, that headline grabs your attention. But let's talk about some stuff. One method of machine learning involves feeding tons of samples to a computer model. We kind of talked about it with Clearview AI in this very episode. So the computer model's job is to sort through the data that's being fed to it and then to make some sort of decision based upon that data. Now, the example I always give is, imagine you've got, like, ten thousand photos, and some of those photos have coffee mugs in them and some of them don't. And you're trying to teach a computer what a coffee mug is, so you're feeding these images into it, and it gives you some results, and some of the things it says are right and some are wrong. It misses some coffee mugs and misidentifies other things as coffee mugs. So you tweak things, you repeat the training, and you do this over and over and over again. You might use a sample that has millions of data points in it, and you might run that test thousands of times in an effort to refine your computer model. No computer magically knows the answer to these things. It is this training process that's important. And in this case, we're talking about an AI called Ask Delphi, as in the Oracle of Delphi, and you're meant to ask it ethical questions and it gives you answers. Well, again, it has to be trained to do this, and it's very easy for these kinds of models to be trained improperly. So I wouldn't be at all surprised by this. This isn't a shocking thing to me. It's actually entirely expected, really. But I do think that the people who wrote the Vice article make some good points.
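To make that feed-examples, score-it, tweak-it, repeat loop concrete, here is a minimal sketch in Python. It is purely illustrative, uses synthetic feature vectors standing in for photos, and is not Clearview's or Delphi's actual code; it just shows why a trained model ends up reflecting whatever data and labels it was fed.

```python
# Illustrative only: a tiny "is there a coffee mug?" classifier trained by the
# feed-examples / score-it / tweak-it / repeat loop described above.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for photos: each "photo" is 8 made-up numeric features,
# labeled 1 (has a mug) or 0 (doesn't). Nothing here is real data.
X = rng.normal(size=(1000, 8))
hidden_rule = rng.normal(size=8)
y = (X @ hidden_rule > 0).astype(float)

w = np.zeros(8)          # the model's adjustable knobs, starting from "knows nothing"
learning_rate = 0.1

for _ in range(200):     # repeat the training pass many times
    guesses = 1.0 / (1.0 + np.exp(-(X @ w)))   # model's guess that each photo has a mug
    gradient = X.T @ (guesses - y) / len(y)    # how wrong it was, and in which direction
    w -= learning_rate * gradient              # tweak the knobs and go again

accuracy = ((guesses > 0.5) == y).mean()
print(f"labeled {accuracy:.1%} of the synthetic photos correctly")
# The catch: if the example data or labels are skewed, those skews get baked into w.
# That's the same worry, at much higher stakes, when the "labels" are ethical
# judgments scraped from Reddit.
```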
Speaker 1: The folks who wrote that Vice piece do say that maybe handing ethical judgments to an AI is not a great idea, because an AI is always going to be reliant upon the biases that taught that AI in the first place. That also means you probably shouldn't use AI to be in charge of any system that hinges on ethical judgments, which is a much larger-scale issue, right? It's one thing to ask an AI, hey, is it cool if I do this? It's another thing if you're talking about a system that, at some point or another, needs to make a call about whether something is ethical or not. You know, that starts to really bring in a lot of questions. I think the headline was just a bit sensational, but I think the piece was actually really valuable. And finally, Ridley Scott put it best: in space, no one can hear you toot. At least I think that's how that goes. Also, that's totally not true. If you happen to be in a spaceship that's got an atmosphere in it, and there are other people near you, and it's not too loud in the environment, they might hear you if you start cutting muffins. But I wanted to open the segment with that joke because it's about space tacos. Yeah, tacos in space. So our final story is that astronauts aboard the International Space Station grew a batch of Hatch chiles as part of their experiments aboard the ISS, and on Friday, astronaut Megan McArthur tweeted that she had made tacos using the space-grown chiles as one of the ingredients. Now, the other components all came up from Earth on various launches, so I don't have any exciting stories to tell you about space beef here, but this is really cool. Growing the chiles was an experiment all by itself, and astronauts conducted scientific observations before the fruits of their labor could become taco ingredients. I just thought that was a neat story to end on. And that's it for the news for Thursday, November fourth, twenty twenty-one.
Speaker 1: If you have suggestions for topics I should cover in future episodes of TechStuff, reach out and let me know. The best way to do that is on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.