1 00:00:04,480 --> 00:00:12,479 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,520 --> 00:00:16,400 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,440 --> 00:00:19,799 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:19,880 --> 00:00:23,800 Speaker 1: tech are you? It's time for the tech news for Friday, 5 00:00:23,920 --> 00:00:29,840 Speaker 1: October fourth, twenty twenty four. So this week, California Governor 6 00:00:29,960 --> 00:00:34,320 Speaker 1: Gavin Newsom vetoed a bill that was aimed at putting 7 00:00:34,400 --> 00:00:38,360 Speaker 1: some guardrails up on major AI companies in the state 8 00:00:38,360 --> 00:00:41,920 Speaker 1: of California. Newsom says that he vetoed it because he 9 00:00:41,960 --> 00:00:45,720 Speaker 1: felt the bill was too weak, that it wasn't addressing 10 00:00:46,280 --> 00:00:50,680 Speaker 1: certain elements of AI that he thought were really important, 11 00:00:50,720 --> 00:00:54,319 Speaker 1: and instead was kind of framing it the wrong way. Now, 12 00:00:54,400 --> 00:00:58,040 Speaker 1: tech journalist Casey Newton says in his tech newsletter, which 13 00:00:58,040 --> 00:01:01,600 Speaker 1: is well worth subscribing to, and I have no connection 14 00:01:01,800 --> 00:01:05,120 Speaker 1: to Casey, I just really like his work. Anyway, he 15 00:01:05,200 --> 00:01:09,520 Speaker 1: says that most folks he chatted to are not really 16 00:01:09,680 --> 00:01:15,639 Speaker 1: buying Gavin Newsom's explanation for why he vetoed the bill. 17 00:01:15,920 --> 00:01:19,440 Speaker 1: The bill would have mandated safety testing for AI models 18 00:01:19,440 --> 00:01:22,280 Speaker 1: that had cost more than one hundred million dollars to train, 19 00:01:22,720 --> 00:01:25,000 Speaker 1: so that would have covered the major players in the space, 20 00:01:25,120 --> 00:01:29,440 Speaker 1: most of which have been headquartered in California. So Newsom 21 00:01:29,440 --> 00:01:33,480 Speaker 1: has said, well, yeah, I mean these are really big models, 22 00:01:33,840 --> 00:01:37,399 Speaker 1: but what about smaller models that are trained to do 23 00:01:37,560 --> 00:01:42,080 Speaker 1: things that are inherently risky? This has no coverage for them. 24 00:01:42,200 --> 00:01:45,399 Speaker 1: So it does go deeper than that. In addition, the 25 00:01:45,440 --> 00:01:49,320 Speaker 1: bill would have allowed judges to levy punishments for cases 26 00:01:49,360 --> 00:01:52,800 Speaker 1: in which AI was found to have caused harm. You 27 00:01:52,880 --> 00:01:54,800 Speaker 1: might think, well, of course they should be able to 28 00:01:54,800 --> 00:01:58,640 Speaker 1: do that, but until it becomes established law, you can't 29 00:01:58,720 --> 00:02:02,640 Speaker 1: just hand out punishment. If there's no law covering what 30 00:02:02,800 --> 00:02:05,760 Speaker 1: is at least being perceived to be a crime, then 31 00:02:05,800 --> 00:02:08,360 Speaker 1: that's an issue. Casey
Newton does go on to 32 00:02:08,400 --> 00:02:12,200 Speaker 1: say that in recent months Gavin Newsom, Newsom and Newton 33 00:02:12,960 --> 00:02:16,640 Speaker 1: a little too close anyway, Gavin Newsom has signed eighteen 34 00:02:16,880 --> 00:02:22,080 Speaker 1: other AI bills into law, so that is significant, right, Like, yeah, 35 00:02:22,120 --> 00:02:25,440 Speaker 1: this one he vetoed, but he has signed more than 36 00:02:25,440 --> 00:02:29,200 Speaker 1: a dozen regulations into law. Though some of those laws 37 00:02:29,440 --> 00:02:32,000 Speaker 1: AI companies are going to have like a full year 38 00:02:32,240 --> 00:02:35,880 Speaker 1: grace period before they go into effect because they are 39 00:02:35,919 --> 00:02:41,440 Speaker 1: not going to become actual effective laws until twenty twenty six. 40 00:02:41,840 --> 00:02:46,280 Speaker 1: These new laws carry out that groundwork I was talking about. 41 00:02:46,320 --> 00:02:49,680 Speaker 1: They create the legal pathways for victims to pursue with 42 00:02:49,760 --> 00:02:53,480 Speaker 1: regard to AI related abuse. So, for example, if someone 43 00:02:53,520 --> 00:02:57,520 Speaker 1: were to use AI to generate nude images of you, 44 00:02:58,400 --> 00:03:01,359 Speaker 1: and then they demand money from you, saying, hey, we're 45 00:03:01,360 --> 00:03:04,000 Speaker 1: going to release these pictures and they look real and 46 00:03:04,040 --> 00:03:07,400 Speaker 1: they are of you, and you're naked. So unless you 47 00:03:07,440 --> 00:03:10,320 Speaker 1: pay us, we're going to flood the Internet with these things. 48 00:03:10,360 --> 00:03:12,639 Speaker 1: Well that would now be a crime. And again you 49 00:03:12,639 --> 00:03:14,280 Speaker 1: would think, well, that sounds like that should be a 50 00:03:14,320 --> 00:03:17,440 Speaker 1: crime already it's blackmail, and yes it is, but the 51 00:03:17,480 --> 00:03:20,960 Speaker 1: AI generated element of it, you know, a lawyer could 52 00:03:20,960 --> 00:03:23,360 Speaker 1: have argued, well, this isn't really a picture of you. 53 00:03:24,160 --> 00:03:28,079 Speaker 1: It wasn't taken of you, it's generated by AI. It's 54 00:03:28,160 --> 00:03:31,440 Speaker 1: not really a picture of you, and that creates this 55 00:03:31,440 --> 00:03:33,680 Speaker 1: this wiggle room. So that's what these laws were meant 56 00:03:33,720 --> 00:03:36,680 Speaker 1: to do, is to kind of eliminate those gaps. So 57 00:03:36,840 --> 00:03:39,640 Speaker 1: really it's all about, you know, filling in the blanks 58 00:03:40,080 --> 00:03:43,080 Speaker 1: for stuff that's clearly wrong but until recently wasn't actually 59 00:03:43,160 --> 00:03:46,320 Speaker 1: defined as a criminal act. Amazon got a bit of 60 00:03:46,360 --> 00:03:50,600 Speaker 1: a win in court this week, so the company saw 61 00:03:50,640 --> 00:03:54,640 Speaker 1: a partial dismissal of a lawsuit that the United States 62 00:03:54,760 --> 00:03:59,440 Speaker 1: Federal Trade Commission, or FTC, has brought against it. 
So 63 00:03:59,480 --> 00:04:01,840 Speaker 1: the heart of the matter is that the FTC has 64 00:04:01,880 --> 00:04:06,600 Speaker 1: claimed Amazon has engaged in anti competitive behaviors in order 65 00:04:06,640 --> 00:04:11,880 Speaker 1: to acquire and hold a dominant position in online marketplaces, 66 00:04:12,120 --> 00:04:14,960 Speaker 1: and Amazon asked for a dismissal of the court case, 67 00:04:15,080 --> 00:04:18,479 Speaker 1: arguing that the FTC did not have sufficient evidence to 68 00:04:18,560 --> 00:04:22,680 Speaker 1: show that the company was actually harming consumers, because I mean, 69 00:04:22,720 --> 00:04:26,560 Speaker 1: that's ultimately what anti competitive stuff is supposed to be about, 70 00:04:26,720 --> 00:04:30,800 Speaker 1: is that the reduction of competition in the market hurts consumers. 71 00:04:30,839 --> 00:04:34,160 Speaker 1: You know, competition is good for consumers. The judge has 72 00:04:34,200 --> 00:04:37,840 Speaker 1: dismissed at least some of the FTC's charges, but not 73 00:04:38,000 --> 00:04:40,680 Speaker 1: all of them, and I don't know which ones because 74 00:04:40,720 --> 00:04:44,320 Speaker 1: the judge issued this as a sealed ruling, so it's 75 00:04:44,320 --> 00:04:47,760 Speaker 1: not public information about which elements of the lawsuit have 76 00:04:47,800 --> 00:04:51,320 Speaker 1: been dismissed and which ones remain. But the judge also 77 00:04:51,480 --> 00:04:55,799 Speaker 1: denied another one of Amazon's requests, which was to combine 78 00:04:55,920 --> 00:05:00,680 Speaker 1: both a trial and the FTC's proposed solution to this 79 00:05:01,000 --> 00:05:04,840 Speaker 1: alleged anti competitive issue in a single case. So the 80 00:05:04,920 --> 00:05:08,080 Speaker 1: judges ruled that that will instead be two separate cases. 81 00:05:08,080 --> 00:05:10,440 Speaker 1: So it suggests to me that there is at least 82 00:05:10,680 --> 00:05:15,320 Speaker 1: a sufficient amount of charges left in the FTC's case 83 00:05:15,360 --> 00:05:20,360 Speaker 1: against Amazon to necessitate that ruling. Right. So, don't know 84 00:05:20,760 --> 00:05:24,200 Speaker 1: how much of the case was dismissed, but Amazon gets 85 00:05:24,279 --> 00:05:26,520 Speaker 1: a little bit of a win there. It was a 86 00:05:26,600 --> 00:05:29,800 Speaker 1: huge week for open Ai this week. So over the 87 00:05:29,880 --> 00:05:33,760 Speaker 1: last few months, I've seen lots of different analysts sort 88 00:05:33,760 --> 00:05:36,159 Speaker 1: of projecting that open ai could find itself out of 89 00:05:36,279 --> 00:05:41,800 Speaker 1: money unless it had another significant round of investments, and 90 00:05:42,320 --> 00:05:46,719 Speaker 1: that round happened this week, and it was indeed significant. 91 00:05:47,040 --> 00:05:51,760 Speaker 1: Open Ai raised around six point six billion with a 92 00:05:51,880 --> 00:05:56,279 Speaker 1: B dollars in investments. About three quarters of a billion 93 00:05:56,400 --> 00:06:00,719 Speaker 1: dollars came from Microsoft, which continues to financially back open 94 00:06:00,760 --> 00:06:05,640 Speaker 1: Ai pretty enthusiastically. Now, this huge influx of cash has 95 00:06:05,720 --> 00:06:10,200 Speaker 1: analysts now valuing open Ai at a staggering one hundred 96 00:06:10,200 --> 00:06:15,400 Speaker 1: and fifty seven billion dollars. 
Once again, I'm left to 97 00:06:15,480 --> 00:06:19,520 Speaker 1: grapple with a paradoxical situation because here's a company that 98 00:06:19,640 --> 00:06:24,520 Speaker 1: really burns through cash fast. And that's not blaming open Ai. 99 00:06:24,760 --> 00:06:27,600 Speaker 1: I mean I mentioned earlier this week in other Tech 100 00:06:27,640 --> 00:06:32,159 Speaker 1: Stuff episodes, AI R and D is mega expensive. It 101 00:06:32,240 --> 00:06:36,320 Speaker 1: is so expensive to not just do research and development 102 00:06:36,320 --> 00:06:39,279 Speaker 1: in AI, but then to run AI applications. It costs 103 00:06:39,360 --> 00:06:42,680 Speaker 1: a lot of money. And some folks were saying that 104 00:06:43,200 --> 00:06:46,720 Speaker 1: open ai might actually spend itself out of business by 105 00:06:46,760 --> 00:06:49,560 Speaker 1: early next year. But now we turn around and it's 106 00:06:49,560 --> 00:06:52,400 Speaker 1: being valued at one hundred and fifty seven billion buckaroos. 107 00:06:52,800 --> 00:06:55,839 Speaker 1: The finances just don't make sense to me. Like, I 108 00:06:55,880 --> 00:06:59,120 Speaker 1: get that in the moment, it's worth a 109 00:06:59,160 --> 00:07:02,080 Speaker 1: huge amount of money. But this is the same company 110 00:07:02,080 --> 00:07:04,320 Speaker 1: that people are saying is going to spend itself out 111 00:07:04,360 --> 00:07:07,560 Speaker 1: of business, and it's not like anything there has changed, right? 112 00:07:07,600 --> 00:07:09,560 Speaker 1: It's still going to have to spend huge amounts of 113 00:07:09,600 --> 00:07:12,040 Speaker 1: money and it's not going to bring in enough revenue 114 00:07:12,040 --> 00:07:16,320 Speaker 1: to cover the costs. So this will power open ai 115 00:07:16,520 --> 00:07:20,640 Speaker 1: for some foreseeable future, but we're still going to have 116 00:07:20,640 --> 00:07:22,640 Speaker 1: to wait and see if open ai can get to 117 00:07:22,680 --> 00:07:27,080 Speaker 1: a situation where it will generate enough revenue to cover 118 00:07:27,120 --> 00:07:29,960 Speaker 1: the costs of doing business, or if we'll just get 119 00:07:30,080 --> 00:07:33,120 Speaker 1: right back into the situation where at some point open 120 00:07:33,160 --> 00:07:37,120 Speaker 1: ai needs to hold another significant round of investments in 121 00:07:37,200 --> 00:07:41,840 Speaker 1: order to keep going. Right? It's kind of crazy. On 122 00:07:41,920 --> 00:07:44,280 Speaker 1: top of the funding news, by the way, Reuters reports 123 00:07:44,320 --> 00:07:48,559 Speaker 1: that open ai also secured a four billion dollar line 124 00:07:48,600 --> 00:07:51,560 Speaker 1: of credit, so on top of the six point six 125 00:07:51,680 --> 00:07:54,600 Speaker 1: billion that was invested into it, it means that it 126 00:07:54,640 --> 00:07:57,800 Speaker 1: has more than ten billion dollars of liquidity to its 127 00:07:57,880 --> 00:08:00,840 Speaker 1: name right now, which is a big old, yeah, ten 128 00:08:01,520 --> 00:08:05,720 Speaker 1: billion dollars now. Again, that doesn't guarantee that open ai 129 00:08:05,920 --> 00:08:09,000 Speaker 1: is going to be able to leverage all this money 130 00:08:09,040 --> 00:08:13,600 Speaker 1: and turn AI into, like, a sustainable business that actually 131 00:08:13,960 --> 00:08:17,400 Speaker 1: can, you know, stand on its own.
We'll still have to 132 00:08:17,440 --> 00:08:21,000 Speaker 1: see. Or maybe we'll get to a point where not 133 00:08:21,080 --> 00:08:23,560 Speaker 1: only are they going to need investment, that investment is 134 00:08:23,560 --> 00:08:26,320 Speaker 1: going to go directly to paying off interest on a 135 00:08:26,360 --> 00:08:31,400 Speaker 1: four billion dollar line of credit. Yikes. Meanwhile, over at 136 00:08:31,440 --> 00:08:36,400 Speaker 1: open AI's former headquarters, Elon Musk, a guy who never 137 00:08:36,480 --> 00:08:39,680 Speaker 1: met a grudge he couldn't hold, held an event for 138 00:08:39,760 --> 00:08:43,240 Speaker 1: his own AI startup, which is called x Ai, and 139 00:08:43,320 --> 00:08:46,360 Speaker 1: I would say that it's pretty much in character 140 00:08:46,800 --> 00:08:52,840 Speaker 1: for Elon Musk to hold a recruiting event for his 141 00:08:52,920 --> 00:08:59,720 Speaker 1: own AI startup in the original open ai headquarters because, 142 00:08:59,720 --> 00:09:02,280 Speaker 1: remember, Musk was a co founder of 143 00:09:02,320 --> 00:09:05,760 Speaker 1: open ai years ago, but he had a massive falling 144 00:09:05,840 --> 00:09:09,679 Speaker 1: out with others in the organization and he bailed. There 145 00:09:09,679 --> 00:09:13,199 Speaker 1: were rumors that he was attempting to essentially take over 146 00:09:13,320 --> 00:09:16,360 Speaker 1: open ai, and when he encountered resistance, he decided he 147 00:09:16,360 --> 00:09:18,640 Speaker 1: would go and make his own AI company. It's one 148 00:09:18,640 --> 00:09:22,400 Speaker 1: of those like Futurama type moments. He has since argued 149 00:09:22,400 --> 00:09:26,200 Speaker 1: that open ai has largely abandoned its initial mission of 150 00:09:26,240 --> 00:09:31,200 Speaker 1: being an open, transparent, non profit organization that's dedicated to 151 00:09:31,520 --> 00:09:35,160 Speaker 1: the safe and responsible development of artificial intelligence. And y'all, 152 00:09:35,200 --> 00:09:38,040 Speaker 1: I do not agree with Elon Musk on many things, 153 00:09:38,160 --> 00:09:41,160 Speaker 1: but I do think that particular criticism is one hundred 154 00:09:41,160 --> 00:09:44,480 Speaker 1: percent on target. I think there's no way to argue 155 00:09:44,520 --> 00:09:49,920 Speaker 1: against that. Open ai has certainly transformed dramatically from that 156 00:09:50,400 --> 00:09:54,720 Speaker 1: open and transparent nonprofit organization into very much a for 157 00:09:54,960 --> 00:09:58,480 Speaker 1: profit company that obfuscates a lot of what 158 00:09:58,600 --> 00:10:01,360 Speaker 1: it works on. Now, that's not to say that I 159 00:10:01,400 --> 00:10:04,640 Speaker 1: think Musk would have done things differently had he actually 160 00:10:05,080 --> 00:10:07,760 Speaker 1: been the one in charge of open ai. I think 161 00:10:07,800 --> 00:10:11,400 Speaker 1: if he had won that struggle, if in fact there 162 00:10:11,600 --> 00:10:14,360 Speaker 1: was one a few years ago, and if he had 163 00:10:14,440 --> 00:10:17,160 Speaker 1: become the head of open ai, he would either be 164 00:10:17,280 --> 00:10:20,200 Speaker 1: doing something not too different from what open ai is 165 00:10:20,240 --> 00:10:23,199 Speaker 1: doing now, or he would have run the company out 166 00:10:23,240 --> 00:10:25,720 Speaker 1: of business, one of the two.
And the reason I 167 00:10:25,720 --> 00:10:31,280 Speaker 1: say that is again, AI is really expensive, like it 168 00:10:31,440 --> 00:10:35,479 Speaker 1: just it burns through money so fast because the technology 169 00:10:35,520 --> 00:10:38,839 Speaker 1: you need to run AI applications. First of all, the 170 00:10:38,920 --> 00:10:42,240 Speaker 1: chips are in short supply, so those are expensive, and 171 00:10:42,280 --> 00:10:45,320 Speaker 1: then you have the electricity needs. Those are expensive. It's 172 00:10:45,400 --> 00:10:47,920 Speaker 1: just hard to do. And if you're going to run 173 00:10:47,960 --> 00:10:50,439 Speaker 1: it as nonprofit, it means you do need to have 174 00:10:50,520 --> 00:10:54,960 Speaker 1: that constant influx of cash in order to fund your work. 175 00:10:55,200 --> 00:10:58,200 Speaker 1: And I'm not sure that Elon Musk would have managed 176 00:10:58,240 --> 00:11:00,559 Speaker 1: to do that. So it's not that I'm saying he 177 00:11:00,559 --> 00:11:03,440 Speaker 1: would have mismanaged it, just rather I don't know how 178 00:11:03,480 --> 00:11:09,040 Speaker 1: you get a nonprofit to work anyway. Musk addressed a 179 00:11:09,080 --> 00:11:12,760 Speaker 1: group of engineers at this event to recruit them for XAI. 180 00:11:13,160 --> 00:11:15,439 Speaker 1: At the event, Musk talked about his desire to create 181 00:11:15,480 --> 00:11:19,760 Speaker 1: a quote digital superintelligence that is as benign as possible 182 00:11:19,880 --> 00:11:23,320 Speaker 1: end quote. He also said he thought we'd achieve artificial 183 00:11:23,400 --> 00:11:26,959 Speaker 1: general intelligence within a couple of years. That seems overly 184 00:11:27,000 --> 00:11:29,800 Speaker 1: ambitious to me. But then again, Musk has also frequently 185 00:11:30,320 --> 00:11:35,040 Speaker 1: made some rather grandiose predictions about AI that just haven't 186 00:11:35,120 --> 00:11:39,080 Speaker 1: you know, shaken out, Like specifically in the autonomous car space, 187 00:11:39,160 --> 00:11:41,720 Speaker 1: he has thought that we were going to hit a real, 188 00:11:41,840 --> 00:11:46,560 Speaker 1: true autonomous car future much earlier, and you know, obviously 189 00:11:46,600 --> 00:11:48,719 Speaker 1: we're still not there yet, but he thought it was 190 00:11:48,760 --> 00:11:50,760 Speaker 1: already going to be here, and that has not happened. 191 00:11:50,760 --> 00:11:55,720 Speaker 1: So I would not bank on artificial general intelligence within 192 00:11:55,800 --> 00:11:59,200 Speaker 1: the next couple of years. Maybe. I mean, I don't 193 00:11:59,200 --> 00:12:01,480 Speaker 1: know for sure. I just know that it's a really 194 00:12:01,559 --> 00:12:04,600 Speaker 1: hard goal to hit. He also mentioned the desire to 195 00:12:04,640 --> 00:12:09,360 Speaker 1: create a supersonic passenger aircraft company in the future. That 196 00:12:09,640 --> 00:12:13,080 Speaker 1: probably merits a separate discussion. That's a really tricky thing. 197 00:12:13,320 --> 00:12:17,600 Speaker 1: But yeah, that's what happened with his recruiting event this week. Okay, 198 00:12:17,640 --> 00:12:19,880 Speaker 1: I've got a couple more stories to talk about Before 199 00:12:19,880 --> 00:12:32,080 Speaker 1: I do that, let's take a quick break. We're back. 
200 00:12:32,440 --> 00:12:36,199 Speaker 1: So Victoria Song of the Verge has a piece titled 201 00:12:36,520 --> 00:12:40,520 Speaker 1: College students used Meta's smart glasses to dox people in 202 00:12:40,600 --> 00:12:44,160 Speaker 1: real time. Now, as that headline indicates, this is about 203 00:12:44,200 --> 00:12:48,080 Speaker 1: how a pair of students used some AR glasses that 204 00:12:48,120 --> 00:12:53,280 Speaker 1: are outfitted with cameras and Internet connectivity to essentially run 205 00:12:52,760 --> 00:12:56,880 Speaker 1: an app that they had built that relies on things 206 00:12:56,920 --> 00:13:01,400 Speaker 1: like facial recognition and personal ID databases in order to 207 00:13:01,559 --> 00:13:05,160 Speaker 1: return information about the people that are within view of 208 00:13:05,640 --> 00:13:10,439 Speaker 1: the glasses' camera. As Song writes, quote, the tech works by 209 00:13:10,520 --> 00:13:15,319 Speaker 1: using the Meta smart glasses' ability to livestream video to Instagram. 210 00:13:15,480 --> 00:13:19,280 Speaker 1: A computer program then monitors that stream and uses AI 211 00:13:19,440 --> 00:13:23,200 Speaker 1: to identify faces. Those photos are then fed into public 212 00:13:23,280 --> 00:13:28,920 Speaker 1: databases to find names, addresses, phone numbers, and even relatives. 213 00:13:29,040 --> 00:13:32,480 Speaker 1: That information is then fed back through a phone app, 214 00:13:33,000 --> 00:13:36,880 Speaker 1: end quote. So the students call their tech I-XRAY, 215 00:13:36,960 --> 00:13:41,679 Speaker 1: and I'm sure you could immediately imagine how that technology 216 00:13:41,760 --> 00:13:44,720 Speaker 1: could be abused. And in fact, there are very few 217 00:13:44,920 --> 00:13:48,280 Speaker 1: use cases that are benign, right? And the students have 218 00:13:48,320 --> 00:13:52,320 Speaker 1: stressed they're not releasing this technology. This is not meant 219 00:13:52,320 --> 00:13:54,080 Speaker 1: to be an app that you're going to be able 220 00:13:54,080 --> 00:13:58,359 Speaker 1: to download and then walk around and know everybody's secret identity. 221 00:13:58,720 --> 00:14:05,160 Speaker 1: They recognize how the technology is inherently abusable. Like, it's again 222 00:14:05,440 --> 00:14:07,600 Speaker 1: very hard to use it in a way that isn't 223 00:14:08,120 --> 00:14:14,000 Speaker 1: malicious or at least irresponsible. So just imagine someone wearing 224 00:14:14,480 --> 00:14:19,200 Speaker 1: glasses like these and then pretending to know complete strangers 225 00:14:19,280 --> 00:14:23,440 Speaker 1: because they've got on their phone a quick dossier about 226 00:14:23,440 --> 00:14:27,680 Speaker 1: the person. They've got their name, their address, they have 227 00:14:27,840 --> 00:14:30,640 Speaker 1: relatives' names, all that kind of stuff. They're able to 228 00:14:30,680 --> 00:14:33,480 Speaker 1: actually reference this. Like I used to do a goofy 229 00:14:33,680 --> 00:14:37,080 Speaker 1: version of this at stores where I would walk in 230 00:14:37,160 --> 00:14:39,440 Speaker 1: and like the employees would have a name tag on, 231 00:14:39,760 --> 00:14:41,800 Speaker 1: and I would just address them by their name tag 232 00:14:41,960 --> 00:14:44,480 Speaker 1: in the store. Like, not outside, that's creepy, but in 233 00:14:44,520 --> 00:14:46,840 Speaker 1: the store. And sometimes they would forget that they were 234 00:14:46,840 --> 00:14:48,920 Speaker 1: wearing a name tag.
They're like, how did you know 235 00:14:48,960 --> 00:14:50,760 Speaker 1: who I am? Do we know each other? I'm like, no, you're 236 00:14:50,800 --> 00:14:53,760 Speaker 1: wearing your name literally on your shirt, and they say, oh, 237 00:14:54,000 --> 00:14:56,480 Speaker 1: right, right. You know, just one of those moments where 238 00:14:56,520 --> 00:14:59,320 Speaker 1: you're just not even thinking about it. Well, as 239 00:14:59,600 --> 00:15:02,360 Speaker 1: silly a little interaction as that is, you can imagine 240 00:15:02,400 --> 00:15:05,800 Speaker 1: one being much more serious. Let's say it's at a bar. 241 00:15:06,440 --> 00:15:09,560 Speaker 1: You can easily imagine someone trying to prey upon people 242 00:15:09,600 --> 00:15:12,520 Speaker 1: at a bar by pretending like they either know this 243 00:15:12,600 --> 00:15:16,080 Speaker 1: person from way back, or they know someone that knows 244 00:15:16,120 --> 00:15:20,040 Speaker 1: this person, and they're trying to get an in that way. 245 00:15:20,280 --> 00:15:23,400 Speaker 1: So the students say their intent was to raise awareness 246 00:15:23,440 --> 00:15:27,480 Speaker 1: that this capability isn't just some hypothetical future technology. I mean, 247 00:15:27,560 --> 00:15:30,040 Speaker 1: this is something people have been warning about for a while, 248 00:15:30,120 --> 00:15:33,320 Speaker 1: but the students are saying, listen, we're done warning. It's here. 249 00:15:33,800 --> 00:15:36,520 Speaker 1: We did it. It is possible. And if we did it, 250 00:15:36,560 --> 00:15:39,440 Speaker 1: even though we're not gonna do anything with this technology, 251 00:15:39,800 --> 00:15:41,640 Speaker 1: that doesn't mean the next person will do the same. 252 00:15:42,080 --> 00:15:45,880 Speaker 1: So you have to be aware of what this technology can do. 253 00:15:46,200 --> 00:15:49,360 Speaker 1: I think it also again goes back to showing how 254 00:15:49,680 --> 00:15:53,120 Speaker 1: terrible a job the United States has done when it 255 00:15:53,160 --> 00:15:58,680 Speaker 1: comes to citizen data privacy. It's criminal, if you ask me, 256 00:15:58,880 --> 00:16:01,840 Speaker 1: because the fact that there have been no real rules 257 00:16:01,840 --> 00:16:06,200 Speaker 1: about this makes things like this totally possible. And in fact, 258 00:16:06,480 --> 00:16:09,800 Speaker 1: like, you could even imagine a much deeper dive for 259 00:16:09,880 --> 00:16:13,200 Speaker 1: this kind of thing, because those databases out there are 260 00:16:13,320 --> 00:16:18,120 Speaker 1: enormous and comprehensive, even for people who aren't online. If 261 00:16:18,160 --> 00:16:22,080 Speaker 1: they're showing up in pictures that are on friends', you know, 262 00:16:22,640 --> 00:16:25,560 Speaker 1: social profiles or whatever, and they're identified in the photos, 263 00:16:26,040 --> 00:16:29,040 Speaker 1: that's enough. They don't even have to participate directly in 264 00:16:29,080 --> 00:16:32,600 Speaker 1: the system to be abused by it. So yeah, pretty 265 00:16:33,000 --> 00:16:38,560 Speaker 1: sobering example of how technology can interfere with privacy and 266 00:16:38,560 --> 00:16:42,120 Speaker 1: potentially put people at risk. Google is apparently rolling out 267 00:16:42,120 --> 00:16:46,120 Speaker 1: a verification feature on search results.
Jess Weatherbed, also of 268 00:16:46,120 --> 00:16:49,520 Speaker 1: the Verge, reports that some users are getting results back 269 00:16:49,560 --> 00:16:52,920 Speaker 1: that include blue check marks next to certain entries. Those 270 00:16:53,000 --> 00:16:56,440 Speaker 1: check marks indicate sites that Google has verified as being 271 00:16:56,720 --> 00:16:59,840 Speaker 1: a legitimate business. So if you're searching for something, let's 272 00:16:59,840 --> 00:17:03,120 Speaker 1: say like you're searching for, I don't know, durable camping equipment, 273 00:17:03,800 --> 00:17:07,000 Speaker 1: when you get your search results back, you are going 274 00:17:07,040 --> 00:17:09,920 Speaker 1: to see, if you're part of this anyway, you would 275 00:17:09,920 --> 00:17:12,080 Speaker 1: see that some of the companies that are listed would 276 00:17:12,080 --> 00:17:15,200 Speaker 1: have this blue check mark. Other companies may be ones 277 00:17:15,240 --> 00:17:18,159 Speaker 1: that are trying to pose as if they are a 278 00:17:18,200 --> 00:17:21,240 Speaker 1: more established, reputable company; they're not going to have that 279 00:17:21,320 --> 00:17:25,160 Speaker 1: check mark. So it's an immediate visual indicator of which 280 00:17:25,280 --> 00:17:29,280 Speaker 1: businesses are trustworthy, or at least more likely to be trustworthy. 281 00:17:29,640 --> 00:17:33,120 Speaker 1: So it's really all about identifying businesses in this case. 282 00:17:33,160 --> 00:17:36,320 Speaker 1: That's it, not like people or anything like that. And 283 00:17:36,400 --> 00:17:39,800 Speaker 1: it sounds like this initiative is in a very limited rollout. 284 00:17:39,920 --> 00:17:42,240 Speaker 1: Not everyone's going to see check marks in their results, 285 00:17:42,280 --> 00:17:45,160 Speaker 1: and I did quite a few different searches 286 00:17:45,280 --> 00:17:47,480 Speaker 1: just to see if any of these popped up for me, 287 00:17:47,800 --> 00:17:50,320 Speaker 1: but no matter what I searched for, I didn't get 288 00:17:50,359 --> 00:17:52,960 Speaker 1: anything that had check marks on it. So I am 289 00:17:52,960 --> 00:17:55,320 Speaker 1: not part of this rollout, and of course there's no 290 00:17:55,440 --> 00:17:58,119 Speaker 1: guarantee that Google will ever roll it out to the 291 00:17:58,119 --> 00:18:00,200 Speaker 1: general public. It may just be that this is a 292 00:18:00,280 --> 00:18:02,720 Speaker 1: test and nothing else comes of it, but we'll have 293 00:18:02,720 --> 00:18:05,919 Speaker 1: to keep our eyes out. Finally, for some recommended reading, 294 00:18:05,960 --> 00:18:09,159 Speaker 1: I suggest Eric Berger's piece in Ars Technica. It is 295 00:18:09,200 --> 00:18:12,280 Speaker 1: titled NASA is working on a plan to replace its 296 00:18:12,400 --> 00:18:16,720 Speaker 1: space station, but time is running out. So the current 297 00:18:16,880 --> 00:18:19,520 Speaker 1: plan for the International Space Station is for it to 298 00:18:19,520 --> 00:18:22,720 Speaker 1: fly into the sunset, or more accurately, for it to 299 00:18:22,840 --> 00:18:27,440 Speaker 1: deorbit in twenty thirty.
But unless the pace really picks 300 00:18:27,520 --> 00:18:30,760 Speaker 1: up, and soon, that's going to happen without a new 301 00:18:31,000 --> 00:18:34,080 Speaker 1: space station taking its place. That would mean the good 302 00:18:34,119 --> 00:18:36,560 Speaker 1: old US of A would no longer have its own 303 00:18:36,720 --> 00:18:40,240 Speaker 1: research facility in orbit. Complicating matters is the fact that 304 00:18:40,320 --> 00:18:43,080 Speaker 1: NASA also has plans to return to the Moon and 305 00:18:43,160 --> 00:18:47,080 Speaker 1: potentially to set the stage for further human exploration, potentially 306 00:18:47,160 --> 00:18:50,320 Speaker 1: to places like Mars. So I recommend reading Berger's article 307 00:18:50,400 --> 00:18:53,919 Speaker 1: to get a full picture of the situation. That's it 308 00:18:54,040 --> 00:18:56,720 Speaker 1: for this week. I hope all of you out there 309 00:18:56,840 --> 00:19:00,800 Speaker 1: are doing well, and I'll talk to you again really soon. 310 00:19:06,880 --> 00:19:11,560 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 311 00:19:11,880 --> 00:19:15,600 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 312 00:19:15,600 --> 00:19:16,679 Speaker 1: to your favorite shows.