1 00:00:04,360 --> 00:00:12,319 Speaker 1: Welcome to tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,360 --> 00:00:16,079 Speaker 1: and welcome to tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,160 --> 00:00:19,119 Speaker 1: I'm an executive producer with iHeartRadio and how the tech 4 00:00:19,200 --> 00:00:22,239 Speaker 1: are you. It's time for the tech news for Tuesday, 5 00:00:22,280 --> 00:00:26,520 Speaker 1: April fourth, twenty twenty three, and we've got more AI 6 00:00:26,720 --> 00:00:30,600 Speaker 1: news to get us started today. Big surprise. The Handelsblatt 7 00:00:30,680 --> 00:00:34,800 Speaker 1: newspaper in Germany reports that the German government is 8 00:00:34,840 --> 00:00:39,839 Speaker 1: debating blocking chat GPT in the country out of a 9 00:00:39,840 --> 00:00:44,320 Speaker 1: concern for data security. This would follow a similar ban 10 00:00:44,880 --> 00:00:48,040 Speaker 1: that has been issued in Italy. This doesn't have anything 11 00:00:48,080 --> 00:00:51,040 Speaker 1: to do with how reliable the chatbot can be, or 12 00:00:51,080 --> 00:00:54,200 Speaker 1: its ability to tell the difference between legitimate information 13 00:00:54,560 --> 00:01:00,840 Speaker 1: and misinformation or satire or, you know, otherwise unreliable sources. Instead, 14 00:01:01,240 --> 00:01:04,800 Speaker 1: this has to do with how chat GPT handles privacy. 15 00:01:05,440 --> 00:01:08,880 Speaker 1: Italian regulators said that open ai failed to follow EU 16 00:01:09,080 --> 00:01:12,800 Speaker 1: rules regarding a measure that would prevent minors from being 17 00:01:12,800 --> 00:01:15,520 Speaker 1: able to use the chatbot service. There should be some 18 00:01:15,560 --> 00:01:20,839 Speaker 1: sort of age gatekeeping methodology in there, on top 19 00:01:20,959 --> 00:01:25,480 Speaker 1: of a concern about how open ai collects user data 20 00:01:25,560 --> 00:01:27,679 Speaker 1: and then makes use of it. And of course, we 21 00:01:27,800 --> 00:01:32,840 Speaker 1: know that last month a glitch revealed user chatbot conversation 22 00:01:32,920 --> 00:01:37,039 Speaker 1: histories inadvertently, where people were suddenly able to see how 23 00:01:37,080 --> 00:01:41,560 Speaker 1: other folks had been using chat GPT. Whether this will 24 00:01:41,600 --> 00:01:44,880 Speaker 1: become a trend in the EU at large remains to 25 00:01:44,920 --> 00:01:49,280 Speaker 1: be seen, and I suspect Microsoft, which is heavily invested 26 00:01:49,440 --> 00:01:53,680 Speaker 1: in open ai and in chat GPT, will start taking measures 27 00:01:53,680 --> 00:01:56,840 Speaker 1: to create a version of the chatbot that is more 28 00:01:56,840 --> 00:02:01,080 Speaker 1: in compliance with EU rules and regulations. It would shock 29 00:02:01,160 --> 00:02:04,200 Speaker 1: me if that were not the case. Now it's not 30 00:02:04,280 --> 00:02:09,320 Speaker 1: just countries clamping down on chat GPT; Samsung Semiconductor is 31 00:02:09,360 --> 00:02:13,720 Speaker 1: tweaking its approach to the tool as well. So until recently, 32 00:02:13,800 --> 00:02:17,720 Speaker 1: Samsung allowed engineers to kind of have unfettered use of 33 00:02:17,800 --> 00:02:22,320 Speaker 1: chat GPT in an effort to assist in the fabrication process.
34 00:02:22,919 --> 00:02:26,240 Speaker 1: So engineers were using chat GPT to do things like 35 00:02:26,280 --> 00:02:30,760 Speaker 1: look for errors in source code, or to take notes 36 00:02:30,840 --> 00:02:34,119 Speaker 1: from a meeting and convert them into graphics or even 37 00:02:34,120 --> 00:02:36,760 Speaker 1: a presentation, that kind of stuff. But the use of 38 00:02:36,840 --> 00:02:41,800 Speaker 1: chat GPT led to three prominent data leaks in the 39 00:02:41,840 --> 00:02:45,000 Speaker 1: span of twenty days, which is a big concern. Now, 40 00:02:45,080 --> 00:02:48,800 Speaker 1: I don't mean that chat GPT got all loose lipped 41 00:02:48,840 --> 00:02:52,320 Speaker 1: about it and leaked the information to the outside world. Rather, 42 00:02:52,520 --> 00:02:56,960 Speaker 1: this was an internal problem: engineers, in their eagerness 43 00:02:56,960 --> 00:02:59,520 Speaker 1: to leverage the tool to make 44 00:02:59,560 --> 00:03:03,480 Speaker 1: their jobs easy and efficient and error free, 45 00:03:03,720 --> 00:03:06,680 Speaker 1: began to share a bit more than what Samsung 46 00:03:06,760 --> 00:03:13,399 Speaker 1: was comfortable with, including proprietary and top secret code and information, 47 00:03:13,560 --> 00:03:18,640 Speaker 1: stuff that is really important to Samsung Semiconductor, and they 48 00:03:18,680 --> 00:03:20,360 Speaker 1: don't want to share it with the outside world. And 49 00:03:20,400 --> 00:03:23,040 Speaker 1: you might think, well, sure, but you're just sharing 50 00:03:23,080 --> 00:03:25,720 Speaker 1: it with a chatbot. What's the chatbot going to do 51 00:03:25,800 --> 00:03:28,480 Speaker 1: with that information? But then you also have to remember, 52 00:03:28,919 --> 00:03:33,040 Speaker 1: wait a second, this chatbot comes from another company, right? 53 00:03:33,120 --> 00:03:35,160 Speaker 1: It comes from a company that also happens to be 54 00:03:35,240 --> 00:03:38,920 Speaker 1: tightly tied to Microsoft, and you start to worry about 55 00:03:39,040 --> 00:03:42,920 Speaker 1: who might be able to access that data. Like, could 56 00:03:43,040 --> 00:03:47,800 Speaker 1: Microsoft see how people at Samsung Semiconductor were using chat GPT? 57 00:03:49,240 --> 00:03:53,280 Speaker 1: If so, then Microsoft could see some of this top 58 00:03:53,320 --> 00:03:56,080 Speaker 1: secret information, which would be a big no-no. So 59 00:03:56,160 --> 00:03:59,040 Speaker 1: now Samsung has cracked down a bit and limited the 60 00:03:59,080 --> 00:04:02,280 Speaker 1: amount of information that employees are allowed to share when 61 00:04:02,320 --> 00:04:05,480 Speaker 1: they use chat GPT. They can still use it, but 62 00:04:05,640 --> 00:04:10,480 Speaker 1: they have strict limitations on it, and they're simultaneously developing 63 00:04:10,520 --> 00:04:14,320 Speaker 1: their own version of the tool that they can administer themselves, 64 00:04:15,040 --> 00:04:17,479 Speaker 1: so that way the creepy AI presence will be an 65 00:04:17,480 --> 00:04:20,880 Speaker 1: in-house creepy AI presence, not one that has ties 66 00:04:20,880 --> 00:04:25,239 Speaker 1: to potential competitors. This is a fairly high profile case, 67 00:04:25,480 --> 00:04:28,960 Speaker 1: right? Samsung Semiconductor, that's a big company.
But it makes 68 00:04:28,960 --> 00:04:31,480 Speaker 1: me wonder if we're going to start seeing other companies 69 00:04:31,920 --> 00:04:36,600 Speaker 1: have second thoughts about letting employees use AI enabled tools 70 00:04:36,960 --> 00:04:40,600 Speaker 1: out of similar concerns. It is not difficult for me 71 00:04:40,680 --> 00:04:45,000 Speaker 1: to imagine executives deciding that these nifty AI tools that 72 00:04:45,080 --> 00:04:49,920 Speaker 1: are incorporated into, say, Microsoft Office could pose a 73 00:04:49,960 --> 00:04:54,760 Speaker 1: potential risk to data security. It's possible that Microsoft and 74 00:04:54,920 --> 00:04:58,960 Speaker 1: other companies, by jumping into the AI game so enthusiastically 75 00:04:59,040 --> 00:05:02,680 Speaker 1: right now, could be setting themselves up for a fall. 76 00:05:02,920 --> 00:05:06,760 Speaker 1: If we see companies at large say, well, what happens 77 00:05:06,800 --> 00:05:10,839 Speaker 1: if I use Microsoft to take notes from an important 78 00:05:10,880 --> 00:05:14,400 Speaker 1: internal meeting and turn them into a presentation that we 79 00:05:14,480 --> 00:05:18,680 Speaker 1: can show other departments, but still internally? What happens if 80 00:05:18,720 --> 00:05:22,280 Speaker 1: Microsoft is able to actually access information because of this 81 00:05:22,320 --> 00:05:26,520 Speaker 1: connection with the AI enabled tools in their productivity suite? 82 00:05:28,400 --> 00:05:31,120 Speaker 1: As we've had more and more of our operations move 83 00:05:31,120 --> 00:05:34,640 Speaker 1: to the cloud, this has become a non trivial concern, 84 00:05:35,680 --> 00:05:38,080 Speaker 1: and I imagine we're going to see companies start to 85 00:05:38,240 --> 00:05:45,680 Speaker 1: really reconsider how they go about handling top secret proprietary information. 86 00:05:46,360 --> 00:05:48,719 Speaker 1: I mean, I just see companies saying, all right, well, 87 00:05:48,760 --> 00:05:51,520 Speaker 1: for this, we can't use any of those tools, because 88 00:05:52,080 --> 00:05:55,560 Speaker 1: the possibility, you know, even if the 89 00:05:55,600 --> 00:05:57,600 Speaker 1: companies are saying no, no, no, we're not doing that, 90 00:05:57,640 --> 00:06:02,159 Speaker 1: the possibility of that being leaked might be too much. 91 00:06:02,680 --> 00:06:05,760 Speaker 1: It actually is making me think a lot about TikTok, right? Like, 92 00:06:05,800 --> 00:06:08,960 Speaker 1: the big argument against TikTok, or at least one of 93 00:06:09,000 --> 00:06:12,560 Speaker 1: the big arguments, is that it could serve as a 94 00:06:12,720 --> 00:06:18,080 Speaker 1: data siphon, pulling, you know, information that's important out of 95 00:06:18,120 --> 00:06:21,440 Speaker 1: the US and siphoning it off to China, and 96 00:06:21,560 --> 00:06:24,839 Speaker 1: potentially that could impact national security. Well, I could see 97 00:06:24,839 --> 00:06:28,119 Speaker 1: companies looking at AI enabled tools and saying similar things, 98 00:06:28,160 --> 00:06:32,120 Speaker 1: like how can we trust that the productivity software 99 00:06:32,160 --> 00:06:37,520 Speaker 1: we're using isn't just sending our data to Microsoft, for example? 100 00:06:38,200 --> 00:06:41,000 Speaker 1: And that's a tough question.
So I'll be curious to 101 00:06:41,040 --> 00:06:44,760 Speaker 1: see if this becomes a bigger thing throughout this year 102 00:06:44,800 --> 00:06:49,040 Speaker 1: and into next. Geoffrey Fowler of the Washington Post has 103 00:06:49,120 --> 00:06:53,559 Speaker 1: an interesting piece about chat GPT and education. The piece 104 00:06:53,720 --> 00:06:58,159 Speaker 1: is titled We tested a new chat GPT detector for teachers. 105 00:06:58,240 --> 00:07:02,080 Speaker 1: It flagged an innocent student. And yeah, the headline gives 106 00:07:02,120 --> 00:07:04,839 Speaker 1: you a strong idea about what the story is all about. 107 00:07:04,880 --> 00:07:08,000 Speaker 1: So the deal is Fowler was testing software from a 108 00:07:08,040 --> 00:07:13,560 Speaker 1: company called Turnitin. Turnitin, according to Fowler, provides more than 109 00:07:13,680 --> 00:07:18,600 Speaker 1: two million teachers a tool that's meant to detect instances 110 00:07:18,600 --> 00:07:22,360 Speaker 1: of plagiarism. So if it turns out that there's a 111 00:07:22,440 --> 00:07:27,200 Speaker 1: passage in a student's essay that was lifted directly from 112 00:07:27,280 --> 00:07:29,600 Speaker 1: some other source, it's supposed to be able to flag that. 113 00:07:29,720 --> 00:07:32,880 Speaker 1: But it's also supposed to be able to detect if 114 00:07:32,880 --> 00:07:35,720 Speaker 1: a student has made use of a tool like chat 115 00:07:35,760 --> 00:07:40,920 Speaker 1: GPT to generate all or part of a work. Unfortunately, 116 00:07:41,600 --> 00:07:44,800 Speaker 1: just like chat GPT itself, it seems as though this 117 00:07:44,840 --> 00:07:50,080 Speaker 1: tool isn't totally reliable. Now, I've warned folks repeatedly that 118 00:07:50,200 --> 00:07:54,000 Speaker 1: chat GPT's responses are really only as good as the 119 00:07:54,040 --> 00:07:58,400 Speaker 1: source material it used to generate those responses, and often 120 00:07:58,960 --> 00:08:01,880 Speaker 1: you have no way of knowing what that source material was. 121 00:08:02,480 --> 00:08:05,320 Speaker 1: But I should also point out that the tools meant 122 00:08:05,360 --> 00:08:10,920 Speaker 1: to detect chat GPT aren't perfect either. So in Fowler's experiment, 123 00:08:11,320 --> 00:08:14,680 Speaker 1: a student submitted an essay that she wrote without the 124 00:08:14,680 --> 00:08:18,600 Speaker 1: help of chat GPT or any other AI, and yet 125 00:08:18,840 --> 00:08:23,640 Speaker 1: the software flagged her work as being AI augmented. Now 126 00:08:23,680 --> 00:08:28,000 Speaker 1: this concerns me for lots of different reasons. So first off, 127 00:08:28,240 --> 00:08:31,680 Speaker 1: understandably there are a lot of teachers who are concerned 128 00:08:31,800 --> 00:08:36,199 Speaker 1: about chat GPT. If students use chat GPT without putting 129 00:08:36,240 --> 00:08:39,680 Speaker 1: forth any real effort of their own, they won't really 130 00:08:39,800 --> 00:08:43,240 Speaker 1: learn anything except how to use AI so that they 131 00:08:43,240 --> 00:08:46,480 Speaker 1: can avoid thinking about stuff. Right? They might get really 132 00:08:46,480 --> 00:08:48,720 Speaker 1: good at gaming the system, but that's all they learn. 133 00:08:48,800 --> 00:08:52,120 Speaker 1: They don't learn to think critically. They don't learn to 134 00:08:52,160 --> 00:08:56,760 Speaker 1: think analytically. They don't learn to think laterally.
Like, when 135 00:08:57,040 --> 00:08:59,600 Speaker 1: you really break it down, education, at least, 136 00:08:59,600 --> 00:09:03,880 Speaker 1: should be all about learning how to think and to 137 00:09:04,000 --> 00:09:07,920 Speaker 1: think effectively and to think critically. So if they're just 138 00:09:08,080 --> 00:09:10,640 Speaker 1: using chat GPT to do their work, you might as 139 00:09:10,640 --> 00:09:13,319 Speaker 1: well not assign them anything at all, because the net 140 00:09:13,360 --> 00:09:16,040 Speaker 1: effect is going to be much the same. If the 141 00:09:16,080 --> 00:09:20,320 Speaker 1: students are not using chat GPT but there's no way 142 00:09:20,320 --> 00:09:23,960 Speaker 1: to verify whether they are or not, well, you have 143 00:09:24,000 --> 00:09:27,560 Speaker 1: a situation where teachers are suspicious of their students, and 144 00:09:27,720 --> 00:09:30,479 Speaker 1: that's not good either, right? For teachers to just constantly 145 00:09:30,520 --> 00:09:34,280 Speaker 1: be wondering if their students are cheating because they have 146 00:09:34,320 --> 00:09:36,560 Speaker 1: no way of verifying. And then it could be that 147 00:09:36,600 --> 00:09:39,880 Speaker 1: the students are all legitimately submitting their own work, but 148 00:09:40,280 --> 00:09:42,839 Speaker 1: you've got this suspicion there. That's not good. That's not 149 00:09:43,040 --> 00:09:45,920 Speaker 1: conducive to a learning environment either. And then if you 150 00:09:45,960 --> 00:09:48,840 Speaker 1: have a tool that's meant to detect chat GPT, but 151 00:09:48,880 --> 00:09:51,720 Speaker 1: it's not a very good tool or it's not accurate 152 00:09:51,840 --> 00:09:55,800 Speaker 1: enough to be fully reliable, then that means two things. One, 153 00:09:56,160 --> 00:09:59,160 Speaker 1: it's going to miss some instances where students were essentially 154 00:09:59,280 --> 00:10:03,000 Speaker 1: cheating and they're just gonna submit their stuff and it 155 00:10:03,040 --> 00:10:05,640 Speaker 1: won't pick up on the chat GPT thing at all, 156 00:10:05,679 --> 00:10:08,640 Speaker 1: and the students will be coasting through without really learning 157 00:10:08,640 --> 00:10:13,960 Speaker 1: anything and ultimately become terrible citizens or potentially become terrible citizens. 158 00:10:13,960 --> 00:10:16,200 Speaker 1: I can't pretend like that's going to be the case 159 00:10:16,280 --> 00:10:19,080 Speaker 1: every time, but you know, I get het up about 160 00:10:19,080 --> 00:10:22,800 Speaker 1: this because both of my parents are teachers. Or two, it 161 00:10:22,840 --> 00:10:27,199 Speaker 1: will be a poorly made tool that falsely accuses 162 00:10:27,400 --> 00:10:31,480 Speaker 1: a student and says that their legitimate work was AI 163 00:10:31,559 --> 00:10:35,640 Speaker 1: augmented when it wasn't. That's not good either, right? Like, 164 00:10:36,240 --> 00:10:40,079 Speaker 1: this is just a bad kind of situation all around. 165 00:10:40,200 --> 00:10:42,840 Speaker 1: And you know, Pandora's box has been opened. There's 166 00:10:42,840 --> 00:10:45,480 Speaker 1: no getting the evils of the world back in there.
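A quick back-of-the-envelope sketch of that reliability worry, in Python. Every number here is a made-up assumption for illustration, not a figure from Turnitin or from Fowler's piece:

    # Hypothetical detector: flags 90 percent of AI-written essays,
    # but also wrongly flags 1 percent of honest ones.
    false_positive_rate = 0.01
    true_positive_rate = 0.90

    honest_essays = 190   # students who wrote their own work
    ai_essays = 10        # students who leaned on a chatbot

    falsely_accused = honest_essays * false_positive_rate  # ~1.9 students
    caught = ai_essays * true_positive_rate                # 9 students
    missed = ai_essays - caught                            # 1 slips through

    print(f"Falsely accused: {falsely_accused:.1f}")
    print(f"Cheaters missed: {missed:.1f}")

Even with those generous assumed numbers, a couple of innocent students out of two hundred get flagged while a cheater slips through, which is exactly the double failure described above.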
167 00:10:46,040 --> 00:10:48,040 Speaker 1: But figuring out the best way forward is going to 168 00:10:48,120 --> 00:10:53,800 Speaker 1: be really important if we are to have good supportive 169 00:10:54,040 --> 00:10:58,240 Speaker 1: education systems in place that give people the resources they 170 00:10:58,280 --> 00:11:03,800 Speaker 1: need to become good people, right, to learn how to think, 171 00:11:04,280 --> 00:11:07,840 Speaker 1: and to encourage that. We often get stuck in the 172 00:11:07,880 --> 00:11:10,640 Speaker 1: weeds anyway with education. We can't see the forest for 173 00:11:10,679 --> 00:11:15,160 Speaker 1: the trees because we're looking at the specific instance of 174 00:11:15,760 --> 00:11:20,840 Speaker 1: learning as opposed to the overall approach to learning how 175 00:11:20,920 --> 00:11:24,080 Speaker 1: to learn and learning how to think, and we end 176 00:11:24,120 --> 00:11:27,880 Speaker 1: up getting really hyper focused on the specifics. Right? Like, 177 00:11:27,920 --> 00:11:31,200 Speaker 1: I need to write this essay about how Shakespeare used 178 00:11:31,240 --> 00:11:35,400 Speaker 1: the fool to speak truth to power, that kind of thing, 179 00:11:35,800 --> 00:11:38,240 Speaker 1: and I'm really focused on that as opposed to, you know, 180 00:11:38,440 --> 00:11:41,520 Speaker 1: this is about more than that. This is about learning 181 00:11:41,559 --> 00:11:45,680 Speaker 1: how to take in information and analyze it and create 182 00:11:45,720 --> 00:11:48,400 Speaker 1: your own response to it, and use critical thinking in 183 00:11:48,400 --> 00:11:52,080 Speaker 1: the whole process. All of these things, with chat GPT 184 00:11:52,440 --> 00:11:56,200 Speaker 1: and the tools meant to detect chat GPT, they make 185 00:11:56,280 --> 00:11:58,960 Speaker 1: me worry because I feel like we're going to get 186 00:11:59,000 --> 00:12:03,000 Speaker 1: more and more hyper focused on particulars and lose sight 187 00:12:03,160 --> 00:12:08,040 Speaker 1: of the overall goal of education. Maybe I should write 188 00:12:08,040 --> 00:12:10,880 Speaker 1: a book about that. No one will read it, but 189 00:12:10,920 --> 00:12:12,360 Speaker 1: at least I can get it out of my system 190 00:12:12,400 --> 00:12:13,800 Speaker 1: and you won't have to hear it in tech stuff 191 00:12:13,840 --> 00:12:17,600 Speaker 1: news episodes anymore. Okay, one other AI story before we 192 00:12:17,640 --> 00:12:22,160 Speaker 1: go to break. According to a cybersecurity firm called Darktrace, 193 00:12:22,679 --> 00:12:28,040 Speaker 1: generative AI may be partly responsible for a massive surge 194 00:12:28,559 --> 00:12:33,800 Speaker 1: in novel social engineering attacks this year. Darktrace calculates 195 00:12:33,840 --> 00:12:37,440 Speaker 1: that in January and February of twenty twenty three, there 196 00:12:37,559 --> 00:12:41,079 Speaker 1: was a one hundred thirty five percent jump in social 197 00:12:41,080 --> 00:12:45,679 Speaker 1: engineering attacks. Social engineering is when you attempt to penetrate 198 00:12:45,760 --> 00:12:49,240 Speaker 1: a security system, not by sitting down at a dark 199 00:12:49,400 --> 00:12:53,559 Speaker 1: computer screen and typing in password guesses. Instead, you trick 200 00:12:53,640 --> 00:12:57,240 Speaker 1: someone into handing over access to you. It's way easier 201 00:12:57,280 --> 00:12:59,600 Speaker 1: than trying to break the tech side of a security system.
202 00:12:59,640 --> 00:13:02,800 Speaker 1: You just have to convince someone that, you know, you're 203 00:13:02,840 --> 00:13:05,320 Speaker 1: just there to install an update on their machine, or 204 00:13:05,360 --> 00:13:08,040 Speaker 1: maybe that their computer has been flagged as being compromised, 205 00:13:08,080 --> 00:13:11,280 Speaker 1: so you're there to clean it up, when in reality 206 00:13:11,280 --> 00:13:14,800 Speaker 1: you're actually there to compromise the machine. Social engineering 207 00:13:14,880 --> 00:13:17,560 Speaker 1: is the tool of the con artist, and it's right 208 00:13:17,640 --> 00:13:20,880 Speaker 1: up there with the skills practiced by snake oil salespeople, 209 00:13:21,360 --> 00:13:26,079 Speaker 1: mentalists and stage magicians, and these tactics work. They've worked 210 00:13:26,600 --> 00:13:29,440 Speaker 1: for all of human history. I've fallen for them in 211 00:13:29,480 --> 00:13:32,760 Speaker 1: the past multiple times. There's no guarantee that I won't 212 00:13:32,760 --> 00:13:35,000 Speaker 1: fall for them again in the future. All I can 213 00:13:35,040 --> 00:13:37,760 Speaker 1: do is try to be as careful as I can, 214 00:13:37,920 --> 00:13:41,520 Speaker 1: to use critical thinking, and to avoid acting on impulse 215 00:13:41,679 --> 00:13:45,120 Speaker 1: or emotion. Anyway, it sounds like scammers and bad actors 216 00:13:45,120 --> 00:13:48,480 Speaker 1: in general are making use of generative AI to craft 217 00:13:48,559 --> 00:13:51,080 Speaker 1: messages that are more likely to convince a target to 218 00:13:51,080 --> 00:13:54,400 Speaker 1: follow through on some kind of action, such as opening 219 00:13:54,480 --> 00:13:57,559 Speaker 1: up an attachment or clicking on a link that's going 220 00:13:57,559 --> 00:13:59,719 Speaker 1: to download malware to their machine, or getting them to 221 00:13:59,760 --> 00:14:03,320 Speaker 1: fill out a form that will send important information to the attackers. 222 00:14:03,760 --> 00:14:07,800 Speaker 1: And chat GPT and other tools can create all new 223 00:14:07,840 --> 00:14:11,520 Speaker 1: messages that are designed to get these kinds of reactions, 224 00:14:12,000 --> 00:14:15,040 Speaker 1: stuff that attackers haven't necessarily thought of yet, and that 225 00:14:15,120 --> 00:14:17,800 Speaker 1: also means that the targets will not have encountered those 226 00:14:17,880 --> 00:14:20,480 Speaker 1: kinds of attacks in the past, so it makes those 227 00:14:20,480 --> 00:14:24,720 Speaker 1: attacks more likely to succeed. Plus, chat GPT can avoid 228 00:14:25,080 --> 00:14:28,560 Speaker 1: some of the telltale alerts that otherwise we rely 229 00:14:28,680 --> 00:14:31,640 Speaker 1: upon when we encounter these malicious emails. You know, stuff 230 00:14:31,680 --> 00:14:34,920 Speaker 1: like grammar and spelling mistakes, which are typically a dead 231 00:14:34,920 --> 00:14:38,880 Speaker 1: giveaway that it's an attack, not a legitimate email. Well, 232 00:14:38,960 --> 00:14:41,960 Speaker 1: chat GPT won't make those mistakes, so it'll get harder 233 00:14:41,960 --> 00:14:46,320 Speaker 1: and harder to tell the scam emails from the legit ones. 234 00:14:46,880 --> 00:14:49,280 Speaker 1: I expect we're going to see this continue unless somehow 235 00:14:49,320 --> 00:14:53,040 Speaker 1: companies like open ai and Microsoft build into their AI 236 00:14:53,120 --> 00:14:56,360 Speaker 1: tools some sort of method for detecting ill intent.
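For a sense of why the spelling telltale stops working, here is a toy sketch in Python of that kind of heuristic. The word list and the sample messages are invented for illustration; real mail filters are far more sophisticated, but the failure mode is the same:

    # Toy heuristic: score a message by the share of words not found
    # in a (tiny, invented) dictionary. Misspellings push the score up.
    KNOWN_WORDS = {"your", "account", "has", "been", "suspended", "please",
                   "click", "here", "to", "verify", "now"}

    def misspelling_ratio(message: str) -> float:
        words = [w.strip(".,!?").lower() for w in message.split()]
        unknown = sum(1 for w in words if w not in KNOWN_WORDS)
        return unknown / len(words) if words else 0.0

    crude = "Youre acount has ben suspnded click here noww"
    polished = "Your account has been suspended. Please click here to verify now."

    print(misspelling_ratio(crude))     # high: the classic giveaway
    print(misspelling_ratio(polished))  # 0.0: a fluent rewrite looks clean

The polished message scores zero, so any defense that leans on sloppy writing loses its signal once attackers have a tool that writes fluently.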
But 237 00:14:56,440 --> 00:14:59,320 Speaker 1: to me, building that kind of intent detection seems like a whole nasty bees' 238 00:14:59,360 --> 00:15:01,480 Speaker 1: nest of its own that we don't want to get into 239 00:15:01,600 --> 00:15:04,400 Speaker 1: right now. Okay, we're gonna take a quick break. When 240 00:15:04,440 --> 00:15:16,560 Speaker 1: we come back, we've got some more news stories. All right, 241 00:15:16,680 --> 00:15:23,200 Speaker 1: we're back. And Tari, cue the spy music. Okay, do you 242 00:15:23,240 --> 00:15:27,680 Speaker 1: remember the NSO Group? That's the Israeli company behind tools 243 00:15:27,680 --> 00:15:30,720 Speaker 1: like Pegasus. That was a type of malware that could 244 00:15:30,760 --> 00:15:34,360 Speaker 1: compromise iPhones just by sending an iMessage to a 245 00:15:34,440 --> 00:15:36,960 Speaker 1: targeted device. You just had to know their phone number 246 00:15:37,480 --> 00:15:41,160 Speaker 1: and send an iMessage using Pegasus, and it would, 247 00:15:41,160 --> 00:15:46,320 Speaker 1: without the user's interaction at all, compromise the iPhone due 248 00:15:46,320 --> 00:15:50,280 Speaker 1: to a vulnerability. Apple would subsequently patch that vulnerability, but 249 00:15:50,360 --> 00:15:53,000 Speaker 1: it's what gave them the foot in the door in 250 00:15:53,040 --> 00:15:58,440 Speaker 1: the first place, and this malware would target phones and 251 00:15:58,600 --> 00:16:03,680 Speaker 1: turn them into surveillance systems. Essentially, the attacker would 252 00:16:03,680 --> 00:16:06,440 Speaker 1: be able to do stuff like turn on the microphone 253 00:16:06,480 --> 00:16:10,120 Speaker 1: and even the camera on the iPhone to be able 254 00:16:10,160 --> 00:16:13,600 Speaker 1: to surveil what was going on in the surroundings. They 255 00:16:13,600 --> 00:16:17,320 Speaker 1: could access data that was on the device itself. Really 256 00:16:17,440 --> 00:16:22,120 Speaker 1: nasty stuff. Now, worse than that, NSO Group allegedly counted 257 00:16:22,200 --> 00:16:26,720 Speaker 1: among its customers some of the most notorious authoritarian leaders, 258 00:16:27,160 --> 00:16:30,640 Speaker 1: who used this tool to spy on all sorts of people. 259 00:16:31,080 --> 00:16:34,440 Speaker 1: And while the NSO Group was marketing this tool as 260 00:16:34,480 --> 00:16:38,120 Speaker 1: a means for governments to keep tabs on terrorist organizations, 261 00:16:38,760 --> 00:16:42,160 Speaker 1: the truth is that these authoritarian leaders were using a 262 00:16:42,360 --> 00:16:46,320 Speaker 1: very liberal definition of terrorists to include pretty much anybody 263 00:16:46,360 --> 00:16:51,240 Speaker 1: they didn't like. That included political rivals, that included activists, 264 00:16:51,280 --> 00:16:55,440 Speaker 1: and that included journalists. Anyway, it's this kind of thing that 265 00:16:55,680 --> 00:17:00,720 Speaker 1: democratic societies tend to disapprove of, at least publicly. And 266 00:17:00,880 --> 00:17:03,000 Speaker 1: you know, the ability to spy on people and abuse 267 00:17:03,080 --> 00:17:06,679 Speaker 1: power and whatnot doesn't fly so well in democratic societies. 268 00:17:07,040 --> 00:17:10,000 Speaker 1: And here in the United States, the Biden administration issued 269 00:17:10,040 --> 00:17:13,480 Speaker 1: a ban preventing American companies from doing business with the 270 00:17:13,600 --> 00:17:17,159 Speaker 1: NSO Group.
But now The New York Times reports that 271 00:17:17,359 --> 00:17:22,919 Speaker 1: someone disobeyed. Some federal agency, which one is not clear 272 00:17:23,080 --> 00:17:27,000 Speaker 1: as I record this episode, apparently worked outside the system 273 00:17:27,040 --> 00:17:31,040 Speaker 1: to get access to an NSO Group product called Landmark, 274 00:17:31,400 --> 00:17:34,320 Speaker 1: which allows you to keep tabs on a person's physical 275 00:17:34,359 --> 00:17:38,119 Speaker 1: location at all times. It's essentially a tracking bug that 276 00:17:38,200 --> 00:17:43,000 Speaker 1: you just install on someone's device. Well, you know, the 277 00:17:43,000 --> 00:17:45,800 Speaker 1: physical location of the infected device is what it really 278 00:17:45,960 --> 00:17:48,680 Speaker 1: tracks. But as for how you infect the target's device, I am 279 00:17:48,720 --> 00:17:51,000 Speaker 1: not sure what the mechanism is. I 280 00:17:51,040 --> 00:17:53,280 Speaker 1: don't know how it works, right? I haven't heard enough 281 00:17:53,280 --> 00:17:57,080 Speaker 1: about Landmark to understand. If it's as insidious as Pegasus was, 282 00:17:57,800 --> 00:18:00,320 Speaker 1: it won't even require the target to take any action 283 00:18:00,359 --> 00:18:03,600 Speaker 1: at all, which makes it really scary, and then you 284 00:18:03,720 --> 00:18:08,120 Speaker 1: just monitor that device's location. Apparently, this unknown federal agency 285 00:18:08,200 --> 00:18:11,280 Speaker 1: was using Landmark to keep tabs on targets in Mexico, 286 00:18:12,040 --> 00:18:15,240 Speaker 1: who I don't know, and for what purpose I don't know. 287 00:18:15,680 --> 00:18:18,960 Speaker 1: The agency went to some trouble to acquire this tool. 288 00:18:19,359 --> 00:18:24,399 Speaker 1: They used a dummy corporation called Cleopatra Holdings, which turned 289 00:18:24,400 --> 00:18:27,639 Speaker 1: out to be a fake company. It was supposedly headed 290 00:18:27,680 --> 00:18:30,680 Speaker 1: up by a guy named Bill Malone, a fake CEO. 291 00:18:31,680 --> 00:18:34,000 Speaker 1: But the real go-between for this agency and the 292 00:18:34,160 --> 00:18:37,280 Speaker 1: NSO was a company called Riva Networks, which is a 293 00:18:37,359 --> 00:18:42,359 Speaker 1: defense contractor located in New Jersey. Riva Networks in turn 294 00:18:42,560 --> 00:18:47,720 Speaker 1: did business with a company called Gideon Cyber Systems. This 295 00:18:47,800 --> 00:18:50,760 Speaker 1: company is a holding company, meaning it doesn't actually do 296 00:18:51,119 --> 00:18:55,520 Speaker 1: anything on its own, it's just there to hold assets, 297 00:18:55,560 --> 00:18:59,800 Speaker 1: and Gideon Cyber Systems is in turn owned by another 298 00:19:00,040 --> 00:19:04,480 Speaker 1: company called Novalpina Capital. Novalpina, it turns out, 299 00:19:05,240 --> 00:19:09,040 Speaker 1: is a majority owner of the NSO Group. So this 300 00:19:10,000 --> 00:19:15,280 Speaker 1: circuitous series of connections is what allowed this federal agency 301 00:19:15,359 --> 00:19:20,280 Speaker 1: to get hold of this forbidden product. The Biden administration 302 00:19:20,359 --> 00:19:23,639 Speaker 1: says that it was unaware of this activity. So if 303 00:19:23,680 --> 00:19:27,480 Speaker 1: that's true,
that means this federal agency was directly going 304 00:19:27,520 --> 00:19:30,960 Speaker 1: against the administration's wishes and was trying to hide their 305 00:19:31,000 --> 00:19:34,280 Speaker 1: tracks in the process, so it means they essentially have 306 00:19:34,520 --> 00:19:39,359 Speaker 1: gone rogue. Of course, it could be that this was 307 00:19:39,400 --> 00:19:44,520 Speaker 1: a secret but sanctioned acquisition, that the administration is saying 308 00:19:44,560 --> 00:19:47,160 Speaker 1: they were unaware of it but it's possible they were 309 00:19:47,320 --> 00:19:49,880 Speaker 1: aware of it, we just don't know, and that all 310 00:19:49,920 --> 00:19:53,359 Speaker 1: this skullduggery was in place just to obfuscate what 311 00:19:53,440 --> 00:19:58,240 Speaker 1: was going on and to avoid detection. Only they were detected. 312 00:19:58,600 --> 00:20:00,760 Speaker 1: I'm sure I will follow up on this story as 313 00:20:00,840 --> 00:20:05,560 Speaker 1: more information becomes available. Jumping over to Twitter and tick removal. 314 00:20:07,040 --> 00:20:10,040 Speaker 1: By that, I mean the check marks on verified Twitter accounts. 315 00:20:10,040 --> 00:20:14,320 Speaker 1: I'm not talking about blood-sucking arachnids, so to my 316 00:20:14,359 --> 00:20:18,600 Speaker 1: friend Shay, I apologize. I'm talking about checks, not ticks. 317 00:20:19,280 --> 00:20:25,400 Speaker 1: Twitter has reportedly started removing the blue ticks on verified 318 00:20:25,400 --> 00:20:29,320 Speaker 1: accounts that have yet to subscribe to Twitter Blue. So we've 319 00:20:29,320 --> 00:20:31,800 Speaker 1: been talking about this for a while, that one of 320 00:20:31,840 --> 00:20:36,640 Speaker 1: Elon Musk's directives was that that little check mark verification 321 00:20:37,160 --> 00:20:40,840 Speaker 1: notice was no longer going to be a sign of 322 00:20:40,880 --> 00:20:43,280 Speaker 1: a verified account. It was instead going to be a 323 00:20:43,320 --> 00:20:46,080 Speaker 1: sign of a subscribed account. So if you want that 324 00:20:46,160 --> 00:20:49,359 Speaker 1: check mark and you're an individual, then you have 325 00:20:49,400 --> 00:20:52,919 Speaker 1: to pay eight bucks a month for the Twitter Blue subscription, 326 00:20:52,920 --> 00:20:55,520 Speaker 1: and that includes the blue check mark next to your name, 327 00:20:56,000 --> 00:21:00,400 Speaker 1: which just means you're a subscriber. Organizations have to pay 328 00:21:00,400 --> 00:21:02,680 Speaker 1: a lot more. They have to cough up a grand 329 00:21:02,960 --> 00:21:05,680 Speaker 1: per month for the privilege of having that check mark there. 330 00:21:06,320 --> 00:21:09,600 Speaker 1: And a lot of folks, including myself, have already said 331 00:21:09,600 --> 00:21:11,840 Speaker 1: that we're not going to pay, and that the whole 332 00:21:12,119 --> 00:21:16,120 Speaker 1: darned thing misses the point of verification in the first place. 333 00:21:16,160 --> 00:21:20,520 Speaker 1: It's not verification at all anymore.
And further, this really 334 00:21:20,600 --> 00:21:23,320 Speaker 1: is nothing but an attempt by Elon Musk to generate 335 00:21:23,400 --> 00:21:28,080 Speaker 1: revenue after having alienated a good chunk of advertisers, and 336 00:21:28,160 --> 00:21:34,560 Speaker 1: Twitter had previously depended almost exclusively on advertising for 337 00:21:34,840 --> 00:21:38,439 Speaker 1: its revenue. And verification means nothing if you're just paying 338 00:21:38,440 --> 00:21:41,200 Speaker 1: for it. Like, it's not verification anymore. The whole point 339 00:21:41,200 --> 00:21:45,000 Speaker 1: of verification was so that users would know that the 340 00:21:45,080 --> 00:21:49,879 Speaker 1: account they were following legitimately belonged to whomever it claimed 341 00:21:49,880 --> 00:21:53,720 Speaker 1: to be. So if you're following the account of a 342 00:21:53,760 --> 00:21:56,640 Speaker 1: celebrity and there's a check mark there, you're like, okay, 343 00:21:56,680 --> 00:21:58,840 Speaker 1: this is really them. So a lot of people are saying, well, 344 00:21:58,880 --> 00:22:02,560 Speaker 1: now Twitter says it's not going to allow impersonation, 345 00:22:02,640 --> 00:22:05,640 Speaker 1: but how is it really going to enforce that? How 346 00:22:05,680 --> 00:22:10,040 Speaker 1: many people are going to go get that check mark 347 00:22:10,119 --> 00:22:13,280 Speaker 1: and then change their user name so that it reflects 348 00:22:14,280 --> 00:22:18,440 Speaker 1: a notable public figure? It's a huge mess. Anyway, one 349 00:22:18,560 --> 00:22:22,359 Speaker 1: of the voices that has criticized this move the loudest 350 00:22:23,560 --> 00:22:27,080 Speaker 1: belongs to The New York Times, the newspaper, The New 351 00:22:27,160 --> 00:22:29,520 Speaker 1: York Times. So now Twitter has stripped The New York 352 00:22:29,560 --> 00:22:33,200 Speaker 1: Times of its check mark. Now, to be clear, there 353 00:22:33,240 --> 00:22:36,639 Speaker 1: will be a big group of heavy-hitting Twitter accounts, 354 00:22:36,640 --> 00:22:39,040 Speaker 1: ones that have lots and lots of followers, like the 355 00:22:39,080 --> 00:22:41,640 Speaker 1: top accounts on Twitter, that are going to be able 356 00:22:41,680 --> 00:22:43,920 Speaker 1: to keep their check mark without having to pay that 357 00:22:44,000 --> 00:22:47,560 Speaker 1: monthly fee. For those, I guess the check mark will 358 00:22:47,560 --> 00:22:52,920 Speaker 1: still mean that it's a verified source. You see how 359 00:22:52,920 --> 00:22:57,560 Speaker 1: this all gets confusing and muddled. But I would have 360 00:22:57,600 --> 00:23:00,080 Speaker 1: imagined that The New York Times would have been on 361 00:23:00,119 --> 00:23:03,360 Speaker 1: that list. It is a prominent news source. I mean, 362 00:23:03,359 --> 00:23:05,560 Speaker 1: there are other news sources that are on that list, 363 00:23:06,119 --> 00:23:09,560 Speaker 1: like CNN and the Washington Post, but Twitter has pulled 364 00:23:09,600 --> 00:23:12,440 Speaker 1: the mark off of The New York Times. Musk has 365 00:23:12,480 --> 00:23:15,520 Speaker 1: also been in a rather public snit over The New 366 00:23:15,600 --> 00:23:18,440 Speaker 1: York Times in the past few weeks. So it's very 367 00:23:18,520 --> 00:23:23,199 Speaker 1: hard to avoid thinking that this is really a personal issue, 368 00:23:23,320 --> 00:23:26,600 Speaker 1: not a business one.
It feels like this is personal 369 00:23:27,000 --> 00:23:31,080 Speaker 1: and I am biased. I have a lot of feelings 370 00:23:31,119 --> 00:23:34,760 Speaker 1: about Elon Musk that are negative, and so, like, I 371 00:23:34,840 --> 00:23:37,840 Speaker 1: am inclined to think that this is a very petty 372 00:23:37,920 --> 00:23:41,000 Speaker 1: personal move on Musk's part, but I have to say 373 00:23:41,040 --> 00:23:44,200 Speaker 1: I don't know that for sure. It's how I feel, 374 00:23:44,440 --> 00:23:47,040 Speaker 1: but I have to admit that's just an opinion, and 375 00:23:47,080 --> 00:23:52,920 Speaker 1: I could be one hundred percent off the mark. It's 376 00:23:52,920 --> 00:23:58,480 Speaker 1: just hard for me to hold back on that thought. Anyway, 377 00:23:59,320 --> 00:24:02,480 Speaker 1: lots of other news outlets still have their marks, including 378 00:24:02,520 --> 00:24:04,800 Speaker 1: news outlets that have already said that they are also 379 00:24:04,920 --> 00:24:09,479 Speaker 1: not going to pay the monthly fee, so that raises questions. 380 00:24:09,480 --> 00:24:12,600 Speaker 1: I mean, why was The New York Times singled out 381 00:24:12,640 --> 00:24:16,320 Speaker 1: when other news outlets that also said they weren't going 382 00:24:16,440 --> 00:24:20,240 Speaker 1: to pay still have that check mark? One reason for 383 00:24:20,280 --> 00:24:23,440 Speaker 1: this could be that, from what I understand, the process 384 00:24:23,520 --> 00:24:26,760 Speaker 1: to remove those check marks is a manual one, so 385 00:24:26,800 --> 00:24:31,800 Speaker 1: it's going to very gradually happen across the checked accounts 386 00:24:31,840 --> 00:24:34,639 Speaker 1: that are on Twitter. So if that's the case, if 387 00:24:34,680 --> 00:24:36,960 Speaker 1: people manually have to go and review each of these 388 00:24:37,000 --> 00:24:40,480 Speaker 1: and then remove ones that are not from subscribers, then yeah, 389 00:24:40,560 --> 00:24:43,680 Speaker 1: that kind of explains things, but it does make 390 00:24:43,680 --> 00:24:46,800 Speaker 1: it seem like The New York Times was being held out 391 00:24:46,840 --> 00:24:49,679 Speaker 1: as an example here. I just know that as of 392 00:24:49,760 --> 00:24:52,600 Speaker 1: this morning, I still had my check mark. I got 393 00:24:52,640 --> 00:24:55,840 Speaker 1: the verified check mark years ago, but I know it's 394 00:24:55,840 --> 00:24:58,400 Speaker 1: on borrowed time because I am not paying for it, 395 00:24:58,520 --> 00:25:01,600 Speaker 1: and I imagine one day I'll just notice that it's gone, 396 00:25:01,880 --> 00:25:04,600 Speaker 1: and probably by that time it will have been gone 397 00:25:04,640 --> 00:25:08,600 Speaker 1: for ages, because while I check tech Stuff's Twitter account 398 00:25:08,640 --> 00:25:12,320 Speaker 1: pretty regularly, I very rarely go into my own these days. 399 00:25:12,760 --> 00:25:16,200 Speaker 1: So yeah, it'll happen, and then maybe like a week 400 00:25:16,280 --> 00:25:19,480 Speaker 1: or two later, I'll figure it out. Pour one out 401 00:25:19,520 --> 00:25:23,359 Speaker 1: for Virgin Orbit, which had the backing of a billionaire, 402 00:25:23,440 --> 00:25:27,280 Speaker 1: but it turns out a billionaire's support is still not enough 403 00:25:27,320 --> 00:25:31,080 Speaker 1: to send payloads into space. All right. So Virgin Orbit 404 00:25:31,280 --> 00:25:34,160 Speaker 1: was a company that was backed by Sir Richard Branson.
405 00:25:34,880 --> 00:25:38,440 Speaker 1: Virgin Orbit's business was to send payloads into orbit by 406 00:25:38,480 --> 00:25:42,760 Speaker 1: using rockets that would fly aboard specially outfitted commercial jets, 407 00:25:43,080 --> 00:25:46,560 Speaker 1: essentially like a seven forty seven. Once the jet reached 408 00:25:46,560 --> 00:25:49,320 Speaker 1: a high enough altitude, it would deploy the rocket, the 409 00:25:49,400 --> 00:25:52,919 Speaker 1: rocket would ignite its engines, and then this would, at 410 00:25:53,000 --> 00:25:57,240 Speaker 1: least theoretically, send the payload into orbit. And this approach 411 00:25:57,280 --> 00:26:00,320 Speaker 1: accomplished a few things. For one, there was no need 412 00:26:00,320 --> 00:26:02,399 Speaker 1: for a launch pad, right? You didn't have to have 413 00:26:02,440 --> 00:26:06,719 Speaker 1: a launch pad at Cape Kennedy. You just 414 00:26:06,800 --> 00:26:09,960 Speaker 1: needed an airport capable of handling the commercial jet that 415 00:26:10,119 --> 00:26:13,560 Speaker 1: was used as the mothership. That would free up Virgin 416 00:26:13,720 --> 00:26:17,240 Speaker 1: Orbit to launch from places that don't have their own 417 00:26:17,480 --> 00:26:20,800 Speaker 1: rocket launch facilities, like, say, the United Kingdom, and it 418 00:26:20,840 --> 00:26:23,280 Speaker 1: would reduce the need for rocket fuel. You wouldn't need 419 00:26:23,320 --> 00:26:26,679 Speaker 1: as much to get a payload into space. Of course, 420 00:26:26,800 --> 00:26:30,720 Speaker 1: you were using a lot of jet fuel as well. Unfortunately, 421 00:26:31,400 --> 00:26:34,960 Speaker 1: the business of Virgin Orbit was already struggling when, earlier 422 00:26:35,000 --> 00:26:37,560 Speaker 1: this year, a flight from the UK that was meant 423 00:26:37,600 --> 00:26:41,760 Speaker 1: to launch a payload into orbit failed. To be clear, 424 00:26:42,119 --> 00:26:46,600 Speaker 1: the launch of the rocket failed. The aircraft was fine, 425 00:26:47,359 --> 00:26:50,760 Speaker 1: but the payload was unable to reach orbit, and investors 426 00:26:50,760 --> 00:26:54,560 Speaker 1: who were already uncertain about this business began to abandon 427 00:26:54,640 --> 00:26:57,439 Speaker 1: ship. And I mentioned in an earlier news segment this 428 00:26:57,520 --> 00:27:02,280 Speaker 1: year that Virgin Orbit would possibly have to declare bankruptcy, 429 00:27:02,359 --> 00:27:05,280 Speaker 1: and that is in fact what has happened. The company 430 00:27:05,320 --> 00:27:10,320 Speaker 1: has officially declared or filed for bankruptcy, and they're reportedly looking 431 00:27:10,359 --> 00:27:14,240 Speaker 1: for a buyer, so there is a chance that 432 00:27:14,400 --> 00:27:17,560 Speaker 1: Virgin Orbit will still live on, possibly under some other 433 00:27:17,640 --> 00:27:21,840 Speaker 1: name and definitely under some other corporate governance if it happens, 434 00:27:22,040 --> 00:27:25,840 Speaker 1: or it may end up being essentially liquidated as much 435 00:27:25,840 --> 00:27:28,760 Speaker 1: as possible, and for the majority of the staff who 436 00:27:28,800 --> 00:27:31,399 Speaker 1: worked there, it's already too late, because last week the 437 00:27:31,400 --> 00:27:33,840 Speaker 1: company announced it was going to be laying off eighty 438 00:27:33,920 --> 00:27:39,080 Speaker 1: five percent of its workforce.
Youch. Now, layoffs aren't the 439 00:27:39,119 --> 00:27:42,639 Speaker 1: only way that companies are trying to reduce costs right now. 440 00:27:43,119 --> 00:27:46,439 Speaker 1: CNBC reports that Google is cutting down on amenities like 441 00:27:46,920 --> 00:27:51,200 Speaker 1: fitness classes for employees. It's also going to expect Googlers 442 00:27:51,200 --> 00:27:54,680 Speaker 1: to stick with their computers for longer, so employees will 443 00:27:54,720 --> 00:27:58,160 Speaker 1: not be able to get laptop replacements as easily or 444 00:27:58,200 --> 00:28:01,920 Speaker 1: frequently as they have in the past. Also, someone should 445 00:28:01,960 --> 00:28:05,919 Speaker 1: tell Milton that he's best off if he's guarding that 446 00:28:06,000 --> 00:28:09,320 Speaker 1: red Swingline stapler with his life, because Google also 447 00:28:09,359 --> 00:28:13,560 Speaker 1: won't be handing out staplers to employees either. Those things 448 00:28:13,560 --> 00:28:16,920 Speaker 1: are going to be like gold when Google society collapses 449 00:28:16,960 --> 00:28:20,960 Speaker 1: and everyone turns into scavengers. So yeah, guard that red 450 00:28:21,000 --> 00:28:25,600 Speaker 1: Swingline, Milton. Also, we probably know when Google society 451 00:28:25,600 --> 00:28:29,080 Speaker 1: will collapse. My guess is it's going to happen on a Monday, because, 452 00:28:29,119 --> 00:28:32,240 Speaker 1: as Google's memo to employees pointed out, the company quote 453 00:28:32,840 --> 00:28:37,040 Speaker 1: baked too many muffins on a Monday end quote. Now, 454 00:28:37,400 --> 00:28:39,960 Speaker 1: I gotta be fair, I'm giving that quote totally out 455 00:28:40,000 --> 00:28:42,680 Speaker 1: of context just for the purposes of poking fun. But 456 00:28:43,480 --> 00:28:45,920 Speaker 1: what the document was actually saying was that the company's 457 00:28:45,960 --> 00:28:50,400 Speaker 1: expenses are too large. They're spending too much money, particularly 458 00:28:50,480 --> 00:28:53,920 Speaker 1: in an office environment where employees are not coming in 459 00:28:54,080 --> 00:28:57,400 Speaker 1: five days a week anymore. They said, the policies that 460 00:28:57,440 --> 00:29:00,360 Speaker 1: we had in place were for when everyone is coming 461 00:29:00,360 --> 00:29:02,640 Speaker 1: into the office every single day of the week, or 462 00:29:02,680 --> 00:29:04,840 Speaker 1: at least every day of the work week, and now 463 00:29:04,840 --> 00:29:07,880 Speaker 1: you're not doing that. So since you're not doing that anymore, 464 00:29:08,360 --> 00:29:11,640 Speaker 1: then we have to cut back on these things. And 465 00:29:11,840 --> 00:29:13,720 Speaker 1: that's why you're starting to see a lot of the 466 00:29:14,160 --> 00:29:18,400 Speaker 1: employee programs and benefits go away. The question I have 467 00:29:19,160 --> 00:29:22,680 Speaker 1: is how many of those benefits are going to return 468 00:29:23,080 --> 00:29:26,640 Speaker 1: once the economy improves. I have a guess. That guess 469 00:29:26,720 --> 00:29:30,920 Speaker 1: is pretty darned close to the figure of zero. The 470 00:29:30,960 --> 00:29:33,680 Speaker 1: amenities Google had are the kind of things that you 471 00:29:33,760 --> 00:29:38,440 Speaker 1: typically associate with startups, but they are also things that 472 00:29:38,480 --> 00:29:41,800 Speaker 1: tend to get phased out when startups become massive corporations.
473 00:29:42,160 --> 00:29:44,680 Speaker 1: Now Google has been a massive corporation for many years 474 00:29:44,720 --> 00:29:46,840 Speaker 1: at this point, so the fact that it was hanging 475 00:29:46,920 --> 00:29:50,840 Speaker 1: on to these amenities for so long was partly a 476 00:29:50,880 --> 00:29:55,240 Speaker 1: sign of how competitive companies have to be in Silicon Valley 477 00:29:55,480 --> 00:29:59,200 Speaker 1: in order to attract talent. Right? If you wanted the best, 478 00:29:59,240 --> 00:30:02,000 Speaker 1: you needed to have lots of bells and whistles on 479 00:30:02,120 --> 00:30:05,360 Speaker 1: top of really good salaries in order to attract them 480 00:30:05,360 --> 00:30:07,840 Speaker 1: to your company and make sure that the competition didn't 481 00:30:07,840 --> 00:30:10,160 Speaker 1: get hold of them. But now we're in a world 482 00:30:10,200 --> 00:30:13,800 Speaker 1: where there's a sudden excess of talent out there in 483 00:30:13,840 --> 00:30:16,680 Speaker 1: the world, because all the companies out there are laying 484 00:30:16,680 --> 00:30:19,920 Speaker 1: off thousands of people. So it's way less critical to 485 00:30:19,960 --> 00:30:21,680 Speaker 1: make sure that the folks who are still at your 486 00:30:21,680 --> 00:30:25,480 Speaker 1: company are being catered to. Right now, they're terrified about 487 00:30:25,480 --> 00:30:27,720 Speaker 1: losing their jobs. It doesn't really matter if you get 488 00:30:27,800 --> 00:30:30,520 Speaker 1: rid of all the stuff that costs money but kept 489 00:30:30,520 --> 00:30:33,840 Speaker 1: employees happy, because where are they going to go? Everyone's 490 00:30:33,880 --> 00:30:38,280 Speaker 1: laying everybody off. It's pretty grim. Okay, we're going to 491 00:30:38,360 --> 00:30:40,360 Speaker 1: take another quick break. When we come back, I've got 492 00:30:40,400 --> 00:30:52,120 Speaker 1: a couple more news stories to talk about. All right. 493 00:30:52,400 --> 00:30:55,000 Speaker 1: We have an update on a story that's been going 494 00:30:55,040 --> 00:30:59,280 Speaker 1: on for many years now. So several years ago, from 495 00:30:59,600 --> 00:31:02,960 Speaker 1: twenty fifteen to twenty sixteen, a guy named Owen Diaz 496 00:31:03,560 --> 00:31:08,520 Speaker 1: worked as a contract employee for Tesla at its Fremont 497 00:31:08,640 --> 00:31:13,600 Speaker 1: factory outside of San Francisco, and he brought charges against 498 00:31:13,600 --> 00:31:18,680 Speaker 1: the company, saying that he was faced with racist attacks 499 00:31:19,560 --> 00:31:23,360 Speaker 1: multiple times as he worked there, including from his supervisor, 500 00:31:23,480 --> 00:31:29,040 Speaker 1: that there were people using racial slurs, and that it 501 00:31:29,120 --> 00:31:34,480 Speaker 1: was a very adversarial workplace. And he brought a lawsuit 502 00:31:34,520 --> 00:31:38,920 Speaker 1: against the company, and he won. He was found to 503 00:31:39,040 --> 00:31:42,360 Speaker 1: be in the right, and Tesla was found guilty of 504 00:31:42,400 --> 00:31:47,920 Speaker 1: allowing this environment to establish itself within the Fremont facility, 505 00:31:48,360 --> 00:31:53,360 Speaker 1: and initially the jury awarded him a staggering one hundred 506 00:31:53,600 --> 00:31:59,680 Speaker 1: thirty seven million dollars, mostly in punitive damages.
However, the 507 00:31:59,800 --> 00:32:02,880 Speaker 1: judge in that trial felt this award was, you know, 508 00:32:02,960 --> 00:32:07,840 Speaker 1: a tad much, and reduced it down to fifteen million dollars. 509 00:32:08,320 --> 00:32:11,920 Speaker 1: Just over one tenth, but you know, fifteen million dollars. 510 00:32:11,920 --> 00:32:15,200 Speaker 1: That's a lot of money, but it's obviously nothing compared 511 00:32:15,240 --> 00:32:19,360 Speaker 1: to one hundred thirty seven million dollars. Diaz decided to 512 00:32:19,640 --> 00:32:23,720 Speaker 1: challenge that fifteen million dollar amount, and the matter went 513 00:32:23,800 --> 00:32:27,080 Speaker 1: to a new trial for a jury to decide whether 514 00:32:27,160 --> 00:32:33,280 Speaker 1: or not those damages reflected what he had experienced. Now, 515 00:32:33,400 --> 00:32:37,160 Speaker 1: in this more recent trial, the matter of Tesla's guilt 516 00:32:37,520 --> 00:32:40,760 Speaker 1: was not at issue because that had already been decided 517 00:32:41,040 --> 00:32:44,640 Speaker 1: in the first trial. So we start from the position 518 00:32:44,640 --> 00:32:48,240 Speaker 1: that Tesla was in fact guilty of the things Diaz 519 00:32:48,280 --> 00:32:51,160 Speaker 1: accused the company of. So the only thing that was 520 00:32:51,200 --> 00:32:54,400 Speaker 1: really at stake here was the amount of money Tesla 521 00:32:54,480 --> 00:32:57,560 Speaker 1: was going to have to award mister Diaz. The trial 522 00:32:57,640 --> 00:33:00,280 Speaker 1: lasted five days, and then the jury came back with 523 00:33:00,360 --> 00:33:05,640 Speaker 1: a new figure of three million dollars, which is one 524 00:33:05,800 --> 00:33:08,920 Speaker 1: fifth of fifteen million. So I'm sure Diaz is now 525 00:33:08,960 --> 00:33:12,840 Speaker 1: really frustrated and disappointed in this result. I'm not going 526 00:33:12,920 --> 00:33:15,680 Speaker 1: to pass any judgment here. I'm not going to speculate 527 00:33:15,760 --> 00:33:18,640 Speaker 1: on what I would have done, because I'm fully aware 528 00:33:19,080 --> 00:33:23,320 Speaker 1: I don't face racist oppression. So it's impossible for me 529 00:33:23,400 --> 00:33:26,760 Speaker 1: to say whether I would have accepted that initial fifteen 530 00:33:26,800 --> 00:33:30,640 Speaker 1: million dollars that the judge had already knocked down, or 531 00:33:30,680 --> 00:33:33,800 Speaker 1: if I would have gone out to seek more in damages. 532 00:33:33,920 --> 00:33:35,920 Speaker 1: I don't know what I would have done. I don't 533 00:33:35,960 --> 00:33:40,479 Speaker 1: know what mister Diaz has experienced or gone through. Mostly 534 00:33:40,520 --> 00:33:43,840 Speaker 1: I came away from this story really, really, really sincerely 535 00:33:43,880 --> 00:33:47,280 Speaker 1: hoping that Tesla has taken measures to make certain that 536 00:33:47,360 --> 00:33:51,360 Speaker 1: employees never face that kind of environment again, and if 537 00:33:51,400 --> 00:33:54,400 Speaker 1: any instances of racist activity arise in the future, that 538 00:33:54,440 --> 00:33:57,880 Speaker 1: the company takes swift action to address it. To me, 539 00:33:58,040 --> 00:34:02,560 Speaker 1: that's really important, more so than how much money 540 00:34:02,600 --> 00:34:05,160 Speaker 1: Tesla is going to have to pay. I do feel for 541 00:34:05,280 --> 00:34:09,719 Speaker 1: mister Diaz.
This is not the 542 00:34:09,840 --> 00:34:14,280 Speaker 1: outcome that he wanted. I feel pretty comfortable saying that. Finally, 543 00:34:15,080 --> 00:34:21,880 Speaker 1: Reid Wiseman, Christina Hammock Koch, Victor Glover, Jeremy Hansen. These 544 00:34:22,000 --> 00:34:26,839 Speaker 1: four people will crew NASA's Artemis two mission. They will 545 00:34:26,880 --> 00:34:30,959 Speaker 1: fly aboard an Orion spacecraft, and they will journey from 546 00:34:31,000 --> 00:34:35,520 Speaker 1: Earth out to the Moon. They'll pass behind the Moon, 547 00:34:36,200 --> 00:34:39,360 Speaker 1: and then they'll return home. So this mission is the 548 00:34:39,440 --> 00:34:44,360 Speaker 1: predecessor to NASA's Artemis three, which will actually see astronauts 549 00:34:44,400 --> 00:34:47,000 Speaker 1: set foot on the Moon's surface for the first time 550 00:34:47,320 --> 00:34:52,040 Speaker 1: since nineteen seventy two. So while these particular astronauts will 551 00:34:52,080 --> 00:34:56,400 Speaker 1: not touch down on lunar firma, they will journey further 552 00:34:56,480 --> 00:34:59,120 Speaker 1: away from the Earth than any other human has in 553 00:34:59,239 --> 00:35:04,520 Speaker 1: several decades. Of the four astronauts, three have previously been 554 00:35:04,680 --> 00:35:09,560 Speaker 1: to space. Only Jeremy Hansen will be taking his first 555 00:35:09,680 --> 00:35:14,239 Speaker 1: space flight with this mission. So Artemis one was an 556 00:35:14,360 --> 00:35:18,799 Speaker 1: uncrewed mission, meaning there were no people aboard the Orion spacecraft. 557 00:35:18,840 --> 00:35:22,880 Speaker 1: It was a test flight for NASA's Space Launch System, 558 00:35:23,040 --> 00:35:27,680 Speaker 1: or SLS, also known as the Big Honkin' Rocket, and 559 00:35:27,960 --> 00:35:31,799 Speaker 1: the Orion spacecraft. That was originally supposed to happen back 560 00:35:31,840 --> 00:35:37,280 Speaker 1: in twenty sixteen. It finally happened on November sixteenth, 561 00:35:37,400 --> 00:35:41,000 Speaker 1: twenty twenty two, so about six years later than the 562 00:35:41,040 --> 00:35:45,319 Speaker 1: original plan for Artemis. That mission was a success. The 563 00:35:45,440 --> 00:35:49,080 Speaker 1: Orion spacecraft, with no humans aboard, flew out to the Moon, entered a distant orbit around it, 564 00:35:49,160 --> 00:35:53,719 Speaker 1: and stayed there for several days. Orion returned to Earth 565 00:35:53,760 --> 00:35:56,520 Speaker 1: on December eleventh, twenty twenty two. It landed in the 566 00:35:56,560 --> 00:36:00,920 Speaker 1: Pacific Ocean and the recovery crew was able to retrieve the spacecraft. 567 00:36:01,320 --> 00:36:04,880 Speaker 1: So Artemis one is in the books. When can we 568 00:36:04,920 --> 00:36:08,399 Speaker 1: expect Artemis two to launch? Well, right now, the plan 569 00:36:08,560 --> 00:36:12,640 Speaker 1: is to aim for a November twenty twenty four launch, though, 570 00:36:12,680 --> 00:36:15,480 Speaker 1: as we've already seen with lots of different space missions, 571 00:36:15,480 --> 00:36:19,000 Speaker 1: there's no guarantee that we'll be able to make that date. 572 00:36:19,280 --> 00:36:22,120 Speaker 1: Hopefully we will. That's the earliest NASA plans to be 573 00:36:22,200 --> 00:36:27,480 Speaker 1: able to launch, so it may be significantly later than that, 574 00:36:27,640 --> 00:36:29,680 Speaker 1: as we have seen. It's difficult, you know.
Keep 575 00:36:29,680 --> 00:36:34,879 Speaker 1: in mind also that we're talking about, you know, elections 576 00:36:34,880 --> 00:36:38,239 Speaker 1: and stuff, so that can complicate things, because as much 577 00:36:38,280 --> 00:36:41,160 Speaker 1: as we don't like to bring politics into the space program, 578 00:36:41,440 --> 00:36:45,680 Speaker 1: politics definitely affects the space program. Still, in the not 579 00:36:45,760 --> 00:36:48,520 Speaker 1: too distant future, we may once again have people staring 580 00:36:48,600 --> 00:36:51,359 Speaker 1: at the far side of the Moon in person, which 581 00:36:51,360 --> 00:36:55,400 Speaker 1: I have to admit is pretty darn cool. Okay, that 582 00:36:55,480 --> 00:36:59,080 Speaker 1: wraps up this news episode of tech Stuff. Hope you 583 00:36:59,120 --> 00:37:01,680 Speaker 1: are all well. If you'd like to reach out to me, 584 00:37:01,719 --> 00:37:04,160 Speaker 1: you can do so on Twitter. The handle for the 585 00:37:04,200 --> 00:37:07,400 Speaker 1: show is tech Stuff HSW. There's no tick mark for 586 00:37:07,440 --> 00:37:10,720 Speaker 1: that one, but trust me, it's for the show. And 587 00:37:11,200 --> 00:37:13,959 Speaker 1: if you would prefer, you can download the iHeartRadio app. 588 00:37:14,120 --> 00:37:16,200 Speaker 1: It's free to download, free to use. If you navigate 589 00:37:16,280 --> 00:37:18,560 Speaker 1: over to tech Stuff by typing tech Stuff into the 590 00:37:18,600 --> 00:37:21,920 Speaker 1: little search field, it'll pull up the result. You can 591 00:37:22,040 --> 00:37:24,960 Speaker 1: pop into the podcast and you will see a little 592 00:37:25,080 --> 00:37:27,279 Speaker 1: microphone icon. If you click on that, you can leave 593 00:37:27,280 --> 00:37:29,879 Speaker 1: a voice message up to thirty seconds in length. Let 594 00:37:29,880 --> 00:37:31,720 Speaker 1: me know what you would like to hear in the future, 595 00:37:32,080 --> 00:37:41,640 Speaker 1: and I'll talk to you again really soon. Tech Stuff 596 00:37:41,719 --> 00:37:46,239 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 597 00:37:46,280 --> 00:37:49,839 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 598 00:37:49,880 --> 00:37:50,800 Speaker 1: your favorite shows.