Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, October nineteenth, twenty twenty three, and we're going to start off with some tech and national security concerns here in the United States.

Speaker 1: So this week the Five Eyes Alliance met at Stanford University to discuss matters of tech and security. And you know, the Five Eyes Alliance sounds like something out of, you know, a Sherlock Holmes novel or maybe Game of Thrones or something, but in fact it's a group that consists of representatives from the United States, the United Kingdom, Canada, New Zealand, and Australia. They discussed current and future threats to safety and innovation, and the general consensus was that China is the largest threat in that regard. The group cited instances of Chinese hackers infiltrating computer systems, as well as actual spies conducting industrial espionage and various attempts to compromise business insiders, all in an effort to steal intellectual property and tech secrets. Much was made of the fact that China has an enormous program centered on state-backed hacking projects, a program that's larger than those of all other nations combined, in fact, and that China has directed operatives to focus on emerging technologies, including stuff like quantum computing and artificial intelligence. The representatives also warned that the emergence of generative AI could help hackers create more efficient attacks, making their efforts even more effective. The meeting concluded with representatives warning the audience that this is something all governments and businesses will need to be aware of, or else they will run the risk of being caught unawares. Sounds like a real fun group.

Speaker 1: On a related note, the US government has updated its export rules and added restrictions on the types of processors that companies are allowed to ship to China.
Speaker 1: As it stands, big companies like AMD, Nvidia, and Intel are not allowed to ship their high-end processors to China unless they first secure an export license from the US Department of Commerce. That actually includes Nvidia's GeForce RTX forty ninety graphics cards, and I think most people would associate that with high-end computer gaming. But as we've seen, powerful GPUs can be put to work doing all sorts of processing besides, you know, making puddles of water all reflective in sessions of Call of Duty or whatever. Further, the rules mean that companies will not be able to rely on Chinese manufacturing to make these products. And since a lot of chip manufacturing, or at the very least chip assembly, takes place in China, that's going to push these companies to pivot and find alternatives. That in turn is likely to impact the cost of production, and I think we all know what that's probably going to mean for us down the road: we're going to be paying higher prices for these chips ourselves in the products that we purchase. China is not the only country that's on the US's list of no-nos when it comes to exports. There's also the United Arab Emirates, Vietnam, and Saudi Arabia. All will require that companies first secure an export license from the US Department of Commerce before they can legally send those high-performing chips to those destinations.

Speaker 1: A couple of weeks ago, a hacker leaked stolen information from the genetic testing company twenty three and Me. According to the company, the suspicion is the hacker used a technique called credential stuffing. That's when a hacker pulls data from other breaches and then uses that data, matching usernames and emails with stolen passwords from other services and trying combinations of them, to see if someone has used the same password at multiple locations. Which is why you should not do that. Do not use the same password, or even just a small group of passwords, for all of your services.
Speaker 1: You should have a unique password for every single service. It's bad enough if one of your accounts gets hacked; that's terrible. It is a horrible hassle and it can really have a negative impact on you and your livelihood. But it can turn into a total disaster if it turns out that the people who did that attack now have the keys to all your stuff because you used the same login and password for everything. Well, now that some hacker has done this again, and the new leak includes millions of data points about twenty three and Me's customers, it's big news. The information covers around four million people, and the hacker posted the information on a forum known as a hive of scum and villainy, a hacker forum with the imposing name BreachForums. Trust me, it's scarier than it sounds. Anyway, the hacker claims that the records include information on people from very wealthy backgrounds. Twenty three and Me has informed users that they should both update their passwords and also enable multi-factor authentication. According to TechCrunch, twenty three and Me quote blamed the incident on its customers for reusing passwords and an opt-in feature called DNA Relatives, which allows users to see the data of other opted-in users whose genetic data matches theirs, end quote. So, in other words, twenty three and Me is saying you should stop reusing your passwords so much, you're the reason why this attack worked. But they're also like, oh, and also, we have this option in our service that, if you turn it on, means that hackers could potentially use your account to scrape other accounts. So other accounts that might be very well protected as far as passwords go could still be vulnerable because of this opt-in feature. So really, everyone's at fault, I guess. A lot of victim blaming going around over at twenty three and Me. But anyway, that's the update on that story.
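To make the credential-stuffing technique described above concrete, here is a minimal Python sketch. It is not from the episode, and everything in it is invented: the leaked credential dump, the target services, and the try_login helper, which only simulates a login attempt.

```python
# Toy illustration of credential stuffing: replaying credentials leaked
# from one breach against other, unrelated services. All data and the
# try_login() helper are made up for illustration.

leaked_credentials = [  # pairs scraped from some earlier, unrelated breach
    ("alice@example.com", "hunter2"),
    ("bob@example.com", "correct-horse-battery-staple"),
]

target_services = ["photo-site.example", "dna-site.example", "bank.example"]

def try_login(service: str, email: str, password: str) -> bool:
    """Stand-in for an automated login attempt against `service`.
    A real attack tool would POST to the service's login endpoint;
    this stub simply reports failure."""
    return False

# The whole "attack" is nothing more than trying old credentials everywhere
# and noting where a reused password still works.
for email, password in leaked_credentials:
    for service in target_services:
        if try_login(service, email, password):
            print(f"{email} appears to reuse this password on {service}")
```

The only defense a user fully controls is making that inner loop worthless: with a unique password per service, a credential leaked from one site opens nothing else.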
Speaker 1: The Washington Post published an article with advice on what people can do if they are worried about the twenty three and Me breach. The article also expanded a little bit on the attack itself. They said that the hacker was specifically calling out the fact that the breach included many Jewish people in it, and considering world events right now, that is particularly sinister; like, that has an edge to it in my mind. The suggestions in the article include pretty much the same steps that twenty three and Me suggested: you know, select a unique, impossible-to-guess password for the service, and for every other service. It should not be a password that you've used anywhere else. And the article goes on to suggest that users who are really concerned can ask twenty three and Me to delete all information about them from its servers. It also points out that only a few states in the United States have laws that compel a company to follow through on those requests, and those states are California, Colorado, and Virginia, which leaves forty-seven states that, you know, don't have a law that does that. But requesting the deletion within the service will prompt them to send you a confirmation email, and then you have to, you know, click in the confirmation email to confirm that yes, you want all of your data deleted. And even then, twenty three and Me will keep some information due to quote unquote legal and lab requirements. However, when asked to elaborate on that, twenty three and Me did not expound upon the nature of that data. So I don't know exactly what they do retain, but they do retain something; it's not true that they delete all of the user data. The article also suggests that perhaps you should resist sharing genetic information in general, which, you know, is not very helpful because the cat is already out of the bag.
Speaker 1: Like, I get it, saying you shouldn't have done that. But you know what, saying you shouldn't have done that doesn't help anybody, because the person who did it already knows they shouldn't have done it. It's just a whole superiority thing. It's one of my pet peeves; that's why I'm getting het up about it. Like, the "I told you so" nature of it doesn't help anything. It just makes you look like a jerk. I know, because I used to say it all the time, and I was a jerk, and some days I still am. Anyway, I also think sharing genetic information isn't a very good idea. But we also live in a world where companies are constantly conditioning us to share more information about ourselves, right? Like, that's what all of social media is predicated upon: us sharing information about ourselves. And we kind of get a little reward for that by way of the people who, you know, engage with the stuff we post, and in the meantime, all we're really doing is feeding the machine, which is a big ol' yuck. So yeah, I also don't think that you should share your genetic data, especially on a platform that is, you know, publicly viewable. But I also think we all need to kind of take a step back on how much we're sharing online anyway, because what we're really doing is just, you know, lining the pockets of these companies with lots more money, because they can advertise to us more effectively. But all right, that's enough soapboxing. Let's keep on going.

Speaker 1: Our next story might have you questioning what the I in CIA actually stands for. Now, for the record, the CIA is the Central Intelligence Agency here in the United States. The main job of the CIA is to protect US national security, primarily through gathering and analyzing foreign intelligence.
Speaker 1: But anyway, earlier this month the CIA included a link in its bio on X, you know, the service formerly known as Twitter, and the link was meant to take people to a CIA-run channel on Telegram. It was meant so that people who have intelligence that is important to national security could have a secure and private way to contact someone at the CIA and share the information. So what you're supposed to do is use Tor, the Tor Browser, and clandestinely contact the CIA over the dark web. But there was a problem, because the restrictions on bio links on X meant that when this was posted, the address for that Telegram channel got cut off a bit. So, in other words, it was no longer the actual Telegram channel for the CIA. If you were to try and go to that Telegram channel, you would find that it went to an unclaimed channel, which meant someone could claim it and then pose as the CIA. A man named Kevin McSheehan noticed this problem and took the steps to register the channel before a malicious actor could do it. He even posted a message that revealed, if you did go to that Telegram channel, that it was not an official CIA channel, and said do not share sensitive information with anyone, and he listed it both in English and in Cyrillic. The CIA has since addressed the problem by ensuring that the full address for the real Telegram channel appears in the X bio. As I read about this in Motherboard, the title of the article suggests that the problem was due to a flaw in X, but I think I would be more inclined to wag my finger at the CIA for not double-checking that the address that appeared in the bio was in fact the correct one. So yeah, it may be that the limitation on bio length in X caused it to happen, but that seems like something you should triple-check is correct before you just walk away.
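As a rough sketch of the truncation problem just described, here is a small Python example. The channel name and the thirty-character display limit are both made up for illustration (the episode doesn't state X's actual cutoff); the point is only that lopping characters off the end of a t.me link silently produces a different, possibly unclaimed handle.

```python
# Hypothetical illustration of a bio link being truncated by a display limit.
# The handle and the 30-character limit are both invented for this sketch.

MAX_DISPLAYED = 30  # assumed cutoff; not the real limit on X

intended = "https://t.me/examplesecurecontactchannel"  # made-up "official" channel
displayed = intended[:MAX_DISPLAYED]                   # what a truncated bio would show

if displayed != intended:
    print("Bio link is truncated!")
    print("Intended :", intended)
    print("Displayed:", displayed)
    # The truncated handle may be unclaimed, so anyone could register it
    # and impersonate the account the link was supposed to point to.
```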
Speaker 1: Okay, we've got other news stories to cover today. Before we get to that, let's take a quick break.

Speaker 1: Okay, so the media outlet New Scientist posted an article last week that I did not see until today, but I figure it's an important one to bring up because we're talking about security and hackers a lot in today's episode. A cryptography expert named Daniel Bernstein, who works at the University of Illinois Chicago, came forward to say that the US National Security Agency, the NSA, has been pushing to influence the adoption of standards related to post-quantum cryptography, and the National Institute of Standards and Technology, or NIST, denies this accusation. So essentially, what Bernstein is saying is that the NSA wants to shape these standards in a way that would weaken the cryptography, because otherwise post-quantum cryptography could potentially be so strong that the NSA would not be able to crack it. And the NSA is all about reading stuff that you didn't intend the NSA to read. You might remember the NSA has a pretty nasty track record of finding ways to intercept communications; this was a huge scandal like a decade ago, and that capability also led to abuse. I'm reminded of stories about how some people on the NSA's staff would use these tools, which were meant to do things like detect potential terrorist cell communications, things like that, but they were using the tools to do stuff like snoop on people they knew, like an ex romantic partner. They were using these government-level surveillance tools for that kind of stuff, and you know, that obviously wasn't the intended purpose for the tools. But it turns out a tool is only trustworthy if the person who's using it is also trustworthy, and people can be total jerkfaces. Bernstein maintains that NIST is not being transparent when it comes to establishing these standards, and that the consequences of this could be unreliable security measures moving forward.
Speaker 1: That is, you know, if the NSA essentially has the equivalent of a back door to get into these otherwise encrypted communication channels, that means other people could potentially find their way through those back doors too. Like I've always said, putting a back door into a security system means you no longer have a secure system, just full stop. The whole point of creating a secure system is to not have ways for people to get in that aren't through specific checkpoints. So I still feel that way about this.

Speaker 1: Earlier this week, US Senator Michael Bennet sent a letter to Google, TikTok, X, and Meta regarding how these platforms are handling the influx of misinformation in the wake of the war between Israel and Hamas. As I'm sure you know, there's been a surge in that kind of content across all of social media, and Bennet has accused the respective algorithms of these platforms of amplifying that signal and quote contributing to a dangerous cycle of outrage, engagement, and redistribution end quote. Bennet is seeking information on the policies and processes each platform follows to combat misinformation, though whatever those might be, they are clearly not sufficient, at least according to Bennet. In his letter, he writes, quote, the mountain of false content clearly demonstrates that your current policies and protocols are inadequate, end quote. Bennet also criticized how Google, Meta, and X have all made cuts, sometimes drastic ones, to the various teams dedicated to detecting and removing disinformation and hate speech. While I feel the senator is making some legitimate criticisms, I'm not sure what follow-up there's going to be from all of this.

Speaker 1: If you are in New Zealand or the Philippines and you're not already on X, formerly known as Twitter, pretty soon you're gonna have to pay admission if you do want to join, and that's because X is going to require a one dollar annual subscription fee for new users starting in those countries.
Speaker 1: Now, according to X, this is to discourage bot farms. Only by paying the fee will the account be able to post and interact with other posts. If you don't pay the fee, you can follow folks on Twitter and you can read posts on Twitter, but you can't interact with them, and you can't post anything yourself. So you might be asking, all right, well, they're starting in New Zealand and the Philippines, does that mean that's where all the bot farms are? No, it's just that these are serving as a testing ground for this new policy, and then the thought is, assuming it works, that policy will roll out to other countries. Now, X says this is not intended to be a revenue generator, and I'm inclined to believe them. I mean, at one dollar per user per year, I think it's pretty fair to say that's not a revenue generator. Instead, what this is meant to be is a deterrent for people who want to run armies of spambots, because if they want all those spambots to be able to post, and that's the only way that spambots are useful, then they have to pay a dollar for every single one of those spambots. And if you're trying to coordinate a campaign that has thousands of spambots, that's thousands of dollars out of your pocket to do it. So while I'm not crazy about this idea, I do think the strategy could potentially work for X, and it doesn't put a hefty burden on most new users. Most folks can cough up a dollar for a year's subscription to being able to use Twitter, and if it means that it cuts back on the spam armies, then that's a good thing. So I often am very critical of X, but in this case, I think it might actually make sense.
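A quick back-of-the-envelope calculation, using a purely hypothetical campaign size, shows why a fee that's trivial for an individual scales into real money for a bot farm:

```python
# Rough math on the $1-per-account annual fee as a spam deterrent.
fee_per_account = 1         # dollars per year, per the announced test
bots_in_campaign = 10_000   # hypothetical size of a coordinated spam campaign

total = fee_per_account * bots_in_campaign
print(f"Cost to keep {bots_in_campaign:,} spambots posting: ${total:,} per year")
# A legitimate new user pays $1; a ten-thousand-bot farm pays $10,000.
```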
Speaker 1: Switching gears, as it were, to a different Elon Musk company: Tesla held an earnings call this week, and Musk gave an update on the long-delayed Cybertruck. Now, we first heard about this concept way back in twenty nineteen, and since then a lot of people have put in a preorder for one; like a million customers have reserved a Cybertruck. And it sounds like folks are going to have to keep on waiting, at least a lot of them are, because Musk says that Tesla will start to deliver Cybertrucks to customers beginning on November thirtieth. So that's good, but it will take quite a bit longer before Tesla can ramp up manufacturing to meet demand. In fact, Musk said the goal is that the company can get up to manufacturing two hundred and fifty thousand Cybertrucks per year, and that's not even likely to be reached until after twenty twenty four. And because you've got a million people who have put a reservation on the Cybertruck, that means it's going to take several years before Tesla can even meet the demand of preorders. And that, of course, assumes everybody wants to keep their preorder on their truck. But yeah, just because you have a preorder, it may mean you're still like three years out from being able to get your vehicle, or more. Musk said that quote we dug our own grave with the Cybertruck end quote. I think that was apparent when, during a demonstration of how strong the Cybertruck's windows are, they shattered the windows. Anyway, during the call, Musk also gave a somewhat pessimistic view on where the global economy is going in general, as well as ways to bring down the price of Tesla vehicles. They said we've got to get the costs to come down, and that makes sense, because there are also a lot more auto companies that are making vehicles that now compete directly with the ones that Tesla manufactures.
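For what it's worth, the arithmetic behind that several-year backlog is simple enough to sketch, taking the figures mentioned above at face value:

```python
# Rough math on the Cybertruck preorder backlog described above.
reservations = 1_000_000   # roughly a million preorders
target_rate = 250_000      # Cybertrucks per year, the stated manufacturing goal

years_to_clear = reservations / target_rate
print(f"Years to clear the backlog at full rate: {years_to_clear:.0f}")
# About four years, and only once production actually reaches that rate,
# which itself isn't expected until after twenty twenty four.
```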
Speaker 1: And it's not just Tesla that's having to face a tough economy. Both Apple and Microsoft are seeing declines. There's a decline in MacBook shipments that one analyst has pointed to, and Windows eleven adoption is trailing well behind what Windows ten did. So we're seeing these issues throughout the tech space.

Speaker 1: Okay, well, this brings us to the end of the episode. I do have a couple of article recommendations for you. First up is a piece from apnews.com titled Fugees rapper says lawyer's use of AI helped tank his case, pushes for new trial. I just think that's interesting. One, it's the Fugees. Two, it's not the first time we've heard how AI being used in a trial setting has not gone according to plan. The other piece I recommend is by Eamon Javers and Paige Tortorelli of CNBC. That one's titled The Secret Life of Jimmy Zhong, Who Stole and Lost More Than Three Billion Dollars. That story involves cryptocurrency, stealing money from the Silk Road, living like a rock star, being the victim of a break-in; like, it's got everything. There's also a documentary that CNBC has released called How to Steal and Lose More Than Three Billion in Bitcoin. Both of those are well worth checking out. And that's it for this week. I will talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.