Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host Jonathan Strickland, an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, July twenty-first, two thousand twenty-two. Let's get to it with some stories about Meta.

A class action lawsuit filed against Meta means that CEO Mark Zuckerberg and outgoing COO Sheryl Sandberg will each have to testify in federal court regarding their alleged roles in the Cambridge Analytica scandal. It seems like a lifetime since that scandal first became public. In case you do not know what it was all about: a British political consultancy firm that worked with conservative political campaigns here in the States leveraged data that had been scraped from Facebook. There was this app developer slash professor who had created a political survey app, and it was a paid survey, so you would get paid if you took it. But what the people who took the survey didn't know is that by granting the app permission, they gave the app an elevated ability to see not just their own personal information, but that of their friends on Facebook. The app essentially made it possible for the developer to look at friend profiles as if the developer were, in fact, that person's friend, and so the app was able to collect massive amounts of information from people who never consented to share that info in the first place. If you were a friend of someone who took this survey, you never told the survey it was okay to scrape your data.
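Here's a tiny toy model of that consent gap. This is not the real Facebook Graph API; the scope names and the social graph are invented purely to illustrate how one user's consent can expose data belonging to friends who never opted in:

```python
# Toy model of permission-based over-collection. NOT the real Facebook
# Graph API: the scopes and the social graph here are hypothetical.
SOCIAL_GRAPH = {
    "alice": {"likes": ["hiking"], "friends": ["bob", "carol"]},
    "bob":   {"likes": ["chess"],  "friends": ["alice"]},
    "carol": {"likes": ["jazz"],   "friends": ["alice"]},
}

def app_fetch(user: str, granted_scopes: set) -> dict:
    """Everything the app can see once `user` alone grants the scopes."""
    collected = {user: SOCIAL_GRAPH[user]["likes"]}
    # The problem in miniature: a friends-level scope exposes people
    # who never interacted with the app at all.
    if "friends_likes" in granted_scopes:
        for friend in SOCIAL_GRAPH[user]["friends"]:
            collected[friend] = SOCIAL_GRAPH[friend]["likes"]
    return collected

# Only Alice consents, but Bob's and Carol's data comes along for the ride.
print(app_fetch("alice", {"user_likes", "friends_likes"}))
```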
Anyway, Cambridge Analytica mostly proved to be an ineffective entity as far as the political campaign consultancy gig is concerned, but it still is true that the company relied heavily on information that had been collected in violation of consumer privacy protections. Meta already went through an investigation conducted by the Federal Trade Commission about this, and the company paid out a five billion dollar fine. That's billion with a B, not chump change. But this case alleges that several key executives of the company were responsible for how the Cambridge Analytica story unfolded. They say that the executives knew more about what was going on earlier than they indicated, and that they may have even taken some steps within the company to try and conceal what was happening because it did not reflect well on Facebook. The court has demanded that Meta hand over more than a thousand documents that the company had previously withheld, citing them as privileged information. Looks like some of that privilege is gonna get stripped away.

Sticking with Meta for a couple more stories: nonprofit organizations that focus on civil and digital rights are criticizing Meta for its Meta Human Rights Report. That's an eighty-three-page document that the company published last week. The report is supposed to document Meta's impact on human rights and how it tries to protect human rights, but these organizations say that the report fails to do this. Instead, it serves more as a whitewashing document meant to absolve Meta of its role in facilitating the spread of hate speech, extremism, misinformation, and that kind of thing. Further, the organizations say that Meta was citing its own press releases in the document as evidence of how the company is working to support human rights, which kind of boils down to a "because I said so" sort of argument. The organizations also say that the document mentions several safeguards that are meant to protect human rights, but fails to disclose that Meta has subsequently removed many of those safeguards.
So there's this growing criticism that Meta's reports in general, not just this one but others as well, all of which claim to increase transparency, fail to include anything of real substance, and that the company is really just going through the motions, putting on a kind of responsibility theater, while failing to make the significant operational changes that would produce results. Pretty nasty criticism directed at the company.

And to round out our Meta news, a company called Meta.is, which is an art installation company, is suing Meta for trademark violation. Now, I'm pretty sure I actually mentioned Meta.is way back shortly after Facebook changed its name to Meta last year. According to Meta.is, they tried to work with the Facebook version of Meta, but they found no real success there, and the art installation company now says it has trouble securing clients because people assume that Meta.is is associated with Meta-slash-Facebook. And why would that be a problem? Well, according to the art company, Facebook's reputation is so bad right now, with so much negative press and perception, that this spills over to affect the art company, even though there's no connection other than having the same name. And since Meta.is predated Facebook's Meta, they are arguing that they should have the valid claim on that trademark. Whether or not that argument will stand up in court remains to be seen, and it's always possible, or even probable, that Meta-slash-Facebook will settle out of court if its legal team feels the case isn't strong enough to win.

All right, now let's talk about Netflix and the world of business, which I clearly do not understand. So earlier this year, Netflix held an earnings call and revealed that for the first time, it lost more subscribers than it had gained over a quarter, and subsequently, Netflix's stock price took a nosedive.
Well, the company recently had its earnings call for the second quarter of the year and revealed it lost nearly a million subscribers. In the first quarter it lost two hundred thousand; this time it lost around nine hundred seventy thousand subscribers, so way more than it lost in the first quarter. But that's not as bad as what Netflix thought it was going to be. The initial forecast was a two million drop in subscribers, so the actual loss came in at less than half of what they had forecast. Still worse than the first quarter, but not nearly as bad as they expected for the second quarter. And revenue was actually up by eight point six percent, so yeah, the company has fewer subscribers, but it's earning more money, largely because of, according to the company, the dollar's standing in the world economy. And consequently, Netflix's stock price increased by seven point four percent on Wednesday, which is wild, right? Netflix's stock price dropped when the company lost two hundred thousand subscribers, but went up when it quote unquote "only" lost nine hundred seventy thousand subscribers. I'm just not able to see the matrix when it comes to the stock market, and you know, it's such a relief that I'm not a day trader, because I would have completely lost my mind at this point.

Microsoft and Google are both pumping the brakes on hiring new employees due to economic pressures. Bloomberg reports that Microsoft has shut down open job listings in various departments like Azure and security. Company reps say there's essentially a hiring freeze for most departments, but Microsoft will continue to honor job offers that have already been extended to prospective employees, and it may make the occasional exception for positions that are considered critical to operations.
Business Insider reports that Google has entered into a two-week hiring freeze while the company assesses its actual quote unquote "headcount needs," and like Microsoft, Google is also going to honor any job offers that have already been extended to future Googlers. So yeah, things are still looking pretty rough in the tech sector in general, and we'll probably see that spill over into other industries as well, or continue to spill over, because it's already happening.

Over in the UK, GCHQ and the UK's National Cyber Security Centre are calling on tech companies to practice client-side scanning in an effort to seek out illegal material, namely images and videos related to child abuse. Now, this is an ongoing struggle, and it is a highly charged topic. On the one side, you have people who want to leverage technology to uncover instances of child abuse so that the perpetrators can be held accountable, and that's completely understandable. Child abuse is absolutely horrifying. On the other side, you have privacy advocates arguing that any sort of client-side scanning mandate is a huge threat to privacy. It's an enormous amount of surveillance, and it would also arguably necessitate the outlawing of end-to-end encryption, because if you have true end-to-end encryption, no parties other than those involved in the communication can read the messages. Even the service carrying the messages would not be able to see what was inside that communication, so there's no way it could do the scanning. And if services are required to do scanning, then, the logic goes, you can't have end-to-end encryption.
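To make that logic concrete, here's a minimal sketch of end-to-end encryption using the PyNaCl library. The names are mine, and a real messenger adds key agreement, authentication, and forward secrecy on top, but the core point survives the simplification: the relay in the middle only ever handles ciphertext, so there is nothing for it to scan.

```python
# Minimal end-to-end encryption sketch with PyNaCl (libsodium bindings).
# A real messaging protocol is far more involved; this just shows why a
# relay server can't inspect message contents.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only public keys are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and his public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# This is all the server in the middle ever sees: opaque bytes.
print("server sees:", ciphertext.hex()[:32], "...")

# Only Bob, holding his private key, can decrypt.
receiving_box = Box(bob_key, alice_key.public_key)
print("bob reads:", receiving_box.decrypt(ciphertext))
```

Client-side scanning gets around this by inspecting content on the device before that encrypt step, which is precisely why privacy advocates describe it as a hole in the end-to-end guarantee.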
Now, you can see the validity of both sides, I imagine, because yes, child abuse is terrible. It should be stopped. It absolutely needs to be detected and halted, and better yet, prevented. But on the other side, eliminating secure means of communication and introducing more surveillance can put people into danger, people who are innocent of committing any crimes. You know, maybe they are a political activist or a journalist or something along those lines. And if you have a government that turns against those kinds of people, then having client-side scanning technology in place, one that law enforcement could theoretically access, puts those people at risk. So some on the privacy side argue that using technology to address a societal problem isn't effective, that all technology really does is go after a symptom or the outcome without actually addressing the root causes, and that the money and effort that would be spent on this technological approach would be better directed at creating social programs that aim to prevent child abuse in the first place.

Okay, we've got other stories to talk about, some of which are not nearly as heavy as what we just talked about. But before we get to those, let's take a quick break.

Okay, we're gonna talk about some stories that relate to electric vehicles. First up, the US Post Office, or USPS for the United States Postal Service, had initially committed to purchasing five thousand all-electric mail trucks as part of adding fifty thousand vehicles to its fleet. But now the USPS says it will actually be closer to half of those fifty thousand that will be all-electric vehicles, and that when looking at all the vehicles the USPS plans to purchase in the near future, which goes beyond just the mail trucks, that's about eighty-four thousand vehicles total, the USPS plans for around forty percent of those to be electric vehicles. It's a good move, especially for the mail trucks. They are notoriously gas-guzzling vehicles. The Grumman Long Life Vehicles, which make up most of the USPS fleet, average somewhere around ten miles per gallon due to the stop-and-go nature of postal workers' duties. So a switch to EVs would mean the USPS would significantly reduce its carbon emissions across its fleet.
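For a rough sense of what that mileage means in carbon terms, here's a back-of-the-envelope sketch. The daily route length and working days are my own assumed round numbers; the factor of roughly 8.9 kilograms of CO2 per gallon of gasoline is the standard EPA estimate.

```python
# Rough CO2 estimate for one gasoline LLV. Route length and delivery
# days are assumptions; the emissions factor is EPA's ~8.9 kg CO2 per
# gallon of gasoline burned.
MPG = 10                 # transcript figure for the Grumman LLV
MILES_PER_DAY = 20       # assumption: a modest daily delivery route
DAYS_PER_YEAR = 300      # assumption: delivery days per year
KG_CO2_PER_GALLON = 8.9  # EPA estimate for gasoline

gallons_per_year = MILES_PER_DAY * DAYS_PER_YEAR / MPG
tonnes_co2 = gallons_per_year * KG_CO2_PER_GALLON / 1000
print(f"{gallons_per_year:.0f} gal/yr -> ~{tonnes_co2:.1f} t CO2 per truck")
# Multiply by tens of thousands of trucks and the fleet impact is large.
```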
Over at the Ford Motor Company, the plan is to cut eight thousand jobs in order to reallocate resources toward building more electric vehicles of its own. The layoffs will affect Ford's Blue division, which is the part of Ford that focuses on building internal combustion engine, or ICE, vehicles. This follows the recent reorganization of Ford into the Blue division and the Model e division, and it's another sign of the massive shift in the automotive industry as more companies transition to building non-ICE vehicles.

Now, when it comes to EV companies, you could argue that none is more famous than Tesla, which is back in the news because of cryptocurrency. Back in 2021, Tesla purchased a large amount of bitcoin, to the tune of about one point five billion dollars' worth of the digital currency. But in a recent earnings statement, the company revealed it had sold off three quarters of its bitcoin holdings. Now, considering how far the value of bitcoin has dropped since 2021, that could mean that Tesla took a fairly big loss on that investment. It's hard to say exactly how much, because the company did not reveal at what price it sold those bitcoin holdings. Brian Johnson, an analyst at Barclays, estimated that Tesla was looking at a four hundred sixty million dollar bitcoin impairment from the sell-off, which is a big old ouch. Elon Musk said that the decision to sell off the chunk of bitcoin holdings had nothing to do with the cryptocurrency's value. This is not a condemnation of crypto, he said; instead, it was just a way to free up resources due to the ongoing challenges of operating in China, which has strict COVID lockdown policies that have been disrupting Tesla's operations there. When pressed on whether Tesla would reinvest in bitcoin later down the line, Musk referred to bitcoin as quote "a sideshow to a sideshow" end quote, which does seem like kind of a condemnation of cryptocurrency. So who knows?
And this follows Tesla's previous move to stop accepting bitcoin as payment for car purchases. It did do that briefly, but it has curtailed that practice for quite some time now. That was a move the company credited to concerns about the environmental impact of bitcoin mining. Whether or not that was the one and only reason, I don't know, but that's the reason they gave.

James Murray, the director of the United States Secret Service, has announced his retirement, and he's moving on to join the private sector. And you might wonder why I'm talking about the director of the Secret Service on a podcast that's about tech. Well, it's because the company he is going to join is Snap Incorporated, the parent company of Snapchat. Murray will oversee security at Snap, which makes sense, though I have to admit, if I were working in the security division at Snap, I'd feel pretty intimidated when the former director of the Secret Service came on board.

A security researcher has found an interesting way to compromise computer systems that have an air gap. But before I even get into this story, let me just say this exploit is technically possible but isn't in any way practical, so I don't think there's any need to panic about it. But it is an interesting security vulnerability, so let's talk about it. First of all, what is an air gap? Well, that's when you make sure a computer system isn't connected to the Internet in any way. It is self-contained and isolated, which makes it very difficult to breach the system. If there are no pathways into the system online, then hackers can't really gain access, at least not remotely. But Mordechai Guri, a security researcher, came up with a way that theoretically would let someone steal data from an air-gapped system, and do so wirelessly. And that sounds impossible, right? And it almost is.
So to get this to work, first you would have to inject malware into the air-gapped system, which pretty much means you, or someone under your direction, or, you know, maybe someone you trick, has to physically deliver malware to the system in some way. Maybe it's plugging a USB stick into a machine and transferring malware that way. So that's reason number one that this is impractical, but not impossible. You know, people have been able to get physical access to air-gapped systems before, but it's not always easy. I mean, you might find it easier if you're able to trick someone on the inside into doing it for you, but even that's risky. So then the researcher discovered that the SATA cables used in these computer systems emit a low-power radio signal between five point nine and five point nine six gigahertz, and that's not surprising: if you run current through a wire, it will generate an electromagnetic field. So these signals could act kind of like Morse code. Each character, as it goes over the wire, emits a slightly different signal. But if you set up a receiver near a compromised air-gapped system, and by near I mean you'd have to be within about a hundred twenty centimeters, or three point nine feet, in order to not have too much of an error rate introduced into the signal, and you'd have to limit your transmission bit rate to about one bit per second. Remember, a bit is a zero or a one. That means you would have to sit really close to the system for a really long time in order to get anything useful. Presumably, if you could somehow get access to this air-gapped system and then plant a computer with wireless capability close enough to it, in a way that folks are not likely to see it and thus remove it, you could very slowly siphon information from that target system.
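To put that one-bit-per-second channel in perspective, here's a quick back-of-the-envelope calculation. The payload sizes are illustrative, and a real transfer would add framing and error-correction overhead, which only makes things slower.

```python
# How long does exfiltration take over a ~1 bit/second covert channel?
# Payload sizes are illustrative; real transfers would add framing and
# error-correction overhead on top of this.
BITS_PER_SECOND = 1

payloads = {
    "128-bit AES key": 128,
    "2048-bit RSA private key": 2048,
    "1 KB text file": 8 * 1024,
}

for name, bits in payloads.items():
    seconds = bits / BITS_PER_SECOND
    print(f"{name}: {seconds:,.0f} s (~{seconds / 60:.0f} min)")

# A 128-bit key takes about two minutes; a single kilobyte takes well
# over two hours of sitting within arm's reach of the target machine.
# Stealing small secrets is conceivable; stealing documents is not.
```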
So it's 302 00:18:49,520 --> 00:18:52,119 Speaker 1: good to be aware that this vulnerability as possible, but 303 00:18:52,240 --> 00:18:55,399 Speaker 1: it's an unlikely scenario you would ever encounter in real life. 304 00:18:55,960 --> 00:18:58,440 Speaker 1: And that's it for this episode of tech Stuff. Hope 305 00:18:58,440 --> 00:19:00,359 Speaker 1: you enjoyed it. Please reach out to me and let 306 00:19:00,400 --> 00:19:02,560 Speaker 1: me know about any topics you would like me to 307 00:19:02,640 --> 00:19:05,159 Speaker 1: cover in the future. You can do so on Twitter 308 00:19:05,320 --> 00:19:07,679 Speaker 1: that's tech stuff hs W, or you can do so 309 00:19:07,760 --> 00:19:09,800 Speaker 1: on the I Heart Radio app by navigating over to 310 00:19:09,840 --> 00:19:12,040 Speaker 1: tech Stuff and using a little microphone icon and you 311 00:19:12,080 --> 00:19:14,080 Speaker 1: can leave a voice message up to thirty seconds in 312 00:19:14,160 --> 00:19:19,360 Speaker 1: length and I will talk to you again really soon. Yeah. 313 00:19:23,480 --> 00:19:26,520 Speaker 1: Text Stuff is an I heart Radio production. For more 314 00:19:26,560 --> 00:19:29,960 Speaker 1: podcasts from I Heart Radio, visit the i Heart Radio app, 315 00:19:30,119 --> 00:19:33,280 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.