Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? Today we're going to cover the tech news for Thursday, March ninth, twenty twenty three. But before I get to that, I just want to thank Bridget Todd, host of There Are No Girls on the Internet, for taking over the show yesterday. I hope you enjoyed that episode. You should definitely check out There Are No Girls on the Internet. There are three seasons of episodes, season four is around the corner, and she does really great work and talks with some really smart and passionate people. And if you are one of those folks who gets really riled up about issues in society in general and tech in particular, you'll definitely want to tune into that.

Okay, let's get to the tech news, and we're going to start off by following up on a story I talked about earlier this week: the sad tale of Silvergate Bank. So, before twenty fourteen, this bank was primarily focused on serving as a financial institution in the real estate market in southern California, but then it hitched its wagon to the cryptocurrency market and started to work with crypto clients. These markets need to connect to traditional financial institutions to be useful beyond the insular ecosystem of the crypto world. It wouldn't do you much good to have all of your money in bitcoin if there were no way to get your money out of bitcoin. Sure, you could use bitcoin to purchase some stuff, but the vast majority of transactions that you would need to make in your day-to-day life would likely exist outside of the cryptocurrency ecosystem. So you need financial institutions to facilitate movement on and off that ecosystem. Plus, companies dealing with crypto need banks too, and so Silvergate began to cater to this world. But as we all know, last year saw a series of disasters hit the crypto markets.
Some cryptocurrencies imploded, we talked about that earlier this week. Crypto lenders and exchanges went under, and this ripple effect extended out beyond the immediate cryptocurrency ecosystem into the classic financial world in the form of Silvergate. The bank had operated its own exchange, known as the Silvergate Exchange Network, which the company has already shut down. It held deposits for various crypto companies, but in the wake of these crypto disasters last year, particularly with FTX collapsing, customers withdrew an astonishing eight point one billion dollars from their Silvergate accounts. The company had a billion-dollar loss following this event, and now it sounds like the actual situation at Silvergate was even more dire than the paperwork seemed to indicate. This has led the Silvergate Capital Corporation to announce that the bank is closing. It will return deposits to customers and then shut down, and it credited quote recent industry and regulatory developments end quote as the reason behind closing up shop. And to be fair, the US government is expected to look into Silvergate, possibly to see if there were any hinky things going on between Silvergate and FTX. But the fact of the matter is, even if nothing untoward was happening, even if everything was on the up and up, there was enough momentum in this collapse to lead to Silvergate's downfall.

Now this news is bad for the entire crypto world. For one thing, the number of regulated financial institutions that can serve as a bridge for the crypto world has dropped by one, and there were not that many to start with. For another, we might see a rise in crypto companies being willing to make use of poorly regulated or even unregulated organizations for banking. And as we've already seen with crypto, a lack of regulation may not really be the feature; it might really be the bug.
You know, we've had a lot of crypto enthusiasts say that the lack of regulation is a good thing, but twenty twenty two really showed that without regulation there's a lot of potential for things to go sideways. Now, if everyone were behaving properly, regulation wouldn't be necessary, but that's kind of true in most cases, regardless of what industry we're talking about, right? And we know that there are a lot of folks in the crypto community who are, well, they are specifically looking for ways to take advantage of others. They are opportunists who don't mind speculating and ramping up and then rug pulling. It's clear there are plenty of folks who fall into that camp.

Now, there's a third concern about Silvergate's collapse, which is that, as Silvergate was, in the eyes of, say, regulators and governments, a quote unquote real financial institution, we're likely to see governments take a closer look at the crypto market in general, and with deeper concern, than we have already seen. Because it's one thing for these ephemeral ecosystems to crash in on themselves, right? That's a problem. People lose money, real people lose real money in the process, so governments are already concerned about that. But now we have an example of an established quote unquote real-world bank also having to shut down as a result of its crypto market involvement. And when the consequences of the crypto world spill over into the traditional financial world, things can look really scary for governments and regulators. So, if anything, I imagine this is going to escalate efforts to regulate the crypto market. And in the meantime, I suspect we're going to see more volatility in the crypto world, which, I mean, that's nothing, right? That's like throwing a dart and then drawing the dartboard around it so you get a bullseye, to say that there's going to be volatility in the crypto market.

US Senator Ed Markey has promised to reintroduce a proposed ban on the federal government making use of facial recognition technology.
Now, this comes in the wake of the American Civil Liberties Union, or ACLU, publishing a piece that says the FBI and the Pentagon are both developing facial recognition technology. And as we have mentioned dozens of times on this show, this technology is inherently dangerous. Not only have we seen multiple instances in which a bias in the facial recognition model leads to misidentification, particularly for people of color, which in turn makes people of color disproportionately affected by this technology's performance and leads to all sorts of terrible outcomes. But also, as Ed Markey points out, the use of such technology implies that all of us are being treated as if we are already suspects of a crime. From square one, everyone is being surveilled as if they are under suspicion. Everyone is being identified so that they can be tracked, so that when they do something wrong, they can be taken in. So Markey argues that surveillance technology and facial recognition combined have served as a threat to individual citizens' privacy and security. I find it really hard to find any fault in that argument. I agree with it.

And here in the US, it's not just that the FBI wants to know who you are, it's also that it wants to know where you've been. Wired has a report that the FBI has been found to circumvent the normal legal process, and in fact has admitted to this. So normally the FBI would have to secure some sort of permission in order to get a look at someone's location data. Typically you're talking about a warrant process, where you have to go to a court, submit your request, have a judge look it over and approve it, and then you could go and get hold of geolocation data. Instead, the FBI has admitted that in the past it has purchased US location data just like an advertising company would. In fact, Christopher Wray, the director of the FBI, said that that's essentially what the FBI did.
It bought data from a broker, just like it had been an advertising company. And privacy advocates argue that all data being scooped up online is potentially harmful. They've been saying this for years, and this is the kind of thing that they're talking about as being one of the worst-case scenarios, right? Not just that private companies or even publicly traded companies have data about you, but that you start to see governments take advantage of loopholes, and instead of going through due process in order to gain permission and have everything be documented when it comes to tracking people, the government just sidesteps that and buys the information outright, instead of going through what it should be doing, because there aren't any specific rules or regulations to stop that from happening. Now, there are some rules in place that are meant to stop it, but federally speaking, it's not really been the highest of priorities, especially for US government agencies. Right? A lot of privacy laws specifically exclude government agencies when it comes to the protection of private citizen data, because here in the US we tend to favor law enforcement over the citizens in some cases. So for the FBI to sidestep a legal process in order to get desired data, that kind of stuff should concern everybody, regardless of how squeaky clean you are. The processes are there for a reason, and it's really to ensure that the government doesn't overstep and violate a citizen's rights. So when a law enforcement agency finds a loophole, that's a bad thing. The FBI admitted this particular instance during a Senate hearing on global threats. That's when Christopher Wray said that it was something the FBI had done in the past, but that to his knowledge, the FBI was not currently doing it at this very moment. The admission raised concern in the Senate, but the truth of the matter is that the US has been really lousy when it comes to protecting citizen data.
So I don't know why anyone is surprised or shocked by this. They should be concerned about it, but to hear people express surprise that this has happened seems disingenuous to me. But then again, to be fair, a lot of people who are in politics in the United States are so out of touch when it comes to technology, and especially when it comes to things like data collection and data analysis, that maybe they're genuinely surprised because of a large blind spot in their knowledge base.

At that same hearing, FBI Director Wray told the Senate that TikTok is a potential national security threat, which echoes the concerns of lawmakers around the world. Wray claimed that TikTok could potentially give parent company ByteDance the ability to quote control data and software on millions of devices in the US and drive narratives to divide Americans end quote. Now, to be clear, I'm not saying Wray said that specifically. That's how it was worded by Bloomberg. They did not use quotation marks, so this is not Wray saying it, but it's what Bloomberg said Wray said. I honestly don't know about this claim. I don't even know if Wray actually said that TikTok could potentially control software on millions of devices. I haven't seen any security experts suggest that TikTok has that ability, and I think that would be a massive headline. I mean, if someone found that a single app on a device had the capability to potentially control other software to any real degree, that would be huge news. And I don't know that Wray said anything close to that. This could just be Bloomberg mischaracterizing what Wray was saying. Anyway, I'm not surprised by this take. Right? It's a pretty popular view in the government right now that TikTok is potentially a surveillance tool for the Chinese state, and that if it isn't right now, it could have that ability in the future. So this kind of goes along with other things we've been hearing recently in those circles.
Okay, let's get back to geolocation data tracking before we take our first break. Google has agreed to pay out a nearly three hundred ninety two million dollar settlement to end a massive lawsuit involving the state attorneys general for forty out of the fifty US states. The core issue is that these state attorneys general had argued Google had misled users when they chose to turn off location data tracking, because Google was still collecting location information about those users. And yeah, that does seem like there might be a bit of a disconnect, you know? Like, oh, you've turned off location tracking, but don't worry, we're still keeping track of where you are; you just don't seem to understand what location tracking means and what it means to turn it off. So Google said that this issue was related to quote outdated product policies that we changed years ago end quote. So they're not denying that they did this, they're saying they don't do it anymore. According to Google, now when you turn off location tracking, it for real would turn it off, at least for Google anyway. The details of the settlement include requirements that Google has to meet in order to institute some changes, such as introducing more alerts whenever a user activates a feature, whether in an app or otherwise, that would turn on or off a location-related data tracking component. So that's good. All right, we're gonna take a quick break. When we come back, we've got some more tech news to talk about.

We're back. Not that long ago, I talked about Anonymous, the loosely organized activist group, and according to Taiwan News, this group has claimed responsibility for hacking into a Chinese weather balloon that flew over India twice. And then they hacked into a Chinese website that actually related to a study abroad program, and there they posted, not a manifesto, but a really kind of rambling list of complaints and allegations, as well as screenshots of the hacked control panel of the weather balloon.
Now, presumably Anonymous decided to target this balloon after the recent discovery of surveillance balloons of Chinese origin that flew over North America, which the US eventually shot down. But Anonymous also threw in some other reasons on this website, so it starts to almost feel like a laundry list of unconnected complaints. One of the things that they cited was China's response to COVID-19 and how the government has treated that over the last couple of years, so that was part of the stuff that was included in there. But also there were bits praising Taiwan. Taiwan and China have a contentious relationship, to say the least; technically, both sides lay claim to the other. On the web page, Anonymous also railed against some unrelated stuff that has nothing to do with China, like Wikipedia and its policies. Anonymous said that women are underrepresented in Wikipedia articles, that there might be an issue with bias within Wikipedia, and that the articles occasionally engage in outright trying to manipulate perspectives and points of view. There's also some stuff about people who have been killed by US police and various police incidents, so kind of related to Black Lives Matter. There were also parts arguing about the Soviet Union, which hasn't been a thing for decades now, and how the USSR endangered human lives and outright killed animals as part of the space race. So again, it's like this mishmash hodgepodge of complaints that don't necessarily have a centralized theme that appeared on this web page. It almost seemed like, well, we've got the opportunity, let's take it. I don't pretend to understand it, but it's not a huge surprise, right? If you know anything about Anonymous, it should not be surprising that it would be a little disjointed, because Anonymous is such a loose organization, and it's filled with people who have overlapping but not necessarily identical motivations and priorities.
So you could say, well, yeah, it seems a bit chaotic, but then so is the very nature of Anonymous itself. That's part of its structure.

Earlier this week, Reuters reported that the German government is considering a ban on Huawei components in its 5G networks, along with components from some other Chinese companies as well. Now, as I've talked about before, the US instituted such a ban out of concern that Huawei could essentially tap into, you know, the communications infrastructure and then spy on communications within the US. The UK has taken similar measures. Now, Huawei has repeatedly denied these allegations and suspicions, saying that the company has never done such a thing, and that there aren't any backdoors or anything built into the technology that would allow for it in the first place. A spokesperson for China's embassy in Germany said Huawei follows EU and German laws and there's no reason to suspect that the company would undermine those laws, and that China is just very disappointed in Germany right now and expected better. So this decision hasn't yet been finalized. It's possible that Germany will not pursue this route. Even if it does, it would probably take years to fully decouple Germany's telecommunications infrastructure from Chinese components. But it does show that there is this growing concern about how technology, you know, more and more of our technology is ultimately just meant to direct information somewhere, and that if you could end up directing that information in a way that benefits one country at the expense of others, that ends up being a big concern. Whether or not that's actually happening is still in question, but even just the potential for it is enough to give companies pause. And to be clear, I don't know if Huawei has actively participated in surveillance on behalf of China, but I do know that there is a very large concern about that.
And it doesn't help that the Chinese government has essentially given a directive to citizens and companies that they are to participate in information-gathering activities that would benefit China. So whether it's happening or not, I can't say. I can just say that there are enough pieces in place that I can understand the concern.

Toward the end of last year, YouTube changed some rules that ended up having a massive impact on YouTubers. If you follow any particular channels, you've probably noticed this, either explicitly, as people have actually addressed it in their videos, or just through the process of watching and thinking, huh, something's changed here. So late last year, YouTube started to really crack down on stuff like profanity, and any video that included profanity within the first fifteen seconds of the video starting was automatically made ineligible for monetization. Didn't matter how severe it was. It just meant that, you know, if you had a profanity incident within those first fifteen seconds, boom, monetization switched off for that video. And if you were a YouTuber with an intro like "What the beep is up? This is your beep homeboy, Beep Smith, beep the beep," well, then you'd be looking at all those videos that start that way getting demonetized. Because this change wasn't just for all videos moving forward; it was a retroactive change. YouTube would scan the content of all videos posted, and if any of them were detected to include profanity within the first fifteen seconds, boom, shut off. And a video on YouTube can have a really long life. You know, typically you see your biggest amount of engagement shortly after a video publishes, but some videos have a really long tail, especially if something relevant happens in the world that brings back a video that was recorded a long time ago. I'm reminded of a video I did years ago at HowStuffWorks about the Transatlantic accent, you know, that old-timey radio voice.
I did a video about it, and every few years it seems to bubble back up and get popular again. So while it did pretty well when it launched, in subsequent years it would get tons more views. Well, imagine if I were depending upon monetization of videos like that. I don't own those videos, they belong to HowStuffWorks, but if I did, and if those videos violated a brand new rule from the platform, well, that would have an enormous impact on my ability to make a living, right? And because I don't control the platform, I'm at the whim of that platform. This is why it's risky to depend upon a third party in order to reach your audience. If the platform makes a change to its business model, you end up being affected. I'm reminded of when I talked to Burnie Burns, one of the founders of Rooster Teeth; this was one of his big concerns. He said, you don't want to put all your eggs in a basket that belongs to someone else, because if they change their mind on how stuff works, you end up being hit by that. Well, now YouTube is relaxing some of those rules. They're not getting rid of them, but they are relaxing them a little bit. Videos that include moderate profanity within the first fifteen seconds can get some limited monetization. Videos that don't involve a ton of profanity, and that avoid it within the first seven seconds, are eligible for full monetization. And YouTube is also introducing a review process to look into creators who have been hit hard by this change in monetization, where, like, they may have seen their income go close to zilch as a result.
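To lay those new tiers out plainly, here's a minimal sketch of the rule logic as I just described it. This is illustrative Python of my own; the function name, the severity flag, and the exact decision order are assumptions for the sake of the example, not YouTube's actual policy engine.

```python
# Hypothetical sketch of the revised monetization tiers as described in
# this episode. Names and structure are illustrative only; this is not
# YouTube's actual implementation.

def monetization_tier(first_profanity_sec, strong_profanity):
    """Return an ad-eligibility tier for a video.

    first_profanity_sec: seconds into the video where profanity first
        appears, or None if the video contains none.
    strong_profanity: True if the video leans on heavy profanity overall.
    """
    if first_profanity_sec is None:
        return "full"      # no profanity at all
    if strong_profanity:
        return "none"      # a ton of heavy profanity still gets demonetized
    if first_profanity_sec >= 7:
        return "full"      # clean first seven seconds, moderate overall
    return "limited"       # moderate profanity up front: limited ads

# A few illustrative calls:
print(monetization_tier(None, False))  # full
print(monetization_tier(10, False))    # full
print(monetization_tier(3, False))     # limited
print(monetization_tier(3, True))      # none
```

Contrast that with the old rule, where any profanity in the first fifteen seconds meant no monetization at all, full stop.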
The new adjustments went into effect this week, so hopefully creators who were hit really hard will see some relief. I will admit there are some channels I watch where there were people who would occasionally use profanity, and they now tend to bleep that stuff out rather than actually say the words, which they used to do. And I think the reason for it is that they don't want to see those monetization switches get hit and then take a massive revenue loss.

Over at Meta, there's been a leak. Meta introduced its own large language model, aka LLaMA, and they opened it up to a very small group of people. It's invitation only, on a case-by-case basis, and includes academics, researchers, and a few companies, and they're essentially asking these folks to test the language model and provide feedback to help Meta improve it over time. The company did not release it to the general public, probably because, like all other generative AI models, Meta's approach is prone to doing stuff that isn't always appropriate. Generative AI can sometimes spout out misinformation, it can parrot hate speech, it could call for violence; like, it can generally behave in ways that are not a good look for the company that spawned it, particularly a large, public-facing, publicly traded company. So Meta did a limited release because this model is still under development. It is not ready to be deployed. It's still being built. But someone went and leaked the model. Word popped up on 4chan, and initially you started to see torrents and peer-to-peer network sharing kind of spread this around, and eventually somebody put it up on GitHub, and people have included instructions on how to download and access the language model. I'm sure folks at Meta are stressed out about this, but some people in the AI industry argue that the tool is far more likely to improve quickly if it's in a wide release, as opposed to trusting a relatively small base of researchers; kind of the open source approach. And that is true, but you can also understand the concern on Meta's part, right? Like, if it were to get out that a tool made by Meta was used to generate AI-created misinformation in order to polarize people against one another, that wouldn't look good.
Even though you could argue that's not really Meta's fault, because they never intended that tool to get released out to the public, you can understand where there'd be resistance there.

I've got a couple of Tesla stories today. Safety regulators in the United States are once again scrutinizing Tesla in the wake of reports that, for a couple of SUV drivers, there's this doozy of an issue that sometimes crops up, namely that the steering wheel reportedly just plain comes off sometimes. On Wednesday, the National Highway Traffic Safety Administration, or NHTSA, released documents about an investigation into the Model Y Tesla SUV and how the steering wheel reportedly can detach from the steering column. This, as I understand it, is considered a bad thing. It's not that these steering wheels are just inherently flimsy, but that in at least two cases, a bolt that is meant to hold the steering wheel to the column just wasn't there. It didn't get installed, so the wheel was just attached to the column through a friction fit. But it's a steering wheel. That means you turn the wheel, and all that turning is like, you know, wiggling something loose, and over time the wheel does come loose and ultimately can come off, and that's really scary, obviously. Now, according to the NHTSA, both cases in which a steering wheel came off happened while the vehicles were still relatively low in mileage, so it doesn't take very long for this to manifest. It is hard to say if this issue has affected a large number of vehicles, or if these two cases that the NHTSA has reported are outliers. It is possible that this is an issue that has happened, obviously, more than once, but not throughout the entire fleet of SUVs. That being said, if you do own a Model Y SUV and it's relatively new, you might want to check to see if that bolt is actually present to hold the steering wheel in place, which might mean having to take it to a dealership. So, just good to know.
The other big story about Tesla involves a fatal accident back in February. A driver and passenger in a twenty fourteen Tesla Model S crashed into a firetruck that was parked along a highway, blocking one of the lanes. It was acting as a barrier for firefighters as they were responding to a different crash. The Model S crashed into the firetruck; it killed the Model S's driver and critically injured the passenger. Firefighters had to use special tools to cut the Tesla open to get at the people inside. Four firefighters had some injuries as well as a result of this crash, and the NHTSA has launched an investigation to see if the Model S was in driver-assist mode, aka Autopilot, or potentially even Full Self-Driving mode. This is part of a larger NHTSA investigation into accidents involving Teslas that were in Autopilot mode and then crashed into a parked emergency vehicle on a highway. Apparently, there have been at least fifteen cases of Teslas doing this while in driver-assist mode. That obviously raises questions about how Tesla detects, or fails to detect, parked emergency vehicles along highways. News outlets have reached out to Tesla for comment, but seeing as how Tesla has followed the Twitter model that Musk set out, namely that it doesn't have a public relations department, it might be hard to get a quote. Okay, I've got a couple more stories I want to talk about. Before I get to those, let's take another quick break.

Okay, just before the break, I was talking a little bit about Musk, and I've got more to say about the Muskie billionaire. He had a pretty public situation blow up in his face this week on Twitter. It got ugly. So at the center of the issue is a, well, now former Twitter employee named Haraldur Thorleifsson, known as Halli. Halli was trying to get confirmation on whether or not he had been fired, because it seemed like he had, but no one was telling him for sure.
And you might think, wow, things have got to be screwy at Twitter if you don't even know if you've been fired or not. But it gets worse than that. You see, Halli is not your typical Twitter employee. Back in twenty twenty one, he ran a creative agency called Ueno, which Twitter subsequently acquired, and Halli joined Twitter as part of this deal. And the scuttlebutt in the industry, because Halli's pretty quiet about this, he's indicated some of it is true but hasn't given particulars, but the scuttlebutt is that Halli's compensation is essentially pulling from a very large pool of money that was part of this acquisition. Instead of getting a lump sum for the acquisition of his company, he was brought on as an employee and is being paid out of the money that would have been for the acquisition. And if he were to be fired, Twitter would owe him the rest of that money, unless it was, like, a for-cause reason. Like, if Halli had committed embezzlement or something, then obviously Twitter would not be contractually obligated to pay him that money. But if he was fired without cause, then he would be due this massive amount of money that was part of the acquisition. Now let me give you a little bit more information on this guy. He has muscular dystrophy, and he uses a wheelchair. He's from Iceland, and he has long campaigned the Icelandic government and the people for better wheelchair access throughout the country. He even negotiated his Twitter compensation package so that he actually pays more taxes to Iceland. As I said, he pays more in taxes because he believes in the change he's asking for, and he's willing to help pay for it. Anyway, Halli ended up tweeting at Elon Musk to ask for clarification, because he was frankly getting nowhere through other channels. Through the process of that back and forth, he did get confirmation that he had been fired, and Musk, in his trademark way, responded poorly.
So first, Musk clearly didn't know who Halli was, and he asked what work Halli had been doing. Halli actually initially didn't even tell him what it was, because he said, if I talk about it, then I'm breaking confidentiality agreements; I need to see that those confidentiality agreements are waived before I talk about what I've been working on. And Elon then cavalierly waived those agreements, said consider them put aside. So Halli then answered, and then Elon responded, in a response that has subsequently been deleted, with little laughing emojis, as if saying, ha, as if this is anything worth even talking about. This goes back and forth a bit, and then the next day, after it came out that Halli had in fact been fired, Elon Musk tweeted about Halli and said quote, he's the worst. Sorry, end quote. And he then later would delete that tweet. But you know, just because something's been deleted doesn't mean the internet forgets. The internet never forgets. Musk also posted, quote, the reality is that this guy, who is independently wealthy, did no actual work, claimed as his excuse that he had a disability that prevented him from typing, yet was simultaneously tweeting up a storm. Can't say I have a lot of respect for that, end quote. Now, Halli has an incredible reputation in the tech field. Like, the people who have worked with him have nothing but great things to say about him. To say that he is highly admired is putting it lightly. And Musk got considerable pushback after doing his typical billionaire idiot approach of wading into something he doesn't know anything about. And so ultimately, he goes and has a conversation with Halli over the phone to get a better understanding, and then he tweets a non-apology, essentially an apology of, I was misled, I heard things that ended up not being true, so it's not my fault that I said all these stupid things on Twitter, which I've deleted so you can't see them anymore.
Yeah, my opinion of Musk is obviously low, and it always has been, but it just keeps going down, and I admit that that's my own bias. If you love Musk, you know, I'm sure you have your reasons. I'm just, I can't communicate this without bringing my own personal opinion into it. Anyway, what Musk also said was quote, better to talk to people than communicate via tweet, end quote, because he was backpedaling, right? He was saying, oh, turns out I didn't really understand the situation, I was told wrong things, or things that were true but don't really matter; like, it was all non-apology stuff. And then he says, better to talk to people than communicate via tweet. This is insane, folks, because he's the fricking CEO of Twitter. He spent billions of dollars to buy Twitter. Twitter is a communications platform, and he's saying it's not good for communications. What Elon Musk just said is that Twitter is not a good choice for doing what Twitter is supposed to do. If you're trying to lead your company, which is already in, like, dire straits, and trying to lead it to climb out of that, saying that it's not good for communicating with people is probably a bad move. Anyway, he then said that Halli is reconsidering working with Twitter, that the whole firing thing was a big misunderstanding, that it shouldn't have happened. You know, Halli has said that he suspects that this is because, again, if he were to be fired, he would be owed this huge amount of money. He has essentially said, hey, I'm ready to walk away from Twitter. You owe me. So are you gonna pay me? Are you gonna pay the money that you owe me? That's actually a really good question, because Twitter is currently, you know, under fire for not paying all of its bills, like rent and stuff. So it's not a guarantee that Twitter would pay what was actually contractually owed to this guy, because Twitter has been not paying a lot of bills so far. Yeah.
Terrible, 576 00:37:12,440 --> 00:37:18,000 Speaker 1: terrible situation. And once again, Elon eliminating public relations teams 577 00:37:18,640 --> 00:37:21,320 Speaker 1: means that when this kind of stuff happens, there's 578 00:37:21,440 --> 00:37:26,280 Speaker 1: no one there to step in and handle things, handle 579 00:37:26,320 --> 00:37:31,080 Speaker 1: communications in a way that doesn't let things escalate into a situation 580 00:37:31,080 --> 00:37:33,239 Speaker 1: that could have been avoided from the very beginning. But 581 00:37:33,360 --> 00:37:39,040 Speaker 1: Elon being Elon just meant that this got way worse. Anyway, 582 00:37:40,719 --> 00:37:43,719 Speaker 1: I think Halli seems like a really interesting dude. Like, he 583 00:37:43,880 --> 00:37:46,680 Speaker 1: has spent a ton of money to try and 584 00:37:47,440 --> 00:37:50,880 Speaker 1: improve his home country, especially for people who are in 585 00:37:50,920 --> 00:37:54,880 Speaker 1: a similar situation to his. So cheers to you, buddy, 586 00:37:54,960 --> 00:38:02,200 Speaker 1: because man, this was just like a ridiculous, drama-filled 587 00:38:02,719 --> 00:38:06,360 Speaker 1: event that did not need to happen. Finally, this is 588 00:38:06,400 --> 00:38:09,640 Speaker 1: a pretty cool news story. It's some science news to 589 00:38:09,640 --> 00:38:13,520 Speaker 1: close out this show. Researchers in Australia have isolated an 590 00:38:13,680 --> 00:38:17,719 Speaker 1: enzyme found in bacteria that live in the soil. This 591 00:38:17,760 --> 00:38:23,040 Speaker 1: particular enzyme can convert hydrogen, the trace amounts of hydrogen 592 00:38:23,080 --> 00:38:28,200 Speaker 1: that are in the atmosphere, into electricity. That's really cool. Now, 593 00:38:28,200 --> 00:38:30,640 Speaker 1: according to what I've read, this enzyme can take the 594 00:38:30,680 --> 00:38:33,200 Speaker 1: trace amounts of hydrogen in the atmosphere and turn it into 595 00:38:33,239 --> 00:38:38,160 Speaker 1: an electrical current. The enzyme is called Huc, which I'm 596 00:38:38,200 --> 00:38:41,080 Speaker 1: just going to pronounce hook. And I'm already seeing 597 00:38:41,080 --> 00:38:43,680 Speaker 1: a lot of predictions that this enzyme could be the 598 00:38:43,800 --> 00:38:49,359 Speaker 1: cornerstone for future technologies that harvest electricity from the air itself. 599 00:38:49,800 --> 00:38:53,200 Speaker 1: A lot of headlines talk about making electricity from thin air. 600 00:38:53,760 --> 00:38:59,240 Speaker 1: That would be super cool. However, before we just start 601 00:38:59,280 --> 00:39:05,160 Speaker 1: imagining a Tesla-esque future in which we're pulling electricity from 602 00:39:05,200 --> 00:39:08,759 Speaker 1: the air itself, we should remember hydrogen does not make 603 00:39:08,880 --> 00:39:11,359 Speaker 1: up very much of our atmosphere. Like, when I say 604 00:39:11,400 --> 00:39:15,400 Speaker 1: not very much, I'm talking about like point zero zero 605 00:39:15,680 --> 00:39:20,200 Speaker 1: zero zero five percent of our atmosphere is hydrogen. And 606 00:39:20,280 --> 00:39:23,919 Speaker 1: when you have such tiny amounts present, there's just not 607 00:39:24,080 --> 00:39:27,600 Speaker 1: enough fuel there to generate electricity to do anything beyond 608 00:39:27,640 --> 00:39:34,680 Speaker 1: powering perhaps very, very basic, very low power components.
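To put that point zero zero zero zero five percent figure in perspective, here is a rough back-of-envelope sketch. The constants are my own illustrative assumptions (a hydrogen mole fraction of about 0.5 parts per million and hydrogen's standard heating value), not figures from the Australian researchers:

```python
# Back-of-envelope estimate: how much chemical energy does the trace
# hydrogen in one cubic metre of air hold? All constants below are
# illustrative assumptions, not numbers from the study itself.

H2_MOLE_FRACTION = 0.5e-6       # ~0.00005% of air is H2 (~0.5 ppm)
AIR_MOLAR_VOLUME_L = 24.45      # litres per mole of gas at ~25 C, 1 atm
H2_LHV_J_PER_MOL = 241_800      # lower heating value of H2, joules per mole

mol_air_per_m3 = 1000 / AIR_MOLAR_VOLUME_L           # ~40.9 mol of air per m^3
mol_h2_per_m3 = mol_air_per_m3 * H2_MOLE_FRACTION    # ~2.0e-5 mol of H2
energy_j_per_m3 = mol_h2_per_m3 * H2_LHV_J_PER_MOL   # ~5 joules

print(f"H2 in 1 m^3 of air: {mol_h2_per_m3:.2e} mol")
print(f"Chemical energy:    {energy_j_per_m3:.1f} J")

# Roughly 5 joules per cubic metre of air, and that assumes a perfect,
# 100 percent conversion. Sustaining even milliwatts of power would mean
# continuously moving enormous volumes of air past the enzyme, which is
# why only very low power applications are plausible.
```

Even under those generous assumptions, you are looking at single-digit joules per cubic metre of air, which is why the "very low power components" caveat matters so much.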
Even 609 00:39:34,719 --> 00:39:38,640 Speaker 1: a simple watch would require way more power than what 610 00:39:38,719 --> 00:39:42,040 Speaker 1: this enzyme could produce, not because the enzyme isn't impressive, 611 00:39:42,080 --> 00:39:45,080 Speaker 1: it is, but just because there's not enough free 612 00:39:45,200 --> 00:39:48,040 Speaker 1: hydrogen in the air. Now, I've talked several times on 613 00:39:48,080 --> 00:39:52,600 Speaker 1: the show about how hydrogen tends to bond with other elements, 614 00:39:52,920 --> 00:39:55,919 Speaker 1: so getting pure hydrogen usually means you have to take 615 00:39:55,960 --> 00:40:00,239 Speaker 1: something that's made of hydrogen bonded with other stuff, and 616 00:40:00,320 --> 00:40:03,520 Speaker 1: then pour some energy into it to break those molecular bonds. 617 00:40:03,600 --> 00:40:07,359 Speaker 1: The classic example is using electrolysis, where you pass an 618 00:40:07,360 --> 00:40:11,000 Speaker 1: electric current through water, and this helps break that molecular 619 00:40:11,040 --> 00:40:14,920 Speaker 1: bond between hydrogen and oxygen so that you get oxygen 620 00:40:14,960 --> 00:40:18,680 Speaker 1: and hydrogen gas. But you know, that's one way to 621 00:40:18,680 --> 00:40:21,759 Speaker 1: get hydrogen, and it's not useful for this enzyme, right? 622 00:40:21,800 --> 00:40:24,400 Speaker 1: Like, the enzyme is meant to pull hydrogen out of 623 00:40:24,440 --> 00:40:26,680 Speaker 1: the air. There's just not enough there to do anything 624 00:40:27,880 --> 00:40:33,000 Speaker 1: super useful beyond, like I said, powering extremely low 625 00:40:33,120 --> 00:40:38,960 Speaker 1: power features or functions. So I think it's really cool. 626 00:40:39,280 --> 00:40:43,160 Speaker 1: I think it has the potential for some interesting applications, 627 00:40:43,160 --> 00:40:46,560 Speaker 1: like maybe some very low power sensor-type stuff. But 628 00:40:46,680 --> 00:40:49,320 Speaker 1: it's not anywhere remotely close to being able to harvest 629 00:40:49,320 --> 00:40:52,479 Speaker 1: significant amounts of electricity from thin air. And I wanted 630 00:40:52,520 --> 00:40:55,000 Speaker 1: to say that because a lot of the headlines I'm 631 00:40:55,000 --> 00:40:59,800 Speaker 1: seeing essentially say pulling electricity from thin air. And maybe 632 00:40:59,840 --> 00:41:03,319 Speaker 1: in the article it goes into the qualifiers about that, 633 00:41:04,160 --> 00:41:07,120 Speaker 1: but if you're just reading headlines, you might walk away 634 00:41:07,120 --> 00:41:11,480 Speaker 1: with the assumption that, oh, we're gonna pull electricity out 635 00:41:11,480 --> 00:41:15,040 Speaker 1: of the air itself, all of our electrical concerns are gone, 636 00:41:15,080 --> 00:41:17,240 Speaker 1: like we don't have to worry about our energy needs anymore. 637 00:41:18,640 --> 00:41:21,200 Speaker 1: Stop pouring money into things like renewables, because we can 638 00:41:21,200 --> 00:41:24,520 Speaker 1: pull electricity from the air itself. That's not really the case.
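For anyone who wants the electrolysis example from a moment ago written out, this is the standard water-splitting reaction; the positive free-energy value is the textbook figure for how much energy you have to pour in, not anything specific to the Huc study:

```latex
% Overall water-splitting (electrolysis) reaction: an electric current
% supplies the energy needed to break the hydrogen-oxygen bonds.
\[
  2\,\mathrm{H_2O(l)} \;\longrightarrow\; 2\,\mathrm{H_2(g)} + \mathrm{O_2(g)},
  \qquad \Delta G^{\circ} \approx +474\ \mathrm{kJ}
\]
% The positive sign means energy must be supplied: about 237 kJ per
% mole of H2 produced, equivalent to a minimum cell voltage of
% roughly 1.23 volts.
```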
639 00:41:24,920 --> 00:41:29,440 Speaker 1: So always, always, always use critical thinking when you're looking 640 00:41:29,480 --> 00:41:32,640 Speaker 1: at things like science news in particular, because a lot 641 00:41:32,680 --> 00:41:39,400 Speaker 1: of science communicators are really good at expressing passion about science, 642 00:41:39,440 --> 00:41:40,880 Speaker 1: but if they don't give you the full story, you 643 00:41:40,960 --> 00:41:46,759 Speaker 1: might walk away with an inaccurate perspective on what they're 644 00:41:46,760 --> 00:41:50,680 Speaker 1: actually trying to communicate. Okay, that's it for this episode. 645 00:41:50,760 --> 00:41:53,080 Speaker 1: Hope you are all well. If you have suggestions for 646 00:41:53,120 --> 00:41:55,680 Speaker 1: future topics for tech stuff, reach out to me on Twitter. 647 00:41:56,120 --> 00:41:59,480 Speaker 1: You can tweet me at tech stuff HSW, or you 648 00:41:59,480 --> 00:42:02,600 Speaker 1: can download the iHeartRadio app, free to download, free to use. 649 00:42:02,680 --> 00:42:05,680 Speaker 1: Just pop on over to that, and then you use 650 00:42:05,760 --> 00:42:08,319 Speaker 1: the little search engine to search for tech stuff. That'll 651 00:42:08,320 --> 00:42:09,879 Speaker 1: take you to the tech stuff page. You'll see there's 652 00:42:09,880 --> 00:42:12,520 Speaker 1: a little microphone icon there. You click on that and you 653 00:42:12,560 --> 00:42:14,759 Speaker 1: can leave a voice message up to thirty seconds in length. 654 00:42:14,840 --> 00:42:16,160 Speaker 1: Let me know what you would like to hear, and 655 00:42:16,200 --> 00:42:25,600 Speaker 1: I will talk to you again really soon. Tech Stuff 656 00:42:25,680 --> 00:42:30,200 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 657 00:42:30,239 --> 00:42:33,799 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 658 00:42:33,840 --> 00:42:34,760 Speaker 1: your favorite shows.