Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And today's tech news episode is for Tuesday, May twenty-one, and I've got a real kind of global episode for you guys today, because news is hitting all over the world that involves tech and politics, and so we're going to dive right into it. In India, police raided the offices of Twitter in two cities. And why did they do that? Well, it all comes down to Twitter applying a "manipulated media" label to an Indian politician's tweet. Back last February, you might remember, Twitter announced that it would apply such labels to messages that were found to have, you know, altered images or messaging that spread misinformation. You might also remember that the company actually applied this policy a couple of times to tweets sent by then US President Donald Trump. Well, in this case, the tweet was written by Sambit Patra, a member of the BJP party, which currently has majority control of India's government.
Speaker 1: That tweet accused the opposition party, which is the Congress party, of having released a quote-unquote toolkit in an effort to undermine the BJP's attempts to respond to the coronavirus crisis. And you're probably aware that India is dealing with a monumental health crisis, one that has affected millions of people within that country, and that the government is sensitive to scrutiny with regard to how they've handled the issue, or failed to handle it, as the case may be. Patra's claim essentially says that the opposition party in India was actively trying to sabotage government efforts to deal with the crisis by creating this document, this toolkit strategy, to defeat government efforts. An independent fact-checking organization in India called Alt News investigated this claim and found that the document Patra had said proved the Congress party had created this alleged toolkit was in fact printed on forged letterhead, and thus not a legitimate document. So Twitter then applied the manipulated media label to the tweet.
Speaker 1: The BJP was not pleased with this and sent messages to Twitter demanding that they remove it, and then apparently they contacted the police to investigate. So the police paid a visit to the Twitter offices, but they found that no one was there. Why was that? Well, it's because of the aforementioned coronavirus crisis in India. Twitter employees in India work from home, so that whole move of the police showing up at the Twitter office kind of fell flat. The police activity has prompted several journalists to suggest that perhaps this is an intimidation effort on the part of the Indian government, and that it sends a message to Twitter. And India is another country that has tried to grapple with the fact that tech companies like Twitter and Facebook have a broad reach, and that these companies aren't always subject to the tight controls of government agencies the way state-run communication channels are.
Speaker 1: And we all know that information is powerful stuff, and that a lot of governments, including the one here in the United States, would prefer to keep, you know, at least a close hand on communication channels to, I don't know, let's say, guide the flow of information. India and Twitter have already had some battles about that earlier this year, with a previous example being the Indian government pressuring Twitter to ban accounts that belonged to farmers in India who were protesting new legislation within the country. Meanwhile, in the Middle East, Palestinian journalists reporting from the Gaza Strip say that they have been blocked from using WhatsApp, the messenger app. The political situation in the area is complicated, but essentially the journalists were covering the issues of Palestinian residents of the Gaza Strip who had fled their homes while the area was under attack from military operations out of Israel. Israel and Hamas agreed to a ceasefire on May twenty, and shortly afterwards, seventeen journalists found their access to WhatsApp blocked.
Speaker 1: The apparent reason behind this was that several of the journalists, twelve out of the seventeen of them, belonged to a communal group within which the journalists would share information related to Hamas's operations in the area. The journalists have said that this group was purely for the purposes of journalism, in order for them to all stay aware of what was going on so that they could report on the news as accurately as possible. It wasn't necessarily indicating support for, you know, Hamas; it was more about "how do we report on this?" And so the journalists appealed to Facebook, that's the corporate owner of WhatsApp, and a few of the journalists found their access restored a couple of days later. Others were still waiting for it, but the contents of the channel itself were wiped clean, with all previous conversations and images and contact information deleted. This follows other events that raise additional questions about Israel's military response in the area.
Speaker 1: For example, earlier Israeli air attacks destroyed a building that had offices for Al Jazeera and the Associated Press inside of it. Now, the Israeli government said that the same building housed Hamas military intelligence, and thus this was a strike to target enemy intelligence, but several press organizations suggested that this might have been, in fact, an attempt by the Israeli government to hinder the press's ability to cover the war within the area. Similarly, journalists are questioning whether Israel is pressuring companies like Twitter and Facebook to crack down on journalists who are covering the Palestinian side of the conflict. Once again, we see the struggle of large tech companies with a global presence that are attempting to operate in a world that has deep political and social and regional schisms. And it's a pretty far cry from the utopian ideal of the Internet, this idea that the Internet was going to open up communication so that it could freely travel across borders without any problems. The reality is much more complicated. Also, I feel like this should go without saying, but I do want to be clear.
Speaker 1: It is entirely possible to be critical of Israel's military operations and also condemn anti-Semitism. That is not a contradiction, and anyone suggesting that criticizing the state's use of military operations in Israel is automatically anti-Semitic is setting up a false position. Now, that being said, we have also seen a rise in anti-Semitism, which in itself is condemnable. As always, I recommend people use critical thinking and compassion together when they're considering things like world politics. Over at Ars Technica, writer Tim De Chant has written a piece about China and bitcoin mining. Bitcoin has been having a pretty bumpy ride lately. It reached nearly sixty-four thousand dollars per bitcoin in value before plunging to around thirty-two thousand dollars per bitcoin and then entering a recovery period. As I record this, it's now somewhere in the neighborhood of thirty-seven thousand dollars per bitcoin. So it does appear to be recovering right now, but that might change soon.
Speaker 1: As De Chant reports, China is going to quote "crack down on bitcoin mining and trading behavior and resolutely prevent the transfer of individual risks to the society" end quote. And as it stands, bitcoin mining operations in China contribute an enormous amount to the overall bitcoin mining operations. So in case you're not familiar with how bitcoin mining works, I'll give you a very high-level overview. There's a network of computers that are all connected to the bitcoin system, and the system groups bitcoin transactions together in blocks of data. When a block reaches capacity, the system then essentially invites all the computers connected to the network to verify the transactions that are within that block of data. And this is done essentially by guessing a very large number. The first computer to guess successfully has verified the transactions and has quote-unquote "mined" a certain number of bitcoins. That number of bitcoins actually goes down in increments over time, so right now it's six point two five bitcoins per mined block.
Speaker 1: That block of data then joins the end of a chain of previously verified transaction blocks, the blockchain. And the goal is to aim for about ten minutes of time between opening up the opportunity to mine a block and reaching the correct guess. But as more computers join the system, and more computers try to solve this problem, with some computers having super-fast processors, and more likely you're talking about grids of computers or networks of computers all working together to try and solve this problem, the system has to make the problems more difficult to solve in order to stick with that ten-minute goal. Otherwise the problems would be solved way too quickly. And so this has created a kind of escalating situation in which bitcoin miners would add increasingly powerful computer systems to the network in an attempt to be the first, particularly as the value of bitcoin began to climb. And so it might cost you a few hundred thousand dollars to set up an actual mining operation, but if you can mine successfully, then you can pay all those costs off pretty quickly.
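The mining lottery described above, guessing numbers until a hash falls below a difficulty target, plus the shrinking block reward, can be sketched in a few lines of Python. This is a toy illustration only, not real Bitcoin code; the function names and the tiny difficulty value are made up for the example, and real mining uses double SHA-256 over a full block header:

```python
import hashlib

def mine_block(block_data: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: try nonces until the block's hash falls below a target.

    More difficulty bits means a smaller target, so more guesses on average —
    this is the knob the network turns to hold the ~10-minute block time.
    """
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # first successful "guess" wins the block reward
        nonce += 1

def block_reward(halvings: int) -> float:
    """Reward started at 50 BTC and halves periodically: 50 -> 25 -> 12.5 -> 6.25."""
    return 50.0 / (2 ** halvings)

nonce = mine_block(b"example transactions", difficulty_bits=16)
print("winning nonce:", nonce)
print("reward after three halvings:", block_reward(3))  # 6.25 BTC, as in the episode
```

With 16 difficulty bits the loop needs roughly 65,000 guesses on average; the real network's difficulty is astronomically higher, which is exactly why miners pour in the specialized hardware and electricity discussed here.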
Speaker 1: So we saw a rise in power-hungry bitcoin mining operations, sucking up huge amounts of electricity and thus potentially contributing to carbon emissions as a result, because they were placing a higher demand on power plants, and a lot of power plants still use fossil fuels. So the environmental impact is one reason that countries like China are cracking down on cryptocurrencies that rely on this type of proof-of-work approach; that's what bitcoin does. But another reason is that criminals use cryptocurrency for all sorts of illegal activities, from trading in illegal goods to money laundering to circumventing tax laws. And there's also the fact that cryptocurrency exists outside the control of centralized organizations like governments and banks. Pretty much all governments are a bit wary of stuff that they don't have control over, but China in particular is way up on that list. The state in China is supreme, and any entity, any organization or currency, that could threaten the state's position is treated seriously.
Speaker 1: So we could soon see a total ban on bitcoin mining in China, which will change the game a great deal, and it could mean that we'll see another drop in Bitcoin's value, as China represents a huge market all by itself. But we'll also see a change in which mining operations are on the receiving end of more bitcoin. And then I'm sure Elon Musk will tweet something that will throw people into a tizzy, so we'll keep an eye on it. Speaking of Elon Musk, his electric vehicle company Tesla, you may have heard of it, is facing a big fine in Norway, and apparently at the heart of the matter is the rate at which Tesla's batteries can charge. The company issued a software update for its cars back in twenty nineteen, and allegedly that update throttled the charging rate for batteries in Tesla vehicles manufactured in certain earlier model years. It also allegedly reduces the driving range that vehicles can have before they need a recharge.
Speaker 1: The government has ordered Tesla to pay each affected customer the equivalent of sixteen thousand dollars US in order to settle the matter, and according to Norwegian press outlets, Tesla sold around ten thousand vehicles that were made in those years within Norway. So by doing some quick math, that would mean the company would need to pay out around a hundred sixty million dollars US in fines for this issue. Tesla had the opportunity to respond to the charges, but apparently never filed such a response, and so ended up on the wrong end of the judgment. Tesla has until May thirty to comply or to file an appeal. The company faces similar lawsuits here in the United States. The German lower chamber of Parliament has passed legislation that permits driverless vehicles on public roadways by next year. Now, the country already allowed autonomous vehicles to be tested on public roads, but this step would mean that, for the first time, driverless vehicles will have the legal right to operate on roads without there needing to be a safety driver behind the wheel.
Speaker 1: Now, to qualify for this distinction, cars have to demonstrate that they fall into the fourth level of autonomy, as designated by the Society of Automotive Engineers, or SAE. So, just in case you weren't aware, the SAE identifies six levels of autonomy, starting at zero and ending at five. At level zero, all operations of the vehicle are effectively under manual control. There are no real automated systems of any kind that would control the vehicle, so a human is absolutely required to operate it. Level one has minimal automation, so it might include stuff like adaptive cruise control or lane assist, and then each successive level cedes more operations to automated systems. By the time you get to level four, the vehicle should be able to operate by itself in most situations, and usually you would just consider what would be the normal operating conditions within a given region. If you reach level five, then, theoretically anyway, you've got complete automation that can operate under any conditions, whether they are typical or otherwise.
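The six-level ladder just described, and the way level four only counts as driverless while its operating conditions hold, can be sketched like this. The enum labels and the gating function are simplified illustrations, not the SAE's official wording:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE driving-automation levels, 0 through 5 (simplified labels)."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # e.g. adaptive cruise control or lane assist
    PARTIAL_AUTOMATION = 2      # a few combined assists; driver still in charge
    CONDITIONAL_AUTOMATION = 3  # car drives sometimes, human must take over
    HIGH_AUTOMATION = 4         # drives itself within a defined operating domain
    FULL_AUTOMATION = 5         # drives itself under any conditions

def may_drive_itself(level: SAELevel, domain_conditions_met: bool) -> bool:
    """Level 4 is driverless only while every operating condition is checked off;
    level 5 has no such checklist; everything below still needs a human driver."""
    if level == SAELevel.FULL_AUTOMATION:
        return True
    if level == SAELevel.HIGH_AUTOMATION:
        return domain_conditions_met
    return False

# A level-4 robotaxi outside its geofenced region loses its driverless status:
print(may_drive_itself(SAELevel.HIGH_AUTOMATION, domain_conditions_met=False))  # False
```

That conditional in the middle is exactly the distinction the German legislation leans on: geofencing is one of the boxes a level-four vehicle has to keep checked.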
Speaker 1: We do not have any vehicles that qualify as level-five autonomy, and according to the SAE, a level-four autonomous vehicle will only operate autonomously if all limited conditions are met for such operations. In other words, you would have a list of things that have to be met in order for the vehicle to qualify for level four, and if any of those boxes went unchecked, then it would not operate autonomously; you would have to have someone drive the vehicle. So in Germany, this is going to include operating within specific geographic regions, so such vehicles in those regions will have geofencing features that govern their operation. That kind of approach could open up the possibility of robotaxis within a specific region or city, but you wouldn't necessarily see cross-country trucking unless specific trucking routes fell under those geofencing limitations. This legislation now has to pass through the upper chamber of Germany's Parliament before it will become law. We have a couple more stories to cover, but before we get to that, let's take a quick break.
Speaker 1: Okay, we're gonna switch on over from global news to cybersecurity. The Bose Corporation, that is, the company that makes audio equipment, recently announced that it was the target of a ransomware attack back in March. Ransomware is a cyber attack that typically involves infecting a computer system with malware that limits access to that system. It might encrypt data on those systems so that people can't see the stuff that was saved there, or it otherwise makes it difficult for an organization to continue operations. So with ransomware, you usually get a message indicating how you can pay off the attackers, who then will presumably lift those conditions preventing you from getting stuff done. And generally speaking, it is a bad idea to pay off ransomware hackers, as it sends the message that ransomware is a viable way to make money. It encourages future attacks; that's not great. Also, you're never guaranteed that the attackers are actually gonna lift those restrictions. They might not. Depending on how you pay them, you might just be out the money, and then they're just laughing at you.
Speaker 1: But anyway, not paying a ransom is easier said than done, as in the meantime you really have to, you know, get back to work, and depending upon the nature of the attack, you might not be able to wait around while your team is restoring operations through one method or another. Plus, you might have data on those systems that just doesn't exist anywhere else. Anyway, in the case of Bose, the company states that the attack included a data breach, and according to Bose, the company immediately reached out to a quote "small number" end quote of affected individuals, as is required by law when a company discovers that a data breach included personal information that relates to, say, employees or customers or third parties like corporate partners. In this case, the information included personal data about current and former Bose employees. That data included such stuff as Social Security numbers and compensation information. While Bose can't determine if the attack was aimed at exfiltrating that data,
Speaker 1: in other words, they aren't sure if that's what the hackers were trying to get at, the company does acknowledge that the hackers had some access to quote "a limited set of folders" end quote. Bose also says that the company was able to reestablish control of corporate systems by partnering with third-party cybersecurity experts, and that the company never paid the ransom. Now, while we're on the subject of ransomware, ProPublica published an interesting piece about it. A lot of alliteration there, so sorry, Mr. Pop Filter. The ransomware group DarkSide, which was recently responsible for the hack that brought down Colonial Pipeline and thus created a short-term fuel crisis in the eastern half of the United States, had previously been using a scheme that involved relying on the same digital encryption keys for multiple victims, which is kind of the equivalent of using the same password for multiple services. It's a bad idea if you want to achieve your goals. And cybersecurity researchers noticed this trend.
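Why is reusing one encryption key across victims such a blunder? A toy sketch makes it concrete. This is deliberately simplified, a hash-based XOR stream cipher invented for illustration, not DarkSide's actual cryptography, and the key string is made up:

```python
import hashlib
from itertools import count

def keystream(key: bytes, length: int) -> bytes:
    """Derive a toy keystream from the key (illustration only, not a real cipher)."""
    out = b""
    for counter in count():
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        if len(out) >= length:
            return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; running it twice with the same key decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

shared_key = b"hypothetical-reused-key"  # one key used against every victim

victim_a = xor_crypt(shared_key, b"victim A's files")
victim_b = xor_crypt(shared_key, b"victim B's files")

# Recover the key from any ONE incident and every victim can be unlocked --
# just like one leaked password opening every account that reuses it.
assert xor_crypt(shared_key, victim_a) == b"victim A's files"
assert xor_crypt(shared_key, victim_b) == b"victim B's files"
```

That single shared secret is what let researchers quietly rescue multiple DarkSide victims at once, and it's why publicizing the flaw, as described next, was so costly.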
Speaker 1: A couple of them were quietly reaching out to victims of DarkSide hacks in order to help them restore their services without having to pay the ransom. And they were doing it quietly because, well, if you let the bad guys know that you've figured out how to counteract their attacks, the bad guys find new ways to attack. Then a cybersecurity company called Bitdefender also found this same vulnerability and released an announcement about it and about how the company had developed a process to counteract the attacks. Now, the intent, I assume, was to let companies know, "Hey, we figured out a mistake that DarkSide is making, so we can help you if you've been attacked by them." Except the bad guys read press releases too, and DarkSide responded by saying that it had taken note of that discovered vulnerability in its attack scheme, and they patched it. They even thanked Bitdefender for the heads-up. So a cybersecurity firm effectively alerted hackers to a weakness in their own attacks, which thus made the hackers better at hacking, which, um, here, let me check my notes here.
Uh yeah, it says 321 00:21:24,240 --> 00:21:27,840 Speaker 1: here that's the exact opposite of what you want to do. Then, 322 00:21:27,920 --> 00:21:31,560 Speaker 1: not long after that, the hackers targeted Colonial Pipeline and 323 00:21:31,600 --> 00:21:34,800 Speaker 1: we saw the consequences of that particular move. And oh, 324 00:21:34,920 --> 00:21:37,960 Speaker 1: we also know that Colonial actually gave in to the 325 00:21:38,000 --> 00:21:41,000 Speaker 1: hackers' demands and paid a ransom of more than four 326 00:21:41,040 --> 00:21:44,600 Speaker 1: million dollars to DarkSide, which again is a bad 327 00:21:44,720 --> 00:21:48,400 Speaker 1: freaking idea. Granted, the crisis we saw as a result 328 00:21:48,600 --> 00:21:52,680 Speaker 1: of that attack wasn't great, and it was exacerbated by 329 00:21:52,920 --> 00:21:56,480 Speaker 1: selfish and frankly stupid actions of a lot of folks 330 00:21:56,480 --> 00:21:58,960 Speaker 1: in the Eastern United States who decided to try and 331 00:21:59,040 --> 00:22:03,240 Speaker 1: hoard gasoline. But paying the hackers means that those hackers 332 00:22:03,240 --> 00:22:05,639 Speaker 1: will keep on doing what they're doing as long as 333 00:22:05,640 --> 00:22:10,840 Speaker 1: it's paying off. Anyway, this particular story got my dander up. 334 00:22:12,280 --> 00:22:16,360 Speaker 1: Continuing our cybersecurity news, ExtremeTech has a piece titled 335 00:22:16,640 --> 00:22:20,879 Speaker 1: New Morpheus CPU Design Defeats Hundreds of Hackers in DARPA 336 00:22:21,000 --> 00:22:25,399 Speaker 1: Tests, and this is pretty cool, alright. So CPUs, or 337 00:22:25,600 --> 00:22:30,000 Speaker 1: central processing units, have a design that you could call 338 00:22:30,080 --> 00:22:34,840 Speaker 1: an architecture, and that architecture is dependent upon which chip 339 00:22:34,920 --> 00:22:39,160 Speaker 1: manufacturer you're talking about.
Essentially, it's, at least in theory, 340 00:22:40,280 --> 00:22:45,119 Speaker 1: an optimized way for data to be processed through 341 00:22:45,119 --> 00:22:48,600 Speaker 1: that chip. So a company like Intel has what they 342 00:22:48,640 --> 00:22:52,760 Speaker 1: call a tick-tock approach. So they'll create one architecture designed 343 00:22:52,800 --> 00:22:55,960 Speaker 1: for a chip. Then in the next generation of chips, 344 00:22:55,960 --> 00:22:59,840 Speaker 1: they keep that same design even as they reduce the 345 00:23:00,080 --> 00:23:03,520 Speaker 1: size of components, so they are able to pack more 346 00:23:03,560 --> 00:23:06,000 Speaker 1: components on a chip, but they're staying with the same 347 00:23:06,520 --> 00:23:09,960 Speaker 1: general design as they had for the previous generation. Then 348 00:23:10,280 --> 00:23:14,240 Speaker 1: they overhaul the architecture for the following generation of chips 349 00:23:14,320 --> 00:23:18,800 Speaker 1: after that. So it's all about finding the ideal architecture 350 00:23:19,160 --> 00:23:23,359 Speaker 1: for that size of components. Then you reduce the size 351 00:23:23,359 --> 00:23:27,000 Speaker 1: of components, then you find the ideal architecture again, and 352 00:23:27,000 --> 00:23:30,240 Speaker 1: you just keep going back and forth between those uh 353 00:23:30,640 --> 00:23:37,720 Speaker 1: those processes. Well, sometimes these architectures have flaws or vulnerabilities 354 00:23:37,760 --> 00:23:40,879 Speaker 1: in them, meaning the flaw is in the hardware itself, and hackers 355 00:23:40,920 --> 00:23:45,720 Speaker 1: who are familiar with the architecture can design code that targets 356 00:23:45,760 --> 00:23:49,640 Speaker 1: those vulnerabilities and could allow them to inject malware into 357 00:23:49,680 --> 00:23:54,520 Speaker 1: systems that rely on those types of chips. Enter the Morpheus.
Now, 358 00:23:54,560 --> 00:23:59,159 Speaker 1: this chip's job was to hide critical information like vulnerabilities 359 00:23:59,400 --> 00:24:03,119 Speaker 1: from hackers without impacting the ability of a developer to 360 00:24:03,200 --> 00:24:07,560 Speaker 1: write code for the machine, which is a non trivial problem. 361 00:24:07,600 --> 00:24:11,080 Speaker 1: The Morpheus is actually a simulated chip and it had 362 00:24:11,280 --> 00:24:15,200 Speaker 1: very modest capabilities. This was not like a screamingly fast, 363 00:24:15,359 --> 00:24:19,439 Speaker 1: bleeding edge processor. But what it does do is it 364 00:24:19,560 --> 00:24:24,480 Speaker 1: encrypts memory pointers every one hundred milliseconds. So in a 365 00:24:24,520 --> 00:24:27,960 Speaker 1: fraction of a second, the CPU switches up the encryption 366 00:24:28,040 --> 00:24:31,360 Speaker 1: scheme for these memory pointers and thus narrows the window 367 00:24:31,600 --> 00:24:35,280 Speaker 1: that hackers would have to, one, figure out the architecture 368 00:24:35,560 --> 00:24:40,359 Speaker 1: properly and, two, launch an attack that targets that architecture. So, 369 00:24:40,400 --> 00:24:42,760 Speaker 1: if I were to make an analogy, I would say 370 00:24:43,119 --> 00:24:46,560 Speaker 1: it's kind of like Hogwarts in the Harry Potter stories. 371 00:24:46,880 --> 00:24:50,960 Speaker 1: The staircases and halls of Hogwarts frequently change, which means 372 00:24:51,240 --> 00:24:54,920 Speaker 1: your route to get from point A to point B changes. Now, 373 00:24:54,960 --> 00:24:59,119 Speaker 1: if Hogwarts were to change every one hundred milliseconds, to us 374 00:24:59,160 --> 00:25:01,240 Speaker 1: it would just seem like it was in a constant 375 00:25:01,359 --> 00:25:04,159 Speaker 1: state of change, and getting from one point to another 376 00:25:04,280 --> 00:25:10,040 Speaker 1: would become incredibly challenging, if not impossible.
That's wizard's chess 377 00:25:10,880 --> 00:25:15,400 Speaker 1: for cybersecurity. Now, keep in mind this approach specifically applies 378 00:25:15,440 --> 00:25:20,560 Speaker 1: to counteracting hacker attempts to exploit underlying vulnerabilities in the 379 00:25:20,600 --> 00:25:24,840 Speaker 1: processors themselves. It does not prevent attacks that target software 380 00:25:24,880 --> 00:25:29,680 Speaker 1: vulnerabilities or attacks that rely on social engineering to get 381 00:25:29,720 --> 00:25:33,480 Speaker 1: access to systems. This is just one facet of cybersecurity, 382 00:25:33,800 --> 00:25:37,080 Speaker 1: but it's a really nifty and effective one. The report says 383 00:25:37,160 --> 00:25:42,040 Speaker 1: that hackers spent the equivalent of thousands of hours 384 00:25:42,080 --> 00:25:47,640 Speaker 1: attempting to compromise the system, and were unable to do it. 385 00:25:48,200 --> 00:25:50,639 Speaker 1: Now, at the moment, this is more of a prototype or 386 00:25:50,680 --> 00:25:55,200 Speaker 1: proof of concept because the simulated chips were seriously underpowered 387 00:25:55,280 --> 00:25:58,560 Speaker 1: compared to today's technology. It would also mean that any 388 00:25:58,640 --> 00:26:01,880 Speaker 1: chips that use this approach would effectively be giving up 389 00:26:01,960 --> 00:26:05,000 Speaker 1: ten to fifteen percent of their performance in order to 390 00:26:05,040 --> 00:26:09,080 Speaker 1: carry out the encryption process in the background. Still, that's 391 00:26:09,119 --> 00:26:11,600 Speaker 1: a small price to pay for better security, at least 392 00:26:11,760 --> 00:26:17,200 Speaker 1: for some organizations and applications.
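If you want to see the moving-target idea in a few lines of code, here is a toy Python simulation. To be clear, this is a hypothetical sketch, not the real Morpheus hardware design: the class, the XOR "encryption," and the manual rekey() call are all my own stand-ins for what the chip does automatically.

```python
# A toy "moving target" memory, loosely inspired by the Morpheus idea:
# pointer values are only ever exposed in encrypted form, and the key
# rotates on a short period, so a leaked value goes stale almost instantly.
import secrets

class MovingTargetMemory:
    def __init__(self) -> None:
        self._key = secrets.randbits(64)   # current pointer-encryption key
        self._table = {}                   # name -> true address (simulation only)

    def store(self, name: str, addr: int) -> None:
        self._table[name] = addr

    def rekey(self) -> None:
        # In a Morpheus-like design this fires automatically every ~100 ms;
        # here we call it by hand to simulate time passing.
        self._key = secrets.randbits(64)

    def encrypted_pointer(self, name: str) -> int:
        # What an attacker's leak or side channel would actually observe.
        return self._table[name] ^ self._key

    def dereference(self, encrypted: int) -> int:
        # The hardware decrypts with whatever key is current right now.
        return encrypted ^ self._key

mem = MovingTargetMemory()
mem.store("return_address", 0x7FFF_DEAD_BEEF)

leaked = mem.encrypted_pointer("return_address")
assert mem.dereference(leaked) == 0x7FFF_DEAD_BEEF  # still usable within the window

mem.rekey()  # the ~100 ms window closes
# The stolen value now decrypts to garbage (barring a 1-in-2^64 key collision).
assert mem.dereference(leaked) != 0x7FFF_DEAD_BEEF
```

The point of the sketch is the last two lines: a pointer an attacker captured during one window decrypts to nonsense in the next, which is the Hogwarts-staircase effect in miniature.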
In previous episodes, I reported 393 00:26:17,200 --> 00:26:20,000 Speaker 1: on how Nvidia, which is known mainly for the production 394 00:26:20,080 --> 00:26:24,960 Speaker 1: of graphics processing units or GPUs, is acquiring the semiconductor 395 00:26:25,000 --> 00:26:28,560 Speaker 1: company ARM, which is based in the UK. The Register 396 00:26:28,760 --> 00:26:31,960 Speaker 1: reports that ARM now has a hiring freeze in effect, 397 00:26:32,359 --> 00:26:35,600 Speaker 1: and that the well-being allowance that the company 398 00:26:35,720 --> 00:26:39,280 Speaker 1: had been paying out to employees is now cut, and 399 00:26:39,320 --> 00:26:43,359 Speaker 1: apparently these decisions will hold until the acquisition is complete, 400 00:26:43,800 --> 00:26:47,880 Speaker 1: which is supposed to happen sometime next April. And this 401 00:26:47,960 --> 00:26:50,359 Speaker 1: means that the ARM teams will not be allowed to 402 00:26:50,440 --> 00:26:55,240 Speaker 1: hire anyone new until then. Even if employees in existing 403 00:26:55,240 --> 00:26:59,440 Speaker 1: positions leave the company, there is no allowance to backfill 404 00:26:59,520 --> 00:27:02,720 Speaker 1: positions that are left empty by departures. And in addition, 405 00:27:02,920 --> 00:27:05,960 Speaker 1: any positions that were going to be filled before the 406 00:27:06,040 --> 00:27:09,639 Speaker 1: hiring freeze now have to remain empty. It sounds 407 00:27:09,680 --> 00:27:11,920 Speaker 1: like working at ARM for the next year is going 408 00:27:11,960 --> 00:27:16,120 Speaker 1: to be a pretty tough experience, with effectively a pay 409 00:27:16,200 --> 00:27:19,840 Speaker 1: cut for employees, and team managers are left without 410 00:27:19,880 --> 00:27:21,639 Speaker 1: an option if they need to add more people to 411 00:27:21,680 --> 00:27:24,919 Speaker 1: their teams.
Any exception has to get the approval of 412 00:27:25,080 --> 00:27:29,400 Speaker 1: executive leadership. ARM issued a statement that said the company 413 00:27:29,520 --> 00:27:31,920 Speaker 1: was ahead of its head count goals for the year, 414 00:27:32,320 --> 00:27:34,440 Speaker 1: and so the freeze really is just to prevent the 415 00:27:34,480 --> 00:27:38,600 Speaker 1: company from overextending itself and quote, remain within the cost 416 00:27:38,640 --> 00:27:43,280 Speaker 1: targets for the business end quote. Acquisitions can be really 417 00:27:43,359 --> 00:27:46,240 Speaker 1: scary for employees, and I should know. I used to 418 00:27:46,240 --> 00:27:50,000 Speaker 1: work for consultants who would help oversee acquisitions, so I 419 00:27:50,040 --> 00:27:52,600 Speaker 1: saw it from that side. And then I worked for 420 00:27:52,640 --> 00:27:57,200 Speaker 1: a company that was acquired several times within ten years. 421 00:27:57,240 --> 00:27:59,800 Speaker 1: So as an employee, you just don't know how your 422 00:28:00,000 --> 00:28:02,240 Speaker 1: role will be impacted or even if you will have 423 00:28:02,320 --> 00:28:05,680 Speaker 1: a place at the company once the dust settles. So 424 00:28:05,840 --> 00:28:08,920 Speaker 1: seeing these decisions with almost a year to go before 425 00:28:08,920 --> 00:28:14,960 Speaker 1: the acquisition completes has got to be rough for those employees. Also, 426 00:28:15,040 --> 00:28:18,080 Speaker 1: we don't know for sure that the British government is 427 00:28:18,080 --> 00:28:21,120 Speaker 1: going to sign off on this acquisition, because there are 428 00:28:21,200 --> 00:28:25,200 Speaker 1: concerns that a company from outside the UK acquiring ARM, 429 00:28:25,400 --> 00:28:29,520 Speaker 1: a company based within the UK, could potentially represent national 430 00:28:29,560 --> 00:28:34,600 Speaker 1: security risks, so we will stay tuned.
And finally, numerous 431 00:28:34,680 --> 00:28:37,800 Speaker 1: sources report that Netflix is looking to get into the 432 00:28:37,880 --> 00:28:42,440 Speaker 1: video games business, possibly offering up a subscription based service 433 00:28:42,560 --> 00:28:46,200 Speaker 1: as early as next year. According to initial reports, it 434 00:28:46,240 --> 00:28:48,960 Speaker 1: sounds like the service would be similar to what you 435 00:28:49,000 --> 00:28:52,480 Speaker 1: see with Apple Arcade, which offers up a subscription based 436 00:28:52,480 --> 00:28:56,480 Speaker 1: service for games that are similar to mobile games in general, 437 00:28:57,000 --> 00:29:00,640 Speaker 1: but they don't have the stuff you typically find 438 00:29:00,640 --> 00:29:04,840 Speaker 1: in mobile games, stuff like in app purchases or advertising. 439 00:29:05,160 --> 00:29:07,960 Speaker 1: So in other words, it sounds like Netflix's offerings 440 00:29:08,080 --> 00:29:11,280 Speaker 1: might be more in the mobile category of games, rather 441 00:29:11,360 --> 00:29:15,200 Speaker 1: than say, a streaming service that offers up access to 442 00:29:15,320 --> 00:29:20,920 Speaker 1: Triple A style video games, akin to Microsoft's Xbox 443 00:29:20,960 --> 00:29:26,440 Speaker 1: Game Pass or Google Stadia, which I guess the less said 444 00:29:26,440 --> 00:29:29,120 Speaker 1: about that the better right now. But it's all still 445 00:29:29,280 --> 00:29:32,600 Speaker 1: very early on and the company is actively looking to 446 00:29:32,640 --> 00:29:37,240 Speaker 1: fill leadership positions that would define Netflix's gaming strategy moving forward, 447 00:29:37,320 --> 00:29:41,680 Speaker 1: so this is an area of active development. We will keep 448 00:29:41,680 --> 00:29:44,520 Speaker 1: our eyes on that as well. And that is the 449 00:29:44,520 --> 00:29:49,360 Speaker 1: news for Tuesday, May twenty one.
As always, if you have 450 00:29:49,400 --> 00:29:51,560 Speaker 1: suggestions for topics you would like me to cover on 451 00:29:51,640 --> 00:29:54,520 Speaker 1: future episodes of tech Stuff, reach out to me on Twitter. 452 00:29:54,880 --> 00:29:57,600 Speaker 1: The handle we use is tech Stuff h s W, 453 00:29:58,200 --> 00:30:06,560 Speaker 1: and I'll talk to you again really soon. Tech 454 00:30:06,560 --> 00:30:10,000 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 455 00:30:10,040 --> 00:30:12,800 Speaker 1: from I Heart Radio, visit the i Heart Radio app, 456 00:30:12,920 --> 00:30:16,080 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.