Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, October sixth, twenty twenty-two. And I mentioned this first item briefly in yesterday's episode, but on Tuesday, after I had already submitted my news episode for the day (so this is the height of rudeness), Elon Musk went ahead and switched gears yet again. Actually, to be fair, he did that on Monday, but the news broke on Tuesday. And what I mean by all this takes us back to the sprawling, chaotic, and messy Twitter acquisition. All right, let's get a "previously on" segment in here to really kind of summarize what has been going on with this whole saga. Back in April (at this point, it feels like it happened an eternity ago), Elon Musk revealed that he had purchased, uh, let me see, I think the technical term is a metric buttload of Twitter stock, like just under ten percent.
Speaker 1: This was enough to merit an invitation to join Twitter's board of directors, and Musk considered doing that, but then he saw in the fine print that if he did join the board of directors, he would have to agree to a limit on how much stock he could own. Now, whether Musk was already considering buying Twitter outright, I don't know, but at some point it became clear that that was what he decided he wanted to do. So Musk then goes on to make an offer on Twitter to buy out all existing stock at fifty-four dollars and twenty cents per share, which would bring the total cost of acquisition to around forty-four billion (with a B) dollars. Musk then went about securing financing for this deal, which included meeting with investors who would put up some of the cash to fund it, and also banks that would loan out more money against Musk's considerable personal assets. Oh, and um, one really important part of this initial agreement is that Musk agreed to waive due diligence, which I think a ton of folks have said was a curious strategy. That's what the kind people are saying.
Speaker 1: Others are saying it was just plain dumb. You wouldn't agree to buy a house without first having an inspector walk through and make sure, you know, the foundation is solid and all that kind of stuff. But things seemed to be set in motion for a Musk acquisition of Twitter. The board of directors was happy with this idea. There was the requirement to hold a shareholder vote to see if shareholders agreed, and that didn't happen until this past September, but ultimately shareholders voted in favor of it too. However, in July, so not that long after the wheels were in motion, Musk appeared to have second thoughts about this acquisition, and there's been a lot of speculation about this. Musk was arguing that Twitter had misrepresented its value in the early negotiations, and he claimed that the platform was absolutely riddled with bot accounts. Twitter claimed that, based upon its own process, bots made up less than five percent of all monetizable accounts. Musk argued there were way, way more than that, possibly as much as ninety percent, which seems absolutely unrealistic to me.
Speaker 1: But then I also think the less than five percent number is at least hard to believe. It might be true, though, but it's hard to believe. Lots of folks suggested a hypothesis that the financial downturn that we've seen this year meant that Musk's own assets were worth way less than what they had been when he first made the offer, and that it was possible that he was also starting to hit some resistance with financing as a result of this. Plus, Twitter itself started to look like a less valuable purchase, helped in no small part by the fact that Musk himself was slagging off Twitter pretty much at every opportunity. And you might say, huh, that seems weird, that you would bad-mouth something you were planning on buying, since you'd actually be creating a larger gap between the price you had agreed to pay and the value of the thing you were buying. But then you're not the world's richest man, right? You just don't see the big picture. Anyway, Twitter filed a lawsuit against Musk in an effort to force him to go through with the acquisition.
Speaker 1: Twitter was arguing that the deal Musk signed didn't leave him the option to just back out, and that's true. Musk can't back out of the deal unless certain specific criteria are met. For example, if Musk could legally prove that Twitter purposefully misrepresented its value to a significant degree, he could potentially walk out of this deal. The details get a little more complicated than that, but that's the basic idea. But even then, Musk would still have to cough up a billion dollars just to walk away. Twitter's lawsuit is set to go to court on October seventeenth, so it's getting here soon. Musk's legal team had tried to push for a later date; they wanted it to happen as late as early next year, but the judge wasn't having it. Twitter was arguing for it to be earlier, in fact, so the October date in some respects was kind of a compromise between the two. Now, the judge has largely been siding with Twitter on pretrial decisions, and that kind of brings us up to this week's news.
Speaker 1: On Monday, Musk filed a letter with the Securities and Exchange Commission recommitting himself to acquiring Twitter on the agreed-upon terms that were made way back in April of this year. But he did ask that this happen, uh, on a couple of preconditions. One of those conditions is that if his financing falls through, he would be allowed to pay the billion-dollar penalty and then walk away. The other condition is that Twitter could please drop its lawsuit, please. The lawsuit was starting to look like it was gonna get real ugly for all involved. In fact, continuing the lawsuit was clearly going to be ugly for both Elon Musk and for Twitter. Already, Elon Musk's personal text messages have been entered into the court pretrial proceedings, which does not paint a good picture. Twitter's dirty laundry is ready to be put on display.
Speaker 1: Plus, the pretrial stuff may have indicated to Musk's legal team that the chances of winning the case were a wee bit on the slim side, and that would mean that if Musk lost the legal battle, he would still have to acquire the company anyway; his punishment is he has to go through with the deal that he already agreed to do. Now, so far, Twitter has not filed for a stay in the court action, and it's kind of understandable, because I feel like if Twitter did drop the case entirely, they might not trust that Elon Musk will follow through on his statement that he is in fact going to acquire the company. However, Musk was originally scheduled to sit for a deposition today, and that has been postponed, so that part has been delayed, at least temporarily. Also, Reuters reports that some banks had held talks with Musk about financing back in the summer but had since backed out, so it's not entirely clear where all the different financing is coming from.
Speaker 1: The ones that are remaining in place are in a potentially really tough position, because it's an economically risky time for banks to get involved in large debt-financing transactions. So this could mean that the banks that are still involved in this are in danger of losing another buttload of money. So what I'm saying is that this deal is not a sure thing. It looks like it's heading towards Elon Musk acquiring Twitter, that seems to be where we are headed, but the details are still a bit fuzzy. Oh, we also heard that before all of this, Musk had been holding some talks with Twitter in an attempt to negotiate a lower price than the one he had previously agreed to. But you know, when the ink is dry on the contract, it's pretty hard to come up with a convincing argument to get the seller to reduce the price. Anyway, now we are all caught up, until later today, probably, in which case I'll have to do another update next week. Well, I might as well put this story next, because we're already talking about Elon Musk.
Speaker 1: Tesla released an announcement stating that it will be shifting to a camera-only approach in its driver-assist technology. That means the company is phasing out stuff like ultrasonic sensors; it already, uh, took out things like radar. So this is Tesla stripping out some of the technologies that it uses for computer vision, really, is what it comes down to. Now, the ultrasonic sensors were mostly used for stuff like parking assist and collision prevention. And it's really interesting to me, because generally speaking, the trend you see in autonomous vehicle companies goes the opposite way. And by that I mean most companies that are working in that field are really building out a robust suite of technologies to support computer vision. They're not relying on a single one, but a whole array ("array" is almost a pun there) of different technologies to try and provide computer vision for these complicated autonomous systems. Tesla says, no, we're just gonna go pure optical camera. We're gonna, we're gonna simplify things.
Speaker 1: So yeah, it's kind of a, it's kind of a reversal of what everyone else is doing. As Tesla does this, the company says drivers should expect certain features to be unavailable at first when they purchase a new Tesla that is camera-only. So those features include stuff like Park Assist; uh, Summon, where you get your car to magically drive up to wherever you are; Autopark is another one. So these kinds of things that the ultrasonic sensors were specifically designed to handle, you know, to maneuver a car without having it bump into stuff that's close by, those are the things that will not be available initially as Tesla moves to this camera-only approach. Now, the company claims that the features will return, but it will take time, because the company has to ensure that the camera-only version can complete these tasks as reliably and as safely as earlier Tesla models that had ultrasonic sensors.
Speaker 1: Now, I imagine this move could bring down the cost of production of the vehicles. They won't need as many different components, so it will probably mean that making a Tesla vehicle will be less expensive, which could mean that we'll see reduced sticker prices on future Teslas. But with so many complex economic factors at play, I wouldn't count on that, just because we have other things, like, you know, inflation, recessions, semiconductor shortages; all these kinds of things play into that. So I'm not saying that the Teslas next year are going to be cheaper than the ones this year. Okay, I've got a lot more stories that have nothing to do with Elon Musk coming up, but first let's take a quick break.

Speaker 1: We're back. On Tuesday, a fire in Amazon's JFK8 warehouse in New York temporarily shut down operations. Amazon sent the day shift home with pay while working with the fire department, which Amazon says certified the building as safe.
Speaker 1: Not too long after getting the fire put out, they had the night shift come in as per normal. And when the night shift got there, some workers reported that there were still areas of the warehouse that had enough smoke to cause problems, you know, like it was hard to breathe, that kind of thing. Now, this is the same Amazon warehouse that voted to unionize earlier this year. That's something that Amazon has yet to formally recognize, although the National Labor Relations Board here in the US has sided with the Amazon workers on this one. And about fifty workers staged a walkout in protest of being told to work in an environment that they felt was inherently unsafe. Amazon has subsequently suspended those workers with pay. Now, I think that's an interesting choice, considering how other Amazon facilities are also getting close to holding votes on unionization, because I feel like this kind of press is more likely to encourage employees to organize into a union, which obviously Amazon does not want to have happen. So making a choice like this seems to fuel the movements to organize.
Speaker 1: Anyway, Amazon is saying it's investigating the matter and that it's going to resolve the suspensions one way or another once that investigation is complete.

Speaker 1: A jury has found Uber's former chief security officer guilty of attempting to cover up a massive hacker intrusion into Uber's systems. All right, let's get some backstory on this. So back in twenty fifteen, Joseph Sullivan became the chief security officer for Uber. He would actually leave Uber in twenty seventeen; he would start to work over at Cloudflare in the same sort of role. Now, a year before Sullivan came to Uber, in twenty fourteen, hackers penetrated Uber's systems, and they accessed databases containing personal information for approximately fifty thousand Uber users and drivers. Uber reported this to the US Federal Trade Commission, or FTC, and this all happened shortly before Sullivan even joined the company.
Speaker 1: But the FTC investigated Uber's security systems and processes and essentially said, yo, if this ever happens to you, you need to let us know, stat. Now, this was the environment into which Sullivan stepped as the new CSO. All right, so in twenty sixteen, a second hacker attack, this one way bigger in scope, hit Uber. The hackers accessed systems that contained data on around fifty-seven million Uber users and drivers. It included more than half a million driver's license numbers, along with other information. Now, the FTC had told Uber that the company has to report these kinds of intrusions promptly, considering the massive effect they can have on millions of people. But Sullivan allegedly chose to cover the whole thing up. Oh, and this hack happened less than two weeks after Sullivan had just appeared before the FTC to give an update on Uber's security systems and practices. So Sullivan had just recently reassured the FTC that Uber's strategy was up to date. Then he found out that Uber was hit by this massive hacker intrusion. Then he decided to cover it up.
Speaker 1: Uber would later pay the hackers one hundred thousand dollars in cryptocurrency, because the hackers claimed they would delete the information they stole only if they were paid a ransom. Uber even found out the identities of a couple of those hackers and convinced them to sign non-disclosure agreements about the breach. So instead of alerting authorities to these people, they're like, hey, don't talk about this ever; all right, sign this deal, you get your money, don't ever talk about it. Meanwhile, these hackers were targeting other companies using very similar approaches to what worked with Uber. So prosecutors argued that what Sullivan was effectively doing was covering for criminals who were continuing to perpetrate digital crimes, while also disguising the fact that Uber had been hit by this. The whole thing went to trial, and, as I said, a jury found Sullivan guilty of a couple of different charges, namely obstruction of justice and one called misprision of felony. I was completely unfamiliar with that phrase, but it means that he knew that a federal felony had been perpetrated, and then he took steps to conceal that felony.
Speaker 1: While he was found guilty, he has yet to be sentenced. He could potentially face up to five years in prison for the obstruction charge and three years for the misprision charge. Fun fact, though: before he started working in security roles for tech companies, Sullivan was a lawyer with the Department of Justice. Oh, how the turns have tabled.

Speaker 1: Yesterday, Intel announced it is getting closer to a process that would allow the company to scale up quantum computer production, specifically chips for quantum computers, using a very similar approach to how it designs chips for classical computers. Now, I've talked about quantum computing before, how a sufficiently powerful quantum computer paired with the right algorithm or program could potentially solve very tricky computational problems in a fraction of the amount of time it would take a classical computer to do that same task. Now, this is not true for all computational tasks, mind you.
Speaker 1: There are some things that a classical computer can do much more efficiently than your typical quantum computer, unless you were somehow able to build a truly massive quantum computer that could compete with a classical computer for those tasks, which would be very, very hard to do. For a subset of computational problems, however, including ones that relate to our current encryption practices, a quantum computer could potentially cause massive disruption. It's possible that with a powerful enough quantum computer and with the right algorithm, you could decrypt pretty much anything that has ever been encrypted in a short amount of time. No more secrets, in other words. So this has pushed research facilities and academic institutions and various companies to work on developing the next generation of encryption tools that would be able to withstand this kind of computational approach.
Speaker 1: Anyway, for a long time, all of this was largely in the realm of the theoretical, because early quantum computers were pretty puny, and they were more of a demonstration of the principles of quantum computing as opposed to a practical implementation. And scaling these quantum computers is really hard to do. It's hard to build more and more powerful quantum computers. It does happen, but it's a very big challenge. They are incredibly expensive machines to build and operate, and it's very, very easy for stuff to go wrong. But Intel's announcement indicates that this theoretical reality may soon manifest as real reality, and not too long from now. Granted, for all of this to really be a problem, you also have to develop those algorithms, or, you know, a series of instructions that a quantum computer would follow to carry out a task, like decrypting stuff that we would otherwise think of as being practically untouchable. It's not like you build a quantum computer and it magically can decrypt things. You have to design an algorithm that effectively leads the quantum computer to do this.
But 315 00:20:25,800 --> 00:20:27,760 Speaker 1: there are people who are working on those algorithms and 316 00:20:27,840 --> 00:20:30,600 Speaker 1: improving them all the time. It's just that you have 317 00:20:30,680 --> 00:20:33,879 Speaker 1: to marry that with a quantum computer of sufficient power 318 00:20:34,320 --> 00:20:37,400 Speaker 1: to make it actually do the thing you wanted to do. 319 00:20:38,280 --> 00:20:40,280 Speaker 1: But this really does mean we're on the precipice of 320 00:20:40,280 --> 00:20:46,000 Speaker 1: an enormous transformation in digital communication and encryption. And that's 321 00:20:46,040 --> 00:20:51,840 Speaker 1: just one possible quantum computing application. There are lots of others, 322 00:20:51,840 --> 00:20:56,640 Speaker 1: so very exciting. Also, you know, notably a little scary 323 00:20:56,880 --> 00:21:00,800 Speaker 1: because of the implications for things like encryption and the 324 00:21:00,920 --> 00:21:06,440 Speaker 1: idea that with the right machine and algorithm, all stuff 325 00:21:06,480 --> 00:21:08,920 Speaker 1: that previously we thought of as being private and safe 326 00:21:08,920 --> 00:21:13,560 Speaker 1: and locked away isn't. So that is concerning. But yeah, 327 00:21:13,760 --> 00:21:16,920 Speaker 1: still also really exciting that this is happening, and hopefully 328 00:21:16,960 --> 00:21:21,920 Speaker 1: things will turn out okay. All right, we're gonna take 329 00:21:21,920 --> 00:21:24,920 Speaker 1: another quick break while I calm myself down, and we'll 330 00:21:25,080 --> 00:21:27,920 Speaker 1: be back to conclude this news episode with a few 331 00:21:27,960 --> 00:21:40,760 Speaker 1: more stories. Okay, while looking at news articles for this episode, 332 00:21:40,800 --> 00:21:44,840 Speaker 1: I came across a headline titled "Social media use linked 333 00:21:44,880 --> 00:21:50,440 Speaker 1: to developing depression regardless of personality."
Now, that headline reinforces 334 00:21:50,640 --> 00:21:54,840 Speaker 1: some preconceived ideas of my own, and it also mirrors 335 00:21:54,880 --> 00:21:58,399 Speaker 1: my own personal experience, so it seems to reinforce my 336 00:21:58,480 --> 00:22:02,760 Speaker 1: anecdotal experience. So my first reaction to this headline was, well, 337 00:22:02,760 --> 00:22:05,440 Speaker 1: of course, I'll go ahead and cover this, but I mean, 338 00:22:05,480 --> 00:22:09,040 Speaker 1: of course it does. However, then I thought, hang on, 339 00:22:09,760 --> 00:22:12,960 Speaker 1: I just did an episode about critical thinking. I should 340 00:22:13,040 --> 00:22:16,400 Speaker 1: use some critical thinking. I should really read about how 341 00:22:16,560 --> 00:22:20,720 Speaker 1: this study was performed and what it concluded. And once 342 00:22:20,880 --> 00:22:23,280 Speaker 1: I did that, and, to be clear, I just 343 00:22:23,400 --> 00:22:25,320 Speaker 1: read the press release. I have yet to read the 344 00:22:25,359 --> 00:22:27,760 Speaker 1: full study. I haven't taken enough time to do that yet. 345 00:22:28,160 --> 00:22:31,080 Speaker 1: But even just reading the press release, I tempered my 346 00:22:31,200 --> 00:22:33,800 Speaker 1: reaction significantly. Now, this is not to say that I 347 00:22:33,840 --> 00:22:39,080 Speaker 1: think the headline is necessarily inaccurate, but rather that I 348 00:22:39,160 --> 00:22:44,280 Speaker 1: have more questions, and I feel like there are some big 349 00:22:44,359 --> 00:22:49,320 Speaker 1: gaps in the reasoning here. Anyway, a group of researchers 350 00:22:49,480 --> 00:22:53,639 Speaker 1: from the University of Arkansas, Oregon State University, the University of Alabama, 351 00:22:53,680 --> 00:22:56,000 Speaker 1: and a couple of others took a sample of one 352 00:22:56,040 --> 00:23:00,000 Speaker 1: thousand US adults between eighteen and thirty years old. Uh.
353 00:23:00,000 --> 00:23:03,240 Speaker 1: Actually, that sample was taken back in twenty eighteen, so this 354 00:23:03,320 --> 00:23:05,880 Speaker 1: is four year old data. That's one thing we've got 355 00:23:05,880 --> 00:23:09,639 Speaker 1: to keep in mind. And they looked to see how 356 00:23:09,840 --> 00:23:13,920 Speaker 1: depression correlated both with social media use as well as with 357 00:23:14,320 --> 00:23:18,879 Speaker 1: certain personality traits, like, are people who express certain types 358 00:23:18,920 --> 00:23:22,080 Speaker 1: of personalities more likely to be depressed if they use 359 00:23:22,160 --> 00:23:24,800 Speaker 1: social media? That was kind of the question. Now the 360 00:23:24,800 --> 00:23:29,200 Speaker 1: press release says, quote, for each personality trait, social media 361 00:23:29,320 --> 00:23:33,640 Speaker 1: use was strongly associated with the development of depression. End quote. 362 00:23:34,359 --> 00:23:36,880 Speaker 1: Now, that phrase has some wiggle room in it, right? 363 00:23:37,240 --> 00:23:42,480 Speaker 1: Strongly associated doesn't necessarily mean there's a causal relationship there, right? 364 00:23:42,800 --> 00:23:44,760 Speaker 1: You have an association, but it doesn't mean that 365 00:23:44,800 --> 00:23:48,439 Speaker 1: one causes the other. But in another part of the 366 00:23:48,440 --> 00:23:53,000 Speaker 1: press release, it says, quote, those with high neuroticism were 367 00:23:53,040 --> 00:23:56,280 Speaker 1: twice as likely to develop depression than those with low 368 00:23:56,359 --> 00:24:00,240 Speaker 1: neuroticism when using more than three hundred minutes of social 369 00:24:00,240 --> 00:24:05,240 Speaker 1: media per day. End quote. Y'all, that better be a typo. 370 00:24:05,720 --> 00:24:08,639 Speaker 1: Three hundred minutes of social media per day. Three hundred 371 00:24:08,640 --> 00:24:13,440 Speaker 1: minutes is five hours.
That is a lot of time 372 00:24:13,480 --> 00:24:16,200 Speaker 1: on social media. I mean, I did a cursory search 373 00:24:16,280 --> 00:24:19,159 Speaker 1: to find out how much the average person spends on 374 00:24:19,200 --> 00:24:22,080 Speaker 1: social media in a day. The number I kept seeing 375 00:24:22,240 --> 00:24:26,040 Speaker 1: was a hundred forty seven minutes. This was on sites like Statista, 376 00:24:26,080 --> 00:24:28,840 Speaker 1: which some people have issues with, but I've seen it 377 00:24:28,920 --> 00:24:30,840 Speaker 1: reported in a couple different places. Of course, they could 378 00:24:30,840 --> 00:24:32,840 Speaker 1: all be getting their data from the same source, but 379 00:24:32,920 --> 00:24:36,720 Speaker 1: still, a hundred forty seven minutes is less than half of three 380 00:24:36,800 --> 00:24:40,520 Speaker 1: hundred minutes, right? So from what I was seeing, the 381 00:24:40,600 --> 00:24:43,480 Speaker 1: average person spends less than half that amount of time. 382 00:24:44,040 --> 00:24:50,480 Speaker 1: So if you're spending three hundred minutes of time per 383 00:24:50,560 --> 00:24:54,760 Speaker 1: day on social media, you are well outside the norm. Also, 384 00:24:54,800 --> 00:24:57,240 Speaker 1: we should point out that according to these sites, they 385 00:24:57,240 --> 00:24:59,880 Speaker 1: are saying a hundred forty seven minutes is average. That's 386 00:25:00,119 --> 00:25:03,399 Speaker 1: an increase of two minutes over twenty twenty one. So, in other words, 387 00:25:04,119 --> 00:25:06,479 Speaker 1: keeping in mind that the data we're looking at 388 00:25:06,520 --> 00:25:10,679 Speaker 1: in this study is four years old, presumably the average 389 00:25:10,680 --> 00:25:13,359 Speaker 1: amount of time spent on social media back in twenty eighteen 390 00:25:13,440 --> 00:25:16,239 Speaker 1: was significantly lower than a hundred forty seven minutes.
So 391 00:25:16,680 --> 00:25:19,840 Speaker 1: three hundred is a crazy outlier, is what I'm saying. 392 00:25:20,320 --> 00:25:23,800 Speaker 1: And I'm also saying, if you're spending twice as much 393 00:25:23,840 --> 00:25:28,000 Speaker 1: time as the average person on social media, maybe there's 394 00:25:28,040 --> 00:25:32,359 Speaker 1: another factor at play that's contributing both to your depression 395 00:25:32,600 --> 00:25:36,000 Speaker 1: and the amount of time you're spending online. Also, I 396 00:25:36,000 --> 00:25:39,639 Speaker 1: should point out, a thousand people, that's a very very small 397 00:25:39,760 --> 00:25:43,120 Speaker 1: sample size. It's hard to draw broad conclusions on such 398 00:25:43,160 --> 00:25:46,120 Speaker 1: a small sample. Now, none of this is to say 399 00:25:46,160 --> 00:25:49,960 Speaker 1: that the research is inherently bad or that the researchers 400 00:25:50,400 --> 00:25:53,920 Speaker 1: all came to faulty conclusions. It's just that, based on 401 00:25:53,960 --> 00:25:57,080 Speaker 1: the press release, I can't really see any kind of 402 00:25:57,200 --> 00:26:01,720 Speaker 1: causal link between social media use and depression. This isn't 403 00:26:01,720 --> 00:26:04,479 Speaker 1: to say there's not a link. There might be. I 404 00:26:04,520 --> 00:26:08,040 Speaker 1: just don't see it being pointed out in this study, 405 00:26:08,119 --> 00:26:12,120 Speaker 1: based on the press release. In fact, I could argue, well, 406 00:26:12,119 --> 00:26:14,159 Speaker 1: what if the causal link goes the other way? What 407 00:26:14,240 --> 00:26:17,679 Speaker 1: if it's not that using social media leads to depression? What 408 00:26:17,800 --> 00:26:22,160 Speaker 1: if having depression leads to an overreliance on social media?
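[Editor's aside: since the episode is doing this comparison out loud, here is the same arithmetic written down, using the figures cited above (the numbers are the episode's, not mine):]

```python
# Figures as cited in the episode.
avg_minutes = 147          # reported average daily social media time (Statista)
threshold_minutes = 300    # the press release's "more than 300 minutes per day"

# The average really is less than half the study's threshold:
assert avg_minutes < threshold_minutes / 2
print(f"{threshold_minutes / avg_minutes:.2f}x the average")  # prints 2.04x the average
```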
409 00:26:22,520 --> 00:26:24,119 Speaker 1: That kind of gets back to that idea of the 410 00:26:24,119 --> 00:26:27,439 Speaker 1: strong association, right? That could be a causal link that 411 00:26:27,480 --> 00:26:30,240 Speaker 1: goes in a different direction. It could just be a correlation. 412 00:26:30,960 --> 00:26:33,800 Speaker 1: So I don't feel like you can draw any firm 413 00:26:33,840 --> 00:26:36,080 Speaker 1: conclusions about this, even though the press release goes on 414 00:26:36,160 --> 00:26:38,119 Speaker 1: to say that people should be cutting back on the 415 00:26:38,160 --> 00:26:41,679 Speaker 1: amount of time they spend on social media. I 416 00:26:41,800 --> 00:26:45,160 Speaker 1: just don't think this study supports that argument, 417 00:26:46,040 --> 00:26:49,199 Speaker 1: at least not with any real evidence. It just 418 00:26:49,240 --> 00:26:53,119 Speaker 1: seems to confirm feelings we have about it. I do 419 00:26:53,240 --> 00:26:56,959 Speaker 1: think that it shows that we need a better designed, 420 00:26:57,359 --> 00:27:00,880 Speaker 1: larger scale study to really look into this, and also 421 00:27:01,240 --> 00:27:03,680 Speaker 1: to acknowledge the fact that these kinds of studies are 422 00:27:03,720 --> 00:27:08,280 Speaker 1: inherently difficult to do. It is so hard to control 423 00:27:08,440 --> 00:27:13,040 Speaker 1: for various variables that can affect things like depression that 424 00:27:13,119 --> 00:27:16,640 Speaker 1: are outside the scope of the study, and because of that, 425 00:27:16,760 --> 00:27:19,560 Speaker 1: it is very difficult to draw any sort of firm conclusion.
426 00:27:19,640 --> 00:27:22,159 Speaker 1: So the whole reason I did this was, once again, to 427 00:27:22,320 --> 00:27:25,080 Speaker 1: reinforce the idea that critical thinking is important, 428 00:27:25,640 --> 00:27:30,400 Speaker 1: especially when you encounter things that reaffirm your previously 429 00:27:30,480 --> 00:27:34,119 Speaker 1: held biases. Because it happened to me this morning, and 430 00:27:34,119 --> 00:27:37,640 Speaker 1: then I took a moment and started asking questions again. 431 00:27:38,320 --> 00:27:43,600 Speaker 1: Maybe it's all absolutely perfectly accurate. There are just some gaps 432 00:27:43,840 --> 00:27:46,960 Speaker 1: there in the reasoning, but you can't say for sure 433 00:27:47,000 --> 00:27:49,520 Speaker 1: one way or the other, which is why I go 434 00:27:49,600 --> 00:27:52,080 Speaker 1: on tirades like this. All right, moving on. Earlier 435 00:27:52,160 --> 00:27:54,479 Speaker 1: this week, the UK government announced plans to build a 436 00:27:54,520 --> 00:27:57,400 Speaker 1: prototype fusion reactor on the site of a decommissioned coal 437 00:27:57,480 --> 00:28:01,520 Speaker 1: mine and to have it up and running by twenty 438 00:28:01,640 --> 00:28:04,800 Speaker 1: forty. Now, the specific approach the UK is taking is 439 00:28:04,880 --> 00:28:09,920 Speaker 1: the Spherical Tokamak for Energy Production, or STEP. A tokamak 440 00:28:10,040 --> 00:28:13,000 Speaker 1: is a machine that can generate incredibly powerful magnetic fields 441 00:28:13,000 --> 00:28:17,000 Speaker 1: meant to contain plasma. Uh.
This is in a doughnut 442 00:28:17,119 --> 00:28:19,960 Speaker 1: shape called a torus, and the purpose of a tokamak 443 00:28:20,040 --> 00:28:25,159 Speaker 1: is to essentially force superheated atoms, which are moving like 444 00:28:25,280 --> 00:28:29,840 Speaker 1: crazy otherwise, but forcing these really fast-moving atoms 445 00:28:30,359 --> 00:28:33,320 Speaker 1: to come into very close contact with one another so 446 00:28:33,359 --> 00:28:36,680 Speaker 1: that they fuse into a heavier atom and release, uh, 447 00:28:36,680 --> 00:28:39,840 Speaker 1: an enormous amount of energy in the process. So fusion, 448 00:28:40,400 --> 00:28:44,040 Speaker 1: if we can get it to work, would absolutely transform 449 00:28:44,080 --> 00:28:47,800 Speaker 1: our approach to energy. Fusion does not have the same 450 00:28:47,880 --> 00:28:52,120 Speaker 1: dangers and drawbacks as nuclear fission does, and researchers have 451 00:28:52,320 --> 00:28:56,000 Speaker 1: created fusion reactions in labs. But the question is whether 452 00:28:56,000 --> 00:28:58,400 Speaker 1: we can figure out a way to make a sustainable 453 00:28:58,440 --> 00:29:01,320 Speaker 1: approach where you're able to, you know, do this more 454 00:29:01,360 --> 00:29:04,480 Speaker 1: than just in an instant and be able to continue 455 00:29:04,600 --> 00:29:07,520 Speaker 1: to have fusion reactions so that you can continue to 456 00:29:07,600 --> 00:29:13,280 Speaker 1: provide energy in the form of electricity, or also if 457 00:29:13,320 --> 00:29:16,080 Speaker 1: we can figure out a way to get more energy 458 00:29:16,120 --> 00:29:18,640 Speaker 1: out than what we put into it. Right, if it 459 00:29:18,720 --> 00:29:23,120 Speaker 1: costs you more energy to start and maintain the set 460 00:29:23,160 --> 00:29:27,160 Speaker 1: of fusion reactions than you get out of those fusion reactions,
461 00:29:27,200 --> 00:29:29,160 Speaker 1: then you're operating at a net loss, and it makes 462 00:29:29,200 --> 00:29:31,040 Speaker 1: no sense to do it. You know, it means that 463 00:29:31,160 --> 00:29:34,320 Speaker 1: you're wasting more energy than you're getting out of it, 464 00:29:34,760 --> 00:29:38,640 Speaker 1: and you could just bypass that whole process and use 465 00:29:38,680 --> 00:29:40,680 Speaker 1: the energy you were going to use to start and 466 00:29:40,720 --> 00:29:46,640 Speaker 1: maintain the reactions to provide electricity to folks. So if 467 00:29:46,680 --> 00:29:49,400 Speaker 1: we do figure it all out, that would be great. 468 00:29:49,640 --> 00:29:54,360 Speaker 1: It would be transformational. Twenty forty seems like an incredibly 469 00:29:54,480 --> 00:30:02,080 Speaker 1: ambitious and aggressive deadline to me. Um, maybe it's accurate, 470 00:30:02,160 --> 00:30:05,240 Speaker 1: but because I haven't done research on fusion in a 471 00:30:05,280 --> 00:30:09,840 Speaker 1: little while, maybe the advancements we've made make it less 472 00:30:10,560 --> 00:30:14,040 Speaker 1: unlikely than I feel it is. But I'm gonna 473 00:30:14,040 --> 00:30:16,680 Speaker 1: need to do more research. At the moment, I would 474 00:30:16,720 --> 00:30:21,320 Speaker 1: be shocked if a working prototype fusion reactor were up 475 00:30:21,360 --> 00:30:24,840 Speaker 1: and running by twenty forty. Um, I think it's more likely 476 00:30:24,920 --> 00:30:27,920 Speaker 1: to go well over schedule and probably way over budget, 477 00:30:27,960 --> 00:30:31,200 Speaker 1: which is estimated to be in the, like, ten billion 478 00:30:31,320 --> 00:30:34,480 Speaker 1: pounds range. Remember we're talking about the UK here, so 479 00:30:34,600 --> 00:30:38,560 Speaker 1: pounds rather than dollars. But maybe, um, if it works, 480 00:30:38,640 --> 00:30:40,720 Speaker 1: that would be amazing. I would love to see it happen.
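[Editor's aside: the break-even logic above is usually expressed as the fusion gain factor Q, the ratio of fusion energy produced to heating energy put in; Q greater than 1 is the bare minimum for the reactor to make sense. A minimal sketch, with invented example numbers:]

```python
def gain_factor(energy_out_mj: float, energy_in_mj: float) -> float:
    """Fusion gain Q = energy out / energy in; Q > 1 means net energy gain."""
    return energy_out_mj / energy_in_mj

# Invented values, just to show the break-even test the episode describes:
assert gain_factor(energy_out_mj=50.0, energy_in_mj=100.0) < 1    # net loss: pointless to run
assert gain_factor(energy_out_mj=300.0, energy_in_mj=100.0) > 1   # net gain: the goal
```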
481 00:30:41,320 --> 00:30:43,960 Speaker 1: I just... I've got a bad feeling about this, 482 00:30:44,360 --> 00:30:46,520 Speaker 1: but I do need to do more research to see 483 00:30:46,800 --> 00:30:51,640 Speaker 1: what progress we've made in fusion, because maybe twenty forty is a 484 00:30:51,760 --> 00:30:56,280 Speaker 1: reasonable estimate. I would love to see it happen. The 485 00:30:56,360 --> 00:30:59,960 Speaker 1: Great Firewall Report, which is an organization that analyzes China's 486 00:31:00,200 --> 00:31:02,760 Speaker 1: censorship approach to the Internet, has stated that the country's 487 00:31:02,800 --> 00:31:06,800 Speaker 1: government has made some updates to the technology it uses to 488 00:31:07,040 --> 00:31:11,560 Speaker 1: prevent Chinese citizens from accessing anything that the government doesn't 489 00:31:11,560 --> 00:31:15,280 Speaker 1: approve of, which includes a whole laundry list of different things, 490 00:31:15,280 --> 00:31:19,000 Speaker 1: including stuff that criticizes the Chinese government. And the report 491 00:31:19,080 --> 00:31:22,720 Speaker 1: states that beginning on October three, the Chinese systems were 492 00:31:22,800 --> 00:31:27,760 Speaker 1: starting to block, quote, TLS-based censorship circumvention servers, 493 00:31:28,000 --> 00:31:30,560 Speaker 1: end quote. TLS, by the way, stands for 494 00:31:30,600 --> 00:31:34,200 Speaker 1: Transport Layer Security. It's a cryptographic protocol. It's used all 495 00:31:34,240 --> 00:31:36,800 Speaker 1: over the place. You've used it all the time without 496 00:31:36,840 --> 00:31:40,080 Speaker 1: even knowing about it, and it's essentially used to encrypt 497 00:31:40,120 --> 00:31:42,920 Speaker 1: data in an effort to provide security and privacy and 498 00:31:43,040 --> 00:31:47,320 Speaker 1: authentication for communications.
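[Editor's aside: the "fingerprinting" mentioned in this story generally works on the handshake, not the encrypted payload. One widely used technique of this kind, JA3, hashes the field values a client offers in its TLS ClientHello (version, cipher suites, extensions, and so on), which are visible on the wire. A simplified sketch of the idea, with made-up field values; this is not the GFW's actual implementation:]

```python
import hashlib

def ja3_style_fingerprint(version: int, ciphers: list[int], extensions: list[int]) -> str:
    """Hash the visible ClientHello parameters into a stable identifier,
    roughly the way JA3 does. A censor can then block matching fingerprints."""
    raw = f"{version},{'-'.join(map(str, ciphers))},{'-'.join(map(str, extensions))}"
    return hashlib.md5(raw.encode()).hexdigest()

# Made-up values for illustration; a real ClientHello carries more fields.
fp = ja3_style_fingerprint(771, [4865, 4866, 4867], [0, 10, 11])
assert len(fp) == 32  # a stable 32-hex-character fingerprint per client configuration
```

[Circumvention tools that survive this kind of blocking generally try to make their handshakes indistinguishable from ordinary browser traffic.]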
So these servers were relying on a 499 00:31:47,360 --> 00:31:50,760 Speaker 1: protocol to essentially provide cover so that a user in 500 00:31:50,840 --> 00:31:53,200 Speaker 1: China would be able to access information that the government 501 00:31:53,200 --> 00:31:57,080 Speaker 1: would otherwise censor. But now China has found a way 502 00:31:57,080 --> 00:32:01,280 Speaker 1: to essentially fingerprint these TLS-based servers and to block 503 00:32:01,320 --> 00:32:05,160 Speaker 1: access to them, and that sounds nasty for all sorts 504 00:32:05,160 --> 00:32:08,520 Speaker 1: of reasons. To me, it's not exactly surprising given China's 505 00:32:08,600 --> 00:32:12,720 Speaker 1: historical approach to information and communication. However, there are other 506 00:32:12,800 --> 00:32:16,840 Speaker 1: tools, like NaiveProxy, that still work right now, but 507 00:32:16,960 --> 00:32:21,840 Speaker 1: the TLS-based circumvention tools are being targeted pretty effectively. 508 00:32:22,480 --> 00:32:25,160 Speaker 1: And finally, one story that's been unfolding in the gaming 509 00:32:25,160 --> 00:32:29,040 Speaker 1: world revolves around the team-based shooter game Overwatch two, 510 00:32:29,280 --> 00:32:32,480 Speaker 1: which has had no shortage of controversies surrounding it, and 511 00:32:32,520 --> 00:32:34,760 Speaker 1: the company behind it, Activision Blizzard, is kind of a 512 00:32:34,800 --> 00:32:39,160 Speaker 1: ground zero for controversy in general. One controversy is that 513 00:32:39,240 --> 00:32:42,280 Speaker 1: Overwatch two is, of course, the sequel to Overwatch, and 514 00:32:42,360 --> 00:32:45,760 Speaker 1: that upon the release of Overwatch two, Blizzard shut 515 00:32:45,800 --> 00:32:50,240 Speaker 1: down the original game, so players can't play Overwatch anymore. 516 00:32:50,520 --> 00:32:55,160 Speaker 1: They're forced to play Overwatch two.
And another issue is 517 00:32:55,200 --> 00:32:58,960 Speaker 1: that until recently, Blizzard was going to require all Overwatch 518 00:32:59,080 --> 00:33:02,280 Speaker 1: two players to have an active phone number that 519 00:33:02,280 --> 00:33:06,360 Speaker 1: would be associated with their Battle.net account. And 520 00:33:06,400 --> 00:33:08,280 Speaker 1: the thinking behind this was to create kind of a 521 00:33:08,360 --> 00:33:12,440 Speaker 1: multi-factor authentication system to establish not just player identity, 522 00:33:12,480 --> 00:33:15,440 Speaker 1: but to be able to single out cheaters and abusive players. 523 00:33:15,720 --> 00:33:19,280 Speaker 1: Right? Like, if you have someone who's cheating in a game, 524 00:33:19,840 --> 00:33:23,440 Speaker 1: or they're hurling slurs around and they're just trolling, 525 00:33:24,080 --> 00:33:28,040 Speaker 1: you can flag them. And now Blizzard could not just 526 00:33:28,160 --> 00:33:32,560 Speaker 1: ban that account, but ban any account associated with that 527 00:33:32,720 --> 00:33:35,760 Speaker 1: phone number. So that way, the player would not just 528 00:33:35,800 --> 00:33:38,120 Speaker 1: be able to go out, create a new account, and 529 00:33:38,200 --> 00:33:41,800 Speaker 1: go right back to abusing the system and the players 530 00:33:41,880 --> 00:33:46,080 Speaker 1: in it. They would be completely banned as long as 531 00:33:46,160 --> 00:33:49,040 Speaker 1: they were relying on that phone number. If they switched 532 00:33:49,040 --> 00:33:51,240 Speaker 1: phone numbers, then it was a different story. But obviously 533 00:33:51,280 --> 00:33:55,040 Speaker 1: that's a harder thing to do. But Blizzard has since 534 00:33:55,040 --> 00:33:58,680 Speaker 1: walked this back a little bit.
Now, anyone who was 535 00:33:58,720 --> 00:34:02,680 Speaker 1: playing Overwatch with a connected Battle.net account since June ninth 536 00:34:04,520 --> 00:34:07,320 Speaker 1: will not have to submit a phone number to connect 537 00:34:07,360 --> 00:34:10,480 Speaker 1: to their Battle.net account. This change will start to 538 00:34:10,600 --> 00:34:15,399 Speaker 1: roll out beginning tomorrow, October seven. However, accounts that were 539 00:34:15,440 --> 00:34:19,759 Speaker 1: not connected to Battle.net by then will have to be 540 00:34:19,800 --> 00:34:23,399 Speaker 1: in order to play Overwatch two, and all new accounts 541 00:34:23,840 --> 00:34:27,720 Speaker 1: with Overwatch two will have to submit a phone number 542 00:34:27,760 --> 00:34:31,759 Speaker 1: in order to play. However, this phone number system will 543 00:34:31,800 --> 00:34:35,200 Speaker 1: not accept certain types of phone numbers, like those belonging 544 00:34:35,239 --> 00:34:39,359 Speaker 1: to a Voice over Internet Protocol, or VoIP, line, 545 00:34:39,560 --> 00:34:43,279 Speaker 1: or phones that use prepaid SIM cards, which means that 546 00:34:43,320 --> 00:34:47,160 Speaker 1: gamers who rely on those kinds of phones will effectively 547 00:34:47,200 --> 00:34:50,000 Speaker 1: get locked out of playing Overwatch two if they 548 00:34:50,040 --> 00:34:54,720 Speaker 1: aren't in that little window of time where they get 549 00:34:54,760 --> 00:35:00,920 Speaker 1: an exemption from this policy. So this disproportionately affects lower 550 00:35:01,000 --> 00:35:04,800 Speaker 1: income gamers who might be reliant upon prepaid SIM cards 551 00:35:05,120 --> 00:35:10,080 Speaker 1: because they can't afford your typical phone contract. So because 552 00:35:10,120 --> 00:35:13,360 Speaker 1: they don't have the type of phone that is interoperable 553 00:35:13,400 --> 00:35:17,680 Speaker 1: with Blizzard's system, they can't play Overwatch two.
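[Editor's aside: the ban scheme described above effectively keys bans on the phone number instead of the account. A hypothetical sketch of that idea; account names and numbers are invented, and this is not Blizzard's actual implementation:]

```python
# Hypothetical data: accounts mapped to the phone number they registered with.
accounts = {
    "cheater": "+15550001",
    "cheaters_alt": "+15550001",   # same phone number, different account
    "fair_player": "+15550002",
}
banned_numbers: set[str] = set()

def ban(account: str) -> None:
    """Ban by phone number, so every account sharing that number is locked out."""
    banned_numbers.add(accounts[account])

def can_play(account: str) -> bool:
    return accounts[account] not in banned_numbers

ban("cheater")
assert not can_play("cheaters_alt")  # the alt shares the number, so it's out too
assert can_play("fair_player")
```

[It also shows why rejecting VoIP and prepaid numbers locks some players out entirely: without an accepted number, there is no key to register under at all.]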
And 554 00:35:18,640 --> 00:35:22,520 Speaker 1: this creates an accessibility issue. So it seems to me 555 00:35:23,239 --> 00:35:26,640 Speaker 1: that Blizzard was trying to solve one problem, that is, 556 00:35:26,840 --> 00:35:33,040 Speaker 1: online abuse and cheating, and inadvertently created an accessibility problem. Also, 557 00:35:33,760 --> 00:35:38,160 Speaker 1: we haven't seen yet whether Blizzard's approach 558 00:35:38,239 --> 00:35:43,480 Speaker 1: will actually affect abuse and cheating, if it will actually 559 00:35:43,520 --> 00:35:47,480 Speaker 1: work, right? So we don't know if their solution works 560 00:35:47,600 --> 00:35:49,640 Speaker 1: to solve the problem it was meant to solve, but 561 00:35:49,719 --> 00:35:53,160 Speaker 1: we do know it has created a totally different problem. 562 00:35:53,320 --> 00:35:57,239 Speaker 1: Fun times. All right, that's it for the news for Thursday, 563 00:35:57,320 --> 00:36:00,759 Speaker 1: October six, twenty twenty two. I hope you all 564 00:36:00,840 --> 00:36:03,960 Speaker 1: are well. If you want to suggest topics for me 565 00:36:04,000 --> 00:36:06,239 Speaker 1: to cover in future episodes of tech Stuff, there are 566 00:36:06,239 --> 00:36:08,239 Speaker 1: a couple of different ways of doing it. One is 567 00:36:08,280 --> 00:36:11,480 Speaker 1: to download the I Heart Radio app. It's free to download, 568 00:36:12,200 --> 00:36:14,760 Speaker 1: it's free to use. You can navigate over to tech Stuff. 569 00:36:15,280 --> 00:36:18,120 Speaker 1: You can click on the little microphone icon and leave 570 00:36:18,200 --> 00:36:20,480 Speaker 1: me a voice message up to thirty seconds in length. 571 00:36:20,600 --> 00:36:23,160 Speaker 1: And by the way, shout out to Tom Valdez for 572 00:36:23,320 --> 00:36:26,959 Speaker 1: sending me some really great messages. Really appreciate it, Tom, 573 00:36:27,000 --> 00:36:29,400 Speaker 1: thanks so much.
You could be like Tom: click on 574 00:36:29,440 --> 00:36:32,600 Speaker 1: that little icon, leave a message. I've got a couple 575 00:36:32,640 --> 00:36:34,719 Speaker 1: of shows that I'm going to be doing in 576 00:36:34,760 --> 00:36:37,920 Speaker 1: the near future that came from those suggestions. If you 577 00:36:37,960 --> 00:36:39,399 Speaker 1: don't want to do that, the other way to reach 578 00:36:39,440 --> 00:36:41,880 Speaker 1: out is on Twitter, at least as long as Twitter 579 00:36:41,960 --> 00:36:45,800 Speaker 1: is still a working entity. Who knows when that might change, 580 00:36:46,080 --> 00:36:48,680 Speaker 1: but it's working now, so you can leave me a message, 581 00:36:48,800 --> 00:36:52,719 Speaker 1: and my handle there is tech Stuff H S 582 00:36:53,000 --> 00:36:58,080 Speaker 1: W, and I'll talk to you again really soon. 583 00:37:02,360 --> 00:37:05,359 Speaker 1: Tech Stuff is an I Heart Radio production. For more 584 00:37:05,440 --> 00:37:08,840 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 585 00:37:08,960 --> 00:37:12,120 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.