Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for the tech news for Tuesday, April 26, 2022. And after all the Twitter polls, the poison pills, and the posturing (that's a lot of alliteration), Elon Musk and Twitter's board of directors have come to an agreement for Musk to purchase Twitter. Now, that deal isn't done yet. Shareholders have to agree on the acquisition, and it also has to pass regulatory approval. But that being said, I think the general consensus is that there aren't going to be any serious roadblocks in the way of this deal, and it will in fact go through. It's a heck of a story. I anticipate this is going to be one of the really big stories when folks do retrospectives on the most important tech news of 2022.
Speaker 1: Elon Musk has had a, let's say, controversial time on Twitter, using the platform to post stuff that got him into serious hot water with the SEC, something he protests and says was unmerited. Musk has also had a bit of a dodgy past when it comes to following regulations, and he's been known to use Twitter to promote certain cryptocurrencies, which nearly always has had a massive, and frequently short-term, effect on that cryptocurrency's value. That has led some to accuse Musk of using his position to try and pump and then dump crypto in an effort to make a tidy profit at the expense of marks. Now, Musk has said he plans on overseeing some really big changes at Twitter, including authenticating every human on the platform and eliminating bots.
Speaker 1: He has also repeatedly called for Twitter to be a better steward of free speech, which has led some to think that Musk might reverse some big decisions, like the lifelong ban on former President Donald Trump. And that has a lot of people concerned that Twitter's trend toward cracking down on misinformation and disinformation campaigns could be reversed. Free speech is a pretty tricky thing, too, because even the United States has limits on free speech. It's not absolute. For example, the United States Supreme Court has determined on numerous occasions that free speech does have boundaries; libel and slander are two of those, so libel and slander don't count as free speech. And so there's concern that Musk's interpretation of free speech might allow for rampant abuse on the platform. Musk has said that there will still be content moderation, and that extreme tweets, like those calling for physical violence against people, will not be allowed, so it sounds like there will be limitations there.
Speaker 1: He's also indicated that he intends Twitter to work within the laws of various countries, that he's not advocating that Twitter flout laws around the world. And that could also mean that we're talking about free speech with limitations on Twitter as well as elsewhere. So a lot of people are kind of confused about this, about where it's going to go. Honestly, it's far too early to say, because we're not there yet. But a lot of people have said they are leaving the platform. Much fewer are actually leaving. So a lot of people are saying, "That's it, I'm moving," and then they stay where they are, which, you know, happens every time we see a massive change in leadership, whether we're talking about a social platform or a country. Jack Dorsey, one of the co-founders and the former CEO of Twitter, says that Musk is the quote "singular solution I trust" end quote. So there's definitely debate on whether or not Musk taking over Twitter is a good or bad thing.
Speaker 1: I anticipate that I will maintain my TechStuffHSW Twitter feed for this show, but I have already sunset, although not deactivated, my personal account. By "sunset," I just mean I logged out. And the reason I'm not deleting that account isn't because I'm worried I'm going to miss it. Instead, it's that I don't want to surrender my Twitter handle and have someone else take it over and then potentially post a lot of nasty stuff that could, for a casual follower, make them think that it was me. I don't want that to ever happen. Now, for the record, I don't think I'm really important enough for anyone to actually care about doing that, but I feel as a verified user it's probably for the best that I don't open up that possibility. Anyway, I also want to say to all of you that staying on or leaving Twitter is a personal choice. If you love Twitter and you don't want to go and it's useful to you, those are all legitimate reasons, and I feel you should feel fine about staying on Twitter. You know, my own choice doesn't reflect my opinion of how other people operate.
Speaker 1: But if you do feel uneasy and you want to skedaddle, that's fine too. Even if it's just temporary, that's fine. Don't let anyone dictate to you what you should or should not do. For me, this is something I felt I needed to do, both for myself, as it can have an impact on my mental well-being, and also because I genuinely feel that I won't be doing much good on Twitter. Arguably, I haven't done much good up to this point. And I'm sure we'll have plenty more stories about this as time goes on. That's enough for now, I think, but the Twitter conversation is absolutely dominating the tech news space right now. It's lucky that I was able to find a few other stories that we can touch on, too. One is that Netflix employee morale is taking a serious hit. Now, this should come as no real surprise. Last week, Netflix began a serious slide in the stock market after reporting that the company had a net loss of 200,000 subscribers around the world in the first quarter of 2022.
Speaker 1: For many at Netflix, this slide has had a massive personal impact, particularly for people at the executive level, folks who own shares in Netflix, because Netflix, like a lot of tech companies, has rewarded employees with compensation in the form of shares. Well, folks who have shares are seeing their wealth decline every time the stock price takes another hit, so it's like you're losing money working for the company. Then there's the general caution that a lot of people feel about Netflix after its first decline since it launched. I mean, remember the earnings call where we found out about the 200,000-subscriber decline? That was the first decline in the history of Netflix. But apparently the same concern that we're seeing outside Netflix is prevalent within the ranks of Netflix itself. A lot of people are reportedly considering leaving the company before things get worse, particularly, again, at the executive and management levels. This is according to Bloomberg, by the way, which published an article about the whole ordeal. I still personally find this story a bit perplexing.
Speaker 1: It's hard for me to reconcile that a company that has been as monumentally successful as Netflix could be so fragile that its first really bad quarter sends the message that the empire is on shaky ground. And here's the thing that really wrinkles my brain, as Troy would say: if enough people buy into that perspective, that Netflix is in really serious trouble, it can become a self-fulfilling prophecy. Now, all that being said, it is clear that there are huge challenges for Netflix and for other streaming platforms. There's a limit to the audience out there, right? Even if you are able to sign up everyone who is interested, you eventually do stop getting new subscribers because there's no one left to sign up. And that's before you start taking into consideration the incredible competition in the space, because there are so many different streaming platforms out there now, and we're probably going to see more before we start to see things shake out. In the long run, I don't think all of those streaming services are going to survive.
Speaker 1: Some of them will end up getting folded into others, or they might end up changing hands. We'll see. But I always assumed Netflix would be one of the bedrock streaming services that would be around; it really established itself in such a dominant position so early. But it's also too early to suggest that the company is really at the beginning of the end. This could be a blip in Netflix's history, a serious one, but one that the company fully recovers from. We just have to wait and see.

Speaker 1: The Verge reports that Apple has retained the services of a law firm called Littler Mendelson, widely known as a union-busting law firm. This is the same firm that Starbucks hired in that company's efforts to counteract various union pushes throughout its stores. And this comes on the heels of an Atlanta Apple store's recent successful push to secure enough employee signatures to warrant a union election. The flagship Apple store in New York City is also in the process of gathering signatures in order to have the same kind of vote.
Speaker 1: And I am left to wonder if the hiring of law firms like this one ultimately ends up fueling the resolve of employees rather than discouraging them. It's hard for me to say from the outside; I don't have an inside perspective on these things. It may well be that the efforts to organize are succeeding despite heavy opposition, rather than becoming energized by companies' efforts to squash them. I honestly can't tell. But again, it's another indication that employees in the tech sector are really pushing back against the established status quo. We've got several more stories to go over. Before we get to those, let's take a quick break.

Speaker 1: We're back. And over in Japan, Apple is facing another battle, as is Google. A government report in Japan named Apple and Google as essentially a duopoly when it comes to the smartphone operating system market, and that's totally true. Google is way in the lead worldwide. Google has about 70 percent of the smartphone operating system market share, according to StatCounter.
Speaker 1: Apple is at a healthy 27.57 percent, which means that together, Apple and Google, or, you know, iOS and Android, make up nearly 98 percent of the smartphone operating system market. It's kind of hard to argue against the duopoly label when you have a figure like that. The government report expressed concern that this arrangement, this duopoly, creates an unfair marketplace for app developers and consumers alike, pointing out that it is common practice for Android phones to come pre-installed with, say, the Chrome browser, or for Apple to favor Safari. The report argues that this stifles competition from other browsers and actively discourages users from installing any browsers besides the ones that come pre-installed on the operating system. Further, the report argues that developers are forced into these ecosystems in order to have their work accessible to people who use these smartphones, and that Apple and Google should allow developers and users to choose alternative app stores beyond the official ones on each platform.
Speaker 1: So, in other words, you should be able to go to other stores besides the Apple App Store and download apps for your iPhone. Now, this is pretty much the opposite of Apple's market strategy, and unsurprisingly, the company has protested the report, arguing that Apple faces fierce competition in every arena. It remains to be seen what this report will actually lead to in Japan, but it marks another spot in the world where big companies like Apple and Google are starting to face resistance from regulators, like serious resistance.

Speaker 1: And we've got another story about the shaky world of NFTs. All right, so quick reminder: an NFT is a non-fungible token, which really means it's a digital certificate showing ownership of some sort of digital asset, and it is tracked on a blockchain of some sort so that there is a clear chain of transactions. You can trace who owned it at what point and who currently owns that digital asset.
Speaker 1: As far as what that asset is, it could be a line of code, it could be a piece of digital art, it could be an item within a video game, or pretty much anything else that's digital. And NFTs have been put through the wringer after a chaotic hype cycle that had its fair share of speculation, scams, and other shenanigans. The Verge reports that a hacker has managed to steal NFTs estimated to be worth millions of dollars using one really tricky method and one super easy method. Now, the tricky part was that the hacker was able to gain access to the official Instagram account for the Bored Ape Yacht Club, or BAYC. That's one of the entities that is famed for minting NFTs of digital art, and folks have been speculating ridiculous amounts of money on these pieces of art. This is the kind of stuff that got people a bit wary of NFTs in the first place, when they look at the art and they say, wait, how much money are people spending for a digital certificate showing ownership of this thing?
233 00:14:29,560 --> 00:14:32,640 Speaker 1: And uh, we actually don't know how the hacker managed 234 00:14:32,680 --> 00:14:35,200 Speaker 1: to get access of that Instagram account. As of the 235 00:14:35,240 --> 00:14:38,320 Speaker 1: recording of this podcast, the b A y C says 236 00:14:38,400 --> 00:14:42,960 Speaker 1: that it had activated two factor authentication on its Instagram account, 237 00:14:43,560 --> 00:14:46,800 Speaker 1: so it should have been impossible for anyone not in 238 00:14:46,880 --> 00:14:50,600 Speaker 1: possession of whatever that second factor was to be able 239 00:14:50,640 --> 00:14:54,120 Speaker 1: to get access to the account. Maybe it was an 240 00:14:54,120 --> 00:14:57,640 Speaker 1: inside job, but the hacker then used the official account 241 00:14:57,720 --> 00:15:01,200 Speaker 1: to send out a fishing link that, if followed, would 242 00:15:01,280 --> 00:15:04,560 Speaker 1: prompt n f T holders to interact with that link, 243 00:15:05,080 --> 00:15:08,560 Speaker 1: and the link would lead the hacker to joint crypto 244 00:15:08,600 --> 00:15:11,880 Speaker 1: tokens right out of those users wallets. So, in other words, 245 00:15:12,160 --> 00:15:14,920 Speaker 1: folks got a link from what they assumed to be 246 00:15:15,040 --> 00:15:17,480 Speaker 1: a trusted source. They got it from the official b 247 00:15:17,640 --> 00:15:22,560 Speaker 1: A y C Instagram account, which should be trustworthy, and 248 00:15:22,800 --> 00:15:25,720 Speaker 1: they ended up getting robbed because of that. Now, the 249 00:15:25,760 --> 00:15:29,040 Speaker 1: hacker convinced folks to follow follow the fishing link by 250 00:15:29,120 --> 00:15:31,560 Speaker 1: leaning on one of the most reliable techniques in the 251 00:15:31,640 --> 00:15:35,720 Speaker 1: history of humans, which is counting on greed. 
Speaker 1: So the link proclaimed to be an airdrop of tokens, and if folks joined and connected their wallets to the link, they were supposedly going to be showered with what amounts to free money, really free digital assets, which could later be sold or auctioned off for money. And folks love the idea of free money, so there were a lot of takers. OpenSea, the famous crypto exchange, has subsequently banned the hacker's wallet address on that platform, but it is still visible on other platforms. No word yet on how, or if, those who were affected by this theft will be compensated.

Speaker 1: The Register reports that the Department of Homeland Security, after initiating a Hack DHS program in which the department invited hackers to look for vulnerabilities and exploits in Homeland Security's systems, discovered that its systems are far from secure. The 450 security researchers who took part in this event uncovered 122 vulnerabilities, 27 of which they deemed to be critical. That's pretty darned bad, but it really illustrates how these events are critical for organizations, government or otherwise, offering bug bounties.
Speaker 1: That is a payment to people who uncover and subsequently report vulnerabilities in systems. It gives organizations a chance to patch those holes and prevent actual black-hat hackers from exploiting those vulnerabilities. I find it, frankly, a little scary that a department that has, let's face it, a checkered past had so many critical vulnerabilities in its various systems. But I am glad that the department is taking steps to address that in a responsible way. Penetration testing is a pretty big deal, and I might need to get my friend Shannon Morse back on the show to talk about how that process typically works. She's an expert in that field and I am not.

Speaker 1: The White House has urged Congress to pass legislation that would allow more law enforcement agencies in the United States access to drone tracking systems. Now, to be clear, this isn't about those agencies operating drones in an effort to track citizens or identify suspects. Instead, this is about giving those agencies the ability to detect and track the drones themselves, presumably in an effort to counteract crime and terrorism efforts that rely on drone technology.
Speaker 1: Now, this is not yet a law. It's just a suggestion to Congress, and it marks the continuation of the US government's increased concern about drones and drone operation. One story that I missed in 2021 was the FAA's new drone rules that require any new drones sold in the United States to broadcast the drone operator's physical location. And that kind of makes sense in some scenarios. Like, say there's a person who's operating a drone in a restricted area, which could pose a threat even if the person didn't intend that. It could be critically important to track that person down and be able to stop them, or to tell that person what's going on so that they can move the drone out of the restricted area, because drones can pose a threat to things like aircraft, for example. However, this capability, the idea of being able to track an operator's physical location, also has some troubling consequences.
Speaker 1: For instance, during the Black Lives Matter protests in 2020, some folks were using drones to keep an eye on police activity, and it doesn't take much of a stretch of the imagination to assume that a law enforcement agency that was under citizen surveillance might use the drone's broadcasting of the operator's physical location to track down and stop that operator. So it really does go both ways here. And it also raises the point that drone tracking technology now is effectively operator tracking technology, right? If you're talking about a drone that was purchased legally in the United States after this FAA rule came into effect, then you could say, well, yeah, it's important to track the drones, but we also have to remember that this could be abused so that police could track the operators. It's not even necessarily about the technology in that respect. It's more, again, a kind of citizen surveillance. So there are definitely some dark consequences to this as well.
A UK watchdog called the 324 00:20:24,000 --> 00:20:27,360 Speaker 1: Competition and Markets Authority may soon have some new powers 325 00:20:27,359 --> 00:20:30,680 Speaker 1: when it comes to holding companies accountable for fake reviews 326 00:20:30,720 --> 00:20:36,000 Speaker 1: posted on their websites. So we're talking about user-generated 327 00:20:36,080 --> 00:20:38,399 Speaker 1: reviews for the most part, and the practice of certain 328 00:20:38,520 --> 00:20:42,879 Speaker 1: entities, like shady product companies or sometimes the PR firms 329 00:20:42,960 --> 00:20:48,159 Speaker 1: hired to promote those products, of either posting directly or 330 00:20:48,240 --> 00:20:52,920 Speaker 1: hiring people to post positive fake reviews in an effort 331 00:20:52,960 --> 00:20:57,360 Speaker 1: to market their goods. The proposals in the UK 332 00:20:57,520 --> 00:21:02,120 Speaker 1: suggest making it quote clearly illegal end quote to hire 333 00:21:02,240 --> 00:21:06,680 Speaker 1: someone to write or host fake reviews, and if found guilty, 334 00:21:07,000 --> 00:21:11,200 Speaker 1: the CMA could fine the responsible party, presumably 335 00:21:11,280 --> 00:21:14,200 Speaker 1: either the site that's hosting the reviews or the people 336 00:21:14,200 --> 00:21:17,720 Speaker 1: who are hiring people to post fake reviews, up to 337 00:21:17,920 --> 00:21:21,919 Speaker 1: ten percent of their global turnover. This will create a 338 00:21:21,960 --> 00:21:25,720 Speaker 1: new responsibility for sites that host reviews, namely that they 339 00:21:25,800 --> 00:21:29,399 Speaker 1: need to have practices in place to be reasonably certain 340 00:21:29,480 --> 00:21:33,200 Speaker 1: that the reviews posted on the site are legitimate.
So presumably, 341 00:21:33,600 --> 00:21:37,119 Speaker 1: if a website shows that it has taken reasonable action 342 00:21:37,200 --> 00:21:41,040 Speaker 1: to guard against fake reviews, the CMA won't 343 00:21:41,160 --> 00:21:45,720 Speaker 1: levy fines on that site should some fake reviews slip through. 344 00:21:45,760 --> 00:21:49,199 Speaker 1: I'm guessing that there's going to be a grace 345 00:21:49,280 --> 00:21:52,800 Speaker 1: period where, if a site is alerted to a fake review, 346 00:21:53,359 --> 00:21:57,760 Speaker 1: it has the opportunity to remove it before any 347 00:21:57,840 --> 00:22:00,960 Speaker 1: kind of consideration would fall on it. In addition, the 348 00:22:01,040 --> 00:22:03,879 Speaker 1: UK government is looking to make it easier for consumers 349 00:22:03,880 --> 00:22:07,720 Speaker 1: to cancel subscriptions should they not want a service any longer. 350 00:22:07,920 --> 00:22:10,480 Speaker 1: I mentioned this in an earlier TechStuff news episode. 351 00:22:10,800 --> 00:22:13,119 Speaker 1: I'm sure all of you have had some sort of experience 352 00:22:13,359 --> 00:22:15,439 Speaker 1: where you were put through a wild goose chase in 353 00:22:15,480 --> 00:22:19,639 Speaker 1: an effort to cancel a subscription. Now, personally, I think 354 00:22:19,720 --> 00:22:23,600 Speaker 1: these are good steps, assuming that the enforcement is reasonable 355 00:22:23,680 --> 00:22:27,359 Speaker 1: and fair, which admittedly is much easier said than done. 356 00:22:27,800 --> 00:22:29,800 Speaker 1: We have a couple more news stories to get through. 357 00:22:30,040 --> 00:22:32,639 Speaker 1: Before we get to those, let's take another quick break.
358 00:22:40,359 --> 00:22:43,159 Speaker 1: Several years ago, I wrote an article for HowStuffWorks 359 00:22:43,160 --> 00:22:47,240 Speaker 1: dot com about how the Large Hadron Collider works. 360 00:22:47,560 --> 00:22:51,199 Speaker 1: And at the time of me writing that article, the 361 00:22:51,520 --> 00:22:55,240 Speaker 1: LHC had not yet graduated to full power experiments. That didn't 362 00:22:55,240 --> 00:22:59,080 Speaker 1: stop people from speculating on what the LHC would find, 363 00:22:59,400 --> 00:23:04,439 Speaker 1: or potential calamities it might cause, once it went up to full power. 364 00:23:05,000 --> 00:23:08,840 Speaker 1: But yeah, I wrote the article on that, and then 365 00:23:08,840 --> 00:23:12,600 Speaker 1: it got up to doing full power experiments. But then, 366 00:23:12,880 --> 00:23:16,320 Speaker 1: over the last three years, the LHC has been pretty quiet. 367 00:23:16,800 --> 00:23:20,879 Speaker 1: Scientists did initiate some pilot particle beams around the facility 368 00:23:20,880 --> 00:23:25,520 Speaker 1: in October twenty twenty one, but otherwise it's been offline. Now, 369 00:23:25,520 --> 00:23:30,080 Speaker 1: the purpose for that multi-year hiatus was, well, 370 00:23:30,200 --> 00:23:32,880 Speaker 1: manifold. There were quite a few reasons. One big 371 00:23:32,920 --> 00:23:36,680 Speaker 1: one was that there was a massive upgrade to the LHC, 372 00:23:37,000 --> 00:23:40,000 Speaker 1: and there were also repairs and maintenance to the facility. 373 00:23:40,760 --> 00:23:44,600 Speaker 1: And now it's back, baby. Last week the LHC sent 374 00:23:44,680 --> 00:23:48,399 Speaker 1: out opposing beams of protons at an injection energy of 375 00:23:48,440 --> 00:23:53,240 Speaker 1: four fifty billion electron volts. All right, now, let's do 376 00:23:53,280 --> 00:23:55,680 Speaker 1: a quick rundown in case none of this is making 377 00:23:55,760 --> 00:23:58,560 Speaker 1: any sense to you.
The Large Hadron Collider is a 378 00:23:58,600 --> 00:24:04,480 Speaker 1: particle accelerator, so it uses extremely powerful electromagnets to both 379 00:24:04,520 --> 00:24:09,680 Speaker 1: propel and direct opposing beams of particles, such as protons, 380 00:24:09,680 --> 00:24:12,120 Speaker 1: but it could be other stuff too, so that these 381 00:24:12,160 --> 00:24:17,560 Speaker 1: particles collide at specific points along the pathway. Now, located 382 00:24:17,600 --> 00:24:21,320 Speaker 1: at these specific points are a selection of different experiments. 383 00:24:21,359 --> 00:24:25,760 Speaker 1: These experiments focus on different elements of particle science, so 384 00:24:26,520 --> 00:24:30,119 Speaker 1: the collisions create scenarios that are otherwise impossible for us 385 00:24:30,160 --> 00:24:34,560 Speaker 1: to replicate on Earth, including the formation of teeny tiny 386 00:24:34,600 --> 00:24:38,600 Speaker 1: black holes that last for a fraction of a fraction 387 00:24:38,840 --> 00:24:41,080 Speaker 1: of a second. That black hole thing is one of 388 00:24:41,119 --> 00:24:44,160 Speaker 1: the things that trips some people out. But it's really 389 00:24:44,160 --> 00:24:46,600 Speaker 1: important to note that the energy of these black holes 390 00:24:46,680 --> 00:24:49,680 Speaker 1: is such that they cannot last. It is less than 391 00:24:49,960 --> 00:24:55,040 Speaker 1: a gnat beating its wings once. Like, it's tiny, tiny, 392 00:24:55,080 --> 00:24:58,520 Speaker 1: tiny amounts of energy, and they're gone before you can 393 00:24:58,560 --> 00:25:01,800 Speaker 1: even be aware that they were there without the use 394 00:25:01,880 --> 00:25:06,160 Speaker 1: of incredibly sensitive equipment, so they blip out of existence 395 00:25:06,160 --> 00:25:09,800 Speaker 1: almost as quickly as they appear.
Anyway, the science done 396 00:25:09,840 --> 00:25:12,639 Speaker 1: at the LHC is expanding our understanding of how the 397 00:25:12,760 --> 00:25:15,840 Speaker 1: universe works, you know, how energy and how matter work, 398 00:25:16,480 --> 00:25:19,520 Speaker 1: and it's gone a long way to proving or disproving 399 00:25:19,640 --> 00:25:24,639 Speaker 1: various hypotheses in physics. And just because the facility wasn't 400 00:25:24,680 --> 00:25:28,720 Speaker 1: actively blasting particles around for three years and some change 401 00:25:29,000 --> 00:25:32,040 Speaker 1: doesn't mean that a lot of important science wasn't going on. 402 00:25:32,359 --> 00:25:35,880 Speaker 1: Quite the opposite, in fact. See, one of the really 403 00:25:36,240 --> 00:25:42,160 Speaker 1: big challenges of these experiments is that the collisions generate 404 00:25:42,760 --> 00:25:46,760 Speaker 1: a truly tremendous amount of data, and it's a full 405 00:25:46,800 --> 00:25:49,679 Speaker 1: time gig just combing through all the information and making 406 00:25:49,720 --> 00:25:53,240 Speaker 1: sense of it. So, while the facility was effectively offline 407 00:25:53,280 --> 00:25:58,119 Speaker 1: as far as generating new experiments, there were countless scientists 408 00:25:58,160 --> 00:26:01,840 Speaker 1: examining the information that was created through the 409 00:26:01,920 --> 00:26:05,520 Speaker 1: previous years of collisions. Now, in what is being called 410 00:26:05,600 --> 00:26:09,360 Speaker 1: Run Three of the LHC, we're gonna see two new experiments, 411 00:26:09,640 --> 00:26:13,920 Speaker 1: one called SND@LHC and another called FASER. 412 00:26:14,520 --> 00:26:17,320 Speaker 1: These are going to look into everything from physics that 413 00:26:17,400 --> 00:26:21,400 Speaker 1: goes beyond the Standard Model of physics to the production 414 00:26:21,520 --> 00:26:24,160 Speaker 1: of antimatter.
Maybe it will help us understand why 415 00:26:24,160 --> 00:26:27,600 Speaker 1: our universe had a tiny bit more matter than it 416 00:26:27,680 --> 00:26:32,040 Speaker 1: had antimatter. That's why we've got stuff. If there 417 00:26:32,040 --> 00:26:34,600 Speaker 1: had been equal amounts, then they would have annihilated each 418 00:26:34,600 --> 00:26:36,920 Speaker 1: other completely and the universe would be empty. But it's 419 00:26:36,960 --> 00:26:42,000 Speaker 1: not. Why? Maybe this experiment will help us learn. Anyway, 420 00:26:42,080 --> 00:26:46,520 Speaker 1: I love learning about the LHC, though admittedly the stuff 421 00:26:46,920 --> 00:26:50,199 Speaker 1: is way over my head. China has announced that it 422 00:26:50,320 --> 00:26:52,919 Speaker 1: is going to launch a space mission sometime in the 423 00:26:53,000 --> 00:26:56,720 Speaker 1: next four years in order to send a spacecraft that 424 00:26:56,760 --> 00:27:00,560 Speaker 1: will collide with an asteroid on purpose to test a 425 00:27:00,640 --> 00:27:05,240 Speaker 1: potential asteroid deflection system. This is similar to a NASA 426 00:27:05,280 --> 00:27:08,480 Speaker 1: mission that was launched in November of last year. That 427 00:27:08,600 --> 00:27:11,639 Speaker 1: mission sent a spacecraft to collide with Dimorphos, a 428 00:27:11,840 --> 00:27:16,320 Speaker 1: small asteroid that orbits a larger one called Didymos.
Dimorphos 429 00:27:16,480 --> 00:27:19,520 Speaker 1: isn't on a collision course with Earth, so the NASA 430 00:27:19,560 --> 00:27:21,480 Speaker 1: mission is really just more of a proof of concept 431 00:27:21,560 --> 00:27:24,000 Speaker 1: to see if we can actually affect an asteroid's flight 432 00:27:24,040 --> 00:27:27,040 Speaker 1: path by hitting it, kind of like a billiard ball, 433 00:27:27,280 --> 00:27:30,200 Speaker 1: which will happen later this year. September, I believe, 434 00:27:30,280 --> 00:27:33,240 Speaker 1: is when it's scheduled to have that collision. Now, NASA's 435 00:27:33,280 --> 00:27:37,080 Speaker 1: project is called the Double Asteroid Redirection Test, or DART, 436 00:27:37,400 --> 00:27:40,480 Speaker 1: and China's test sounds like it's going to be pretty similar, 437 00:27:40,800 --> 00:27:43,760 Speaker 1: though the country has not yet chosen which asteroid it 438 00:27:43,760 --> 00:27:47,840 Speaker 1: will target for its own test. And asteroids certainly pose 439 00:27:47,960 --> 00:27:51,080 Speaker 1: a potential threat. The Earth has been hit before and 440 00:27:51,520 --> 00:27:54,920 Speaker 1: totally transformed by asteroids in the past, so this is 441 00:27:54,960 --> 00:27:59,280 Speaker 1: another step towards the identification, tracking, and deflection of space 442 00:27:59,280 --> 00:28:03,080 Speaker 1: objects that could otherwise pose catastrophic danger to life on Earth. 443 00:28:03,560 --> 00:28:07,679 Speaker 1: It's something we definitely need, and it's 444 00:28:07,720 --> 00:28:09,440 Speaker 1: kind of cool to think we're getting there. I never 445 00:28:09,480 --> 00:28:14,560 Speaker 1: thought we were gonna, you know, try and deflect asteroids 446 00:28:14,600 --> 00:28:17,520 Speaker 1: by whacking them, you know, sort of the percussive maintenance 447 00:28:17,520 --> 00:28:21,399 Speaker 1: approach to asteroid deflection.
There were a lot of people 448 00:28:21,600 --> 00:28:25,560 Speaker 1: discussing things like using a spacecraft to become kind 449 00:28:25,560 --> 00:28:29,159 Speaker 1: of like a tugboat, using the gravitational attraction 450 00:28:29,320 --> 00:28:32,040 Speaker 1: between the asteroid and the spacecraft itself to deflect the path. 451 00:28:32,400 --> 00:28:35,000 Speaker 1: But obviously, for that to work, you have to go much, much, 452 00:28:35,119 --> 00:28:38,840 Speaker 1: much further out in order to affect an asteroid's path 453 00:28:38,920 --> 00:28:42,120 Speaker 1: before it gets too close to the Earth. Deflection might 454 00:28:42,200 --> 00:28:47,640 Speaker 1: be more effective with the percussive method for asteroids that 455 00:28:47,680 --> 00:28:50,760 Speaker 1: are a little closer in. We'll have to see. And finally, 456 00:28:51,400 --> 00:28:54,040 Speaker 1: for those of you who are eagerly anticipating the animated 457 00:28:54,040 --> 00:28:57,680 Speaker 1: Super Mario Brothers movie this holiday season, I have some 458 00:28:57,760 --> 00:29:01,360 Speaker 1: bad news. The movie has been delayed until April seven 459 00:29:01,520 --> 00:29:05,240 Speaker 1: in North America and April in Japan. This is the 460 00:29:05,240 --> 00:29:10,360 Speaker 1: film that controversially cast Chris Pratt as the voice of Mario. Anyway, 461 00:29:10,560 --> 00:29:14,760 Speaker 1: Nintendo didn't give any details about why there is going 462 00:29:14,800 --> 00:29:17,000 Speaker 1: to be this delay, but there could be a lot 463 00:29:17,040 --> 00:29:20,720 Speaker 1: of different reasons, including production delays due to COVID, but 464 00:29:20,840 --> 00:29:23,040 Speaker 1: we do know we will not be getting the Mario 465 00:29:23,160 --> 00:29:26,640 Speaker 1: film until next year. But no worries.
You can still 466 00:29:26,720 --> 00:29:30,400 Speaker 1: hunt down and watch the classic nineteen ninety-three live action 467 00:29:30,600 --> 00:29:34,960 Speaker 1: movie starring Bob Hoskins as Mario, John Leguizamo as Luigi, 468 00:29:35,320 --> 00:29:39,240 Speaker 1: and Dennis Hopper as Bowser. I've only ever seen that 469 00:29:39,360 --> 00:29:42,640 Speaker 1: once, and it was like a fever dream. But I've 470 00:29:42,680 --> 00:29:46,840 Speaker 1: heard more and more folks recently, mostly in the millennial generation, 471 00:29:47,240 --> 00:29:51,760 Speaker 1: talk about it being entertaining. Not good, mind you, but entertaining. 472 00:29:52,520 --> 00:29:56,120 Speaker 1: Your mileage may vary. And that's it for this episode 473 00:29:56,200 --> 00:29:59,760 Speaker 1: of TechStuff news. If you have suggestions for topics 474 00:29:59,800 --> 00:30:01,959 Speaker 1: I should cover on TechStuff, feel free to keep 475 00:30:02,000 --> 00:30:04,200 Speaker 1: reaching out to me on Twitter with the handle TechStuff 476 00:30:04,240 --> 00:30:07,440 Speaker 1: HSW. That Twitter handle is not going anywhere. 477 00:30:07,960 --> 00:30:10,800 Speaker 1: I will continue to maintain it and check it, so 478 00:30:11,200 --> 00:30:17,080 Speaker 1: reach out there and I'll talk to you again really soon. 479 00:30:21,240 --> 00:30:24,240 Speaker 1: TechStuff is an iHeartRadio production. For more 480 00:30:24,320 --> 00:30:27,680 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 481 00:30:27,840 --> 00:30:31,000 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.