Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Tuesday, June one, twenty twenty-one. And one story that I haven't really covered on this podcast is that there is a global semiconductor shortage, and that has a cascading effect on the computer and electronics industries and beyond. So let's dive into that really quickly. Now, it mostly comes down to a couple of big things that happened in twenty twenty, and one of those is, of course, the pandemic. The global shutdown of various industries disrupted supply chains, and that meant that shortages in raw materials became an issue really quickly. It just had this, you know, ripple effect. And it's not like the whole world went on pause at the same time, and it certainly isn't like they all started up again in sync with everyone else.
Speaker 1: So in a way, this kind of led to the supply chain version of a traffic jam, except instead of cars going through, you know, stop-and-start traffic because of some other event that happened further down the road, you have various manufacturers who found themselves waiting on other elements in the supply chain, and thus they had to go idle because they didn't have the stuff they needed to do their part in the chain. Everything got backed up. Related to that, the silicon used to create the vials that hold vaccines is the same stuff that's used in semiconductors. And you know, because the vaccines are understandably of very, very high priority, it meant the semiconductor industry experienced a silicon shortage, and thus the price for silicon went up. Now, when the price for materials goes up, you get one of two outcomes: either the companies that are making stuff out of that raw material have to increase prices, or they experience a smaller profit margin. Either way, the squeeze is felt further up the chain.
Speaker 1: But the other big issue on top of that is that the demand for semiconductors grew a lot in twenty twenty, by nearly seven percent. More of our everyday tech relies on semiconductors, everything from, you know, video game consoles to automobiles. Now, because of the shortage of semiconductors, everything else down the line gets held up too, and this is likely to lead to massive losses in several industries. Automakers are probably going to be losing billions of dollars in the short term, because semiconductors are important components for nearly every system inside a car. Like, there's more than fifty of them, and that includes everything from entertainment to brakes to steering. Pat Gelsinger, the new CEO of Intel, warns that this could just be the start of a shortage and that we could see the effects of the shortage stretch on for several years. Gelsinger has said that the semiconductor industry has reacted quickly to near-term challenges, but that the long-term effects are still a big concern. Meanwhile, on the consumer side, we're seeing the effects of this crisis.
Speaker 1: The laptop maker Acer has said that due to this semiconductor shortage, the company is only able to fill half of worldwide demand for laptop production on any given day. So what this means for all of us, you know, you and me, is that the supply for all sorts of tech, from smartphones to computers to cars, is going to be more limited than what we've come to expect, at least for the near term, and demand is likely to be high. So when you've got high demand and you've got limited supply, the next thing you typically see is prices go up. So get ready to spend more money to buy your tech over the next year or two until things shake out. And while we get ready to pay out more money, the Guardian reports that the Silicon Six have actually paid out less than they claimed. The Silicon Six refers to six gargantuan tech companies. That would be Alphabet, which is Google's parent company, Amazon, Apple, Facebook, Microsoft, and Netflix, which, interestingly, is the first time I've seen Netflix added to this particular list.
Speaker 1: Totally makes sense, though. And according to the Guardian, these six companies overstated their tax payments by nearly one hundred billion dollars over the last ten years. A report from the Fair Tax Foundation claims that these six companies paid ninety-six billion dollars less in tax between two thousand eleven and two thousand twenty than their annual reports indicate. Moreover, these six companies paid a tax rate that you or I would go bonkers over. According to the Fair Tax Foundation, these companies paid out three point six percent of their total revenue in tax. That's two hundred nineteen billion dollars of taxes. And yeah, that is an astounding amount of money, two hundred nineteen billion dollars. But let's compare that to the amount of revenue they generated, which was more than six trillion dollars. Trillion. That goes beyond a princely sum. Now, let's be fair: that is revenue, that is not profit, right? Like, profit is what you get after you remove all the costs from the money that you've brought in. So luckily the Guardian digs down into that a little bit as well.
Speaker 1: Over that decade, you know, the last ten years, Amazon collected around one point six trillion dollars in revenue but collected a mere sixty point five billion dollars in profits. So, I mean, I'm being a little flippant, because honestly, these numbers are so big I can't actually comprehend them. Like, from an abstract perspective, I kind of get it, but, you know, if I try to dive any further down, it's just too big. But you see that the amount of profit compared to revenue is very, very tiny when you look at them as a ratio, right, not by sheer amount. Sixty billion dollars is a huge amount of money. Anyway, the Guardian reports that Amazon ended up paying well below what it should have been expected to pay, almost half as much, in fact. There's a growing movement around the world to apply new tax laws that would limit companies from being able to shift profits over into tax havens.
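The ratios behind these figures are easy to sanity-check. Here's a quick sketch using the rounded numbers quoted in this episode from the Fair Tax Foundation report via the Guardian; treat them as approximate, not audited figures.

```python
# Aggregate figures for the Silicon Six over 2011-2020, as quoted
# in the episode (approximate, rounded values).
total_revenue = 6_000_000_000_000   # more than $6 trillion in revenue
total_tax_paid = 219_000_000_000    # $219 billion paid in tax

effective_rate = total_tax_paid / total_revenue
print(f"tax paid as a share of revenue: {effective_rate:.1%}")  # roughly 3.6%

# Amazon specifically, over the same decade, as quoted:
amazon_revenue = 1_600_000_000_000  # around $1.6 trillion in revenue
amazon_profit = 60_500_000_000      # around $60.5 billion in profit

margin = amazon_profit / amazon_revenue
print(f"Amazon profit margin: {margin:.1%}")  # under 4% of revenue
```

The point the host is making falls out of the arithmetic: both the group's tax bill and Amazon's profit come to only a few percent of the revenue involved, which is why the raw dollar amounts sound enormous while the ratios stay tiny.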
Speaker 1: Whether that actually happens or not, whether countries around the world really act on this, uh, that remains to be seen, because I'm sure there will be more than a few dollars spent on lobbying to oppose those measures. So we'll have to see how this develops further. Okay, but what if money is no object to you? What if you are swimming in the stuff, you know, Scrooge McDuck style? How do you flaunt your wealth in a way that's both flashy and environmentally friendly? Well, my friend, maybe it's time you look into the Silent Shadow, a luxury electric vehicle concept in the works over at Rolls-Royce. Now, the brand Rolls-Royce has long been associated with luxury and opulence, and the word shadow has significance for the company, because back in nineteen sixty-five, Rolls-Royce introduced the Silver Shadow luxury car. When it comes to the Silent Shadow, well, we don't really have that many details about the car. The company says that the plans are to have the vehicle ready for purchase within the decade, but we don't know any specs.
Speaker 1: We don't know what the projected price is, although, you know, if you have to ask, you can't afford it. We don't know really any real details except that, of course, you know, Silent Shadow. The name tells us that it's going to be very, very quiet. But then again, Rolls-Royce is known, at least in part, for engineering cars that operate quietly, because the whole point of a Rolls-Royce is that the experience of driving it, or for those who prefer to employ a driver, the experience of riding in a Rolls-Royce, is to enjoy luxury rather than that, you know, chassis-shaking, engine-revving experience you get with, like, muscle cars. Rolls-Royce isn't the only luxury carmaker that's diving into electric vehicles, and as I've reported in previous episodes of TechStuff, that's pretty much a necessity, because a lot of places around the world intend to phase out the sale of new fossil-fuel-powered vehicles over the next decade and a half. Earlier this year, I talked about how the electronics company LG was exiting the smartphone industry.
Speaker 1: The company had shown off a couple of interesting concepts at CES, but it sounds like they're never going to hit store shelves. So that magical expanding smartphone is just gonna be a thing of legend, I think. Or at least we're not going to see it from LG. Now, Korea Biz Wire reports that LG is switching its former smartphone manufacturing facilities over to make home appliances instead. The company is also consolidating its operations in Brazil, expanding facilities in the city of Manaus. And I apologize for the terrible mispronunciation of that. I am certain I got it completely wrong. While LG pulled the plug on smartphones after finding the market too competitive, it was dominated by companies like Apple and Samsung, home appliances are a totally different story. One of the many consequences of the pandemic of twenty twenty. I guess I shouldn't give it a year. It's still going. Anyway, one of the big consequences was this increased demand for home appliances as people spent more time at home, and LG saw its sales skyrocket as a result.
Speaker 1: Now, this move reflects that increased demand was really the driver for LG's decisions. However, it remains to be seen if that demand will continue, right, if the demands that were generated by the pandemic are going to stick around even as we start to have a better handle on dealing with the consequences of that pandemic. So we don't really know if this is the start of a trend in home appliance sales or if it's more of a blip on the radar. CNET reports that some recently unsealed court documents show that Google purposely obfuscated the location settings in its Android phone software so that it would be harder for users to find those settings and then turn them off. As we've seen with numerous tech companies, including one we're going to cover in a second, the real business of those companies isn't necessarily in hardware or social networking sites or whatever the surface-level business is. You know, Google's business isn't really search. It's in data.
Speaker 1: That's where the real money is: collecting and then exploiting data in different ways. Primarily, when it comes to user data, that comes in the form of the company's relationships with various advertisers. You know, obviously, the more information you can give an advertiser about their intended market, the more effectively that advertiser can serve up ads to that market. So it becomes this cycle, this feedback loop, between these companies and advertisers, which in turn informs the business decisions of those companies like Google. So, according to this document, Google discovered that if the location settings on an Android phone were relatively easy to navigate to, a lot of people opted to turn off their location settings. How about that? Folks are not super jazzed about being tracked wherever they go. And so Google's solution to this issue was not to shift business operations away from the benefits of harvesting location data.
Speaker 1: No, the answer apparently was to make those settings just way harder to find, so that users would continue to generate those geolocated zeros and ones for Google, so that Google could profit off of them without those users actually really being aware of it. On top of that, Google apparently reached out to various manufacturers that make Android products in an effort to convince them to hide the location settings away deep in various menus and clunky user interfaces. Like, LG pushed geolocation settings to the second page of settings on its phones. And anyone who's had any experience on the Internet knows that if you are below the fold, that is, if you have to scroll down in order to see the particular entry, you lose, like, the vast majority of people who are looking at your stuff. Like, this is clear in Google Search, right? If you're not in those first few hits of a Google search, the traffic that comes to you thanks to Google Search is super low, because, you know, most people don't bother to scroll down any further. Same is true with settings on phones.
Speaker 1: If it's not right there, a lot of people don't take the effort to go any further. This revelation comes on the heels of an investigation from three years ago by the Associated Press into Google, and that investigation found that Google was tracking location even if users opted out of the location history feature. So apparently, if you turned off location history on your Android device, it just meant that Google was going to keep on tracking you everywhere you go. They just wouldn't tell you about it. I mean, why should you care about all this anyway? Why is this important? Well, location data isn't just about collecting information on where you go and when you go there. It's also about collecting the data of everybody else at the same time, anyone who has, you know, a device that has geolocation connected to it. If you're in a space that has a lot of folks with phones in it, well, now, as a data collection company, you can do all sorts of interesting things. So I'm going to give you an example.
Speaker 1: And this is actually something that's related to a Twitter thread I saw, and I wish I could remember the person who posted it, because it was very good. But I'll give you kind of an example. So let's say that you've got a good friend of yours you haven't seen in a long time, and you go to visit this friend for a few days. So you're staying at your friend's house. Now, your friend also has various devices, and you've got your smartphone, and location tracking tells Google where you are. And assuming Google also has at least some access to the data that's generated by your friend's devices, Google also knows who you are with. They know that you're at this specific person's house, and they know things about that specific person too. Google knows all about your friend's activities and what they like and where they like to go and all that kind of stuff. So now Google starts to integrate ads into your various experiences that aren't just targeting you. They're also targeting your friend. I mean, you like this person well enough to stay at their house for a few days.
Speaker 1: Maybe you like them well enough to shop for a birthday present for them, and Google happens to know when their birthday is, because that's some of the data that these companies collect, like email addresses, birthdays. A lot of the stuff these companies collect comes not from directly grabbing it off of your device, but from, you know, cross-referencing the databases that have your information in them from across all the different services you use. So if you've ever used a service where you've had to put in things like your name and address and phone number and your birthday and your email and all that kind of stuff, all that data ends up getting mixed up with information that's gathered from sources like devices, and that's a really powerful thing. So now, because Google knows where you are and who you're with, and they know about that person's birthday, maybe they serve up ads about something that this friend of yours really likes, and the suggestion is, hey, it's my friend's birthday coming up. I should, you know, click on this ad and buy this stuff.
Speaker 1: So now you get this weird sensation that you're getting served ads that are very specifically targeted at you and the experiences that you've recently had. It feels like Google is listening in on you, right, like it's just spying on you, and it's picking up stuff from, say, your phone's microphone or whatever. But no, Google doesn't have to do any of that. Google doesn't have to do any active spying on you. It doesn't have to listen to you. It's just collecting all the data from you and your friend just by being who you are and where you are, and then cross-referencing that data with other data sets, and then analyzing that data and then acting on it. This is just one way where data collection can become intrusive and creepy. Now we're gonna take a quick break. When we come back, I'll give another update about another company that's equally obsessed with your information. It rhymes with, uh, Space Book. But first, let's take a quick break.
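The cross-referencing the host just walked through can be sketched in a few lines of code. To be clear, this is a toy illustration with made-up names and data, not Google's actual pipeline; it just shows how joining location pings against a separate profile database reveals who is with whom, and what to advertise to them.

```python
from collections import defaultdict

# Hypothetical location pings: (user, place, day). Purely illustrative.
location_pings = [
    ("you",         "friends_house", "2021-05-29"),
    ("your_friend", "friends_house", "2021-05-29"),
    ("stranger",    "coffee_shop",   "2021-05-29"),
]

# Hypothetical profile data gathered from other services, keyed by user.
profiles = {
    "your_friend": {"birthday": "06-15", "interests": ["vinyl records"]},
}

# Step 1: group users by (place, day) to find out who was co-located.
together = defaultdict(set)
for user, place, day in location_pings:
    together[(place, day)].add(user)

# Step 2: for each user, cross-reference the profiles of the people
# they were with, and target ads accordingly.
for group in together.values():
    for user in group:
        for companion in group - {user}:
            profile = profiles.get(companion)
            if profile:
                print(f"show {user} gift ads for {profile['interests']} "
                      f"ahead of {companion}'s birthday ({profile['birthday']})")
```

No microphone is involved anywhere in this sketch; the "creepy" ad falls out of two passive data sets being joined on place and time, which is exactly the point the host is making.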
Speaker 1: Okay, we're back. And before the break, I was telling you about Google and geolocation data, and how the company was apparently trying to hide the settings for geolocation away so that fewer people would turn it off. I mean, that's the thing. These companies are often compelled by various pieces of legislation around the world to offer up that option. But then the companies do their best to make that option, you know, harder to find, so that fewer people actually use it, because we've seen, when people are given the option, they often like to opt out of these features. And when your entire business is dependent upon those features, that's where you see these companies coming up with these clever ways to try and get around the issue. So along that same vein, remember how Facebook put up a fuss about the new privacy settings that are included in the latest version of Apple's iOS?
Speaker 1: So one of the new features of the iPhone operating system, well, really the iOS operating system, because it's for all sorts of devices, not just the iPhone. Anyway, one of the new features is that users will get a prompt asking if they will allow certain apps, like Facebook, for example, to collect data about themselves outside of the app itself. So that includes data that comes from other apps that are on your phone. So, for example, Facebook, if you were to allow this option, would potentially be allowed to collect information about your shopping habits on other apps, or what restaurants you like to order from whenever you use delivery services, and so on. Now, because of Apple's change in policy, users will get a message asking them to grant permission to allow apps like Facebook to do this. And Facebook really hates that. And the reason the company really hates it is pretty much the same reason that Google was burying location settings: because data is money, and if you give people the option to share less data about themselves, they might actually take that option. And I mean, really, isn't that just stealing?
322 00:21:11,720 --> 00:21:14,119 Speaker 1: I mean, when you get down to it, isn't it 323 00:21:14,200 --> 00:21:18,320 Speaker 1: just ungrateful for users to not hand over all the 324 00:21:18,359 --> 00:21:21,399 Speaker 1: information about who they are and what they do and 325 00:21:21,440 --> 00:21:25,879 Speaker 1: who they know, so that poor, scrappy companies like Facebook 326 00:21:25,880 --> 00:21:28,639 Speaker 1: can just make a buck off that info? Now, 327 00:21:29,000 --> 00:21:34,000 Speaker 1: obviously I'm being incredibly obnoxious and sarcastic here, because I 328 00:21:34,040 --> 00:21:39,800 Speaker 1: think Facebook is, like, literally the worst. Anyway, a 329 00:21:39,840 --> 00:21:44,359 Speaker 1: new study suggests that Apple's changes to privacy are in 330 00:21:44,440 --> 00:21:48,920 Speaker 1: fact bad. The study says Apple is doing a bad 331 00:21:48,960 --> 00:21:52,480 Speaker 1: thing by including these privacy options. It says that those 332 00:21:52,520 --> 00:21:56,400 Speaker 1: policies only serve to help Apple, and they hurt all 333 00:21:56,480 --> 00:22:00,159 Speaker 1: other companies, and thus these are anti-competitive practices 334 00:22:00,160 --> 00:22:03,239 Speaker 1: that Apple has put in place. It's Apple saying, if you 335 00:22:03,280 --> 00:22:07,320 Speaker 1: want to operate on our system, you must follow these rules, which, 336 00:22:07,320 --> 00:22:10,320 Speaker 1: by the way, we don't have to follow ourselves. 337 00:22:10,680 --> 00:22:13,600 Speaker 1: As Apple, it's cool for us to collect all 338 00:22:13,640 --> 00:22:17,200 Speaker 1: the information, but you companies out there, you cannot do that. 339 00:22:18,080 --> 00:22:23,400 Speaker 1: And that, the study says, hurts other companies. Also, Facebook totally funded 340 00:22:23,840 --> 00:22:28,280 Speaker 1: that study.
Now, that might cause you to question the 341 00:22:28,320 --> 00:22:35,240 Speaker 1: study's objectivity, right? Like, the study's outcome is essentially 342 00:22:35,280 --> 00:22:40,080 Speaker 1: in line with Facebook's complaints against Apple. So I think 343 00:22:40,160 --> 00:22:43,280 Speaker 1: that's actually a pretty darn healthy attitude to have, to 344 00:22:43,440 --> 00:22:49,399 Speaker 1: question the objectivity of the results, because the study itself 345 00:22:49,640 --> 00:22:52,879 Speaker 1: was funded by the company that has a beef against Apple. 346 00:22:53,359 --> 00:22:56,680 Speaker 1: But all that being said, does that mean it's possible 347 00:22:56,800 --> 00:22:59,320 Speaker 1: that the paper actually has a point, in that Apple 348 00:22:59,400 --> 00:23:01,320 Speaker 1: is going to benefit while other companies do not? 349 00:23:02,680 --> 00:23:06,600 Speaker 1: I mean, maybe. It's very likely. Apple is certainly no 350 00:23:06,720 --> 00:23:10,959 Speaker 1: innocent lamb in this equation either, right? You've got all 351 00:23:11,000 --> 00:23:14,399 Speaker 1: these different companies that are leveraging data in different ways. 352 00:23:14,600 --> 00:23:18,520 Speaker 1: Sometimes it's obvious and sometimes it's subtle, but they're all 353 00:23:18,560 --> 00:23:22,560 Speaker 1: profiting off of it. Now, I don't have a solution 354 00:23:22,960 --> 00:23:26,000 Speaker 1: that addresses this whole issue, unless it's just to give 355 00:23:26,080 --> 00:23:30,119 Speaker 1: up on smartphones in general and go to, like, really 356 00:23:30,160 --> 00:23:33,920 Speaker 1: simple cell phones and just kind of opt out 357 00:23:34,040 --> 00:23:37,439 Speaker 1: of the online experience. That's not really an option for 358 00:23:37,480 --> 00:23:41,560 Speaker 1: most people, or at least not an attractive option, 359 00:23:41,600 --> 00:23:45,640 Speaker 1: you know.
But the flip side is, unless there's some 360 00:23:46,080 --> 00:23:51,240 Speaker 1: specific legislation in place that directs how data can 361 00:23:51,280 --> 00:23:55,000 Speaker 1: and cannot be used, I don't really see a way 362 00:23:55,040 --> 00:24:00,960 Speaker 1: of fixing this. It's a mess. Speaking of messes: 363 00:24:01,520 --> 00:24:05,680 Speaker 1: in science fiction, autonomous killer robots are a common trope, 364 00:24:06,200 --> 00:24:11,600 Speaker 1: from Terminator to RoboCop to the classic Chopping Mall. The 365 00:24:11,800 --> 00:24:16,680 Speaker 1: threat of AI-powered killing machines is made apparent, and 366 00:24:16,720 --> 00:24:21,240 Speaker 1: we've seen numerous experts in robotics and AI speak out 367 00:24:21,359 --> 00:24:24,840 Speaker 1: against the development of these kinds of devices. They've pointed 368 00:24:24,840 --> 00:24:28,800 Speaker 1: out that autonomous weapons would very likely lead to a 369 00:24:28,840 --> 00:24:31,680 Speaker 1: new type of arms race, and that we would also 370 00:24:31,760 --> 00:24:35,640 Speaker 1: see horrific uses of this technology. It does not take 371 00:24:35,720 --> 00:24:39,200 Speaker 1: much imagination to conjure up a scenario in which a machine, 372 00:24:39,840 --> 00:24:43,600 Speaker 1: all under its own power, mistakenly identifies a group of 373 00:24:43,600 --> 00:24:47,880 Speaker 1: people as being targets and then attacks them. Or heck, 374 00:24:48,320 --> 00:24:51,119 Speaker 1: it's not hard to imagine a machine that identifies a 375 00:24:51,200 --> 00:24:56,000 Speaker 1: group quote unquote correctly, but the people behind the machine 376 00:24:56,160 --> 00:24:59,399 Speaker 1: are committed to wiping out specific populations, and they're just 377 00:24:59,480 --> 00:25:03,280 Speaker 1: using the machines to carry out the awful, horrific work.
378 00:25:03,680 --> 00:25:07,040 Speaker 1: And according to the UN, we are essentially in 379 00:25:07,560 --> 00:25:12,120 Speaker 1: that terrifying era. A UN Security Council report said 380 00:25:12,119 --> 00:25:16,159 Speaker 1: that in March of twenty twenty, the nation of Turkey deployed 381 00:25:16,200 --> 00:25:22,960 Speaker 1: an STM Kargu-2 military drone. This drone, apparently under 382 00:25:23,000 --> 00:25:28,199 Speaker 1: autonomous command, attacked Libyan armed forces that were repositioning and 383 00:25:28,280 --> 00:25:32,200 Speaker 1: withdrawing from an area. The report claims that the drone 384 00:25:32,240 --> 00:25:36,600 Speaker 1: could identify and attack targets without first establishing any line 385 00:25:36,600 --> 00:25:40,399 Speaker 1: of communication back to a human operator. The UN 386 00:25:40,560 --> 00:25:44,639 Speaker 1: had previously warned against this sort of thing, advocating for 387 00:25:44,680 --> 00:25:48,800 Speaker 1: a global ban on the production of autonomous weaponry. That 388 00:25:49,080 --> 00:25:53,200 Speaker 1: was a move that was opposed by two major world powers, 389 00:25:53,680 --> 00:25:58,800 Speaker 1: Russia and the United States. Now, that was a while back, 390 00:25:58,840 --> 00:26:03,120 Speaker 1: and the US is in a very different place politically today. 391 00:26:03,200 --> 00:26:08,280 Speaker 1: However, I am not confident enough to say that 392 00:26:08,840 --> 00:26:13,240 Speaker 1: the US would unilaterally condemn the development of these kinds 393 00:26:13,280 --> 00:26:17,359 Speaker 1: of autonomous weapons. And I say that mostly because the 394 00:26:17,440 --> 00:26:21,680 Speaker 1: Obama administration had its own serious burden to bear when 395 00:26:21,720 --> 00:26:25,440 Speaker 1: it comes to the use of lethal military drones, though 396 00:26:26,200 --> 00:26:30,199 Speaker 1: those were under the control of human operators.
Anyway, the 397 00:26:30,280 --> 00:26:33,320 Speaker 1: report has prompted more experts in the fields of AI 398 00:26:33,440 --> 00:26:36,159 Speaker 1: and machine learning to speak out against the practice of 399 00:26:36,160 --> 00:26:40,160 Speaker 1: developing and deploying autonomous weaponry. So it pretty much falls 400 00:26:40,160 --> 00:26:43,760 Speaker 1: to governments to take action from here, and perhaps give 401 00:26:43,840 --> 00:26:46,640 Speaker 1: the UN the authority to have a universal 402 00:26:46,760 --> 00:26:50,959 Speaker 1: ban on the development, and thus, you know, processes in 403 00:26:51,040 --> 00:26:54,920 Speaker 1: place for any countries found to have violated that ban. 404 00:26:55,560 --> 00:27:01,520 Speaker 1: Because otherwise, without that kind of global cooperative approach, we're 405 00:27:01,520 --> 00:27:04,720 Speaker 1: going to see countries say, well, we can't let there 406 00:27:04,760 --> 00:27:08,639 Speaker 1: be an autonomous weapon gap. If we don't pursue it, 407 00:27:09,200 --> 00:27:13,040 Speaker 1: we will be destroyed by these tools, because our 408 00:27:13,119 --> 00:27:17,320 Speaker 1: opponents will surely go down that pathway, so we 409 00:27:17,440 --> 00:27:21,040 Speaker 1: have to. And it becomes the sort of escalation that 410 00:27:21,080 --> 00:27:26,640 Speaker 1: we've seen time and time again. Pretty concerning stuff. We've 411 00:27:26,680 --> 00:27:30,320 Speaker 1: got another cyber attack story to cover, this time targeting 412 00:27:30,359 --> 00:27:34,480 Speaker 1: the food industry. A company called JBS Foods had to 413 00:27:34,480 --> 00:27:37,959 Speaker 1: shut down operations over the weekend due to a cyber attack.
414 00:27:38,480 --> 00:27:42,040 Speaker 1: JBS Foods is the world's largest producer of beef and 415 00:27:42,080 --> 00:27:46,320 Speaker 1: poultry and the second largest producer of pork, which surprised 416 00:27:46,320 --> 00:27:48,879 Speaker 1: me, because I mean, I guess pigs have to be 417 00:27:48,960 --> 00:27:54,159 Speaker 1: the biggest producer of pork. Uh huh. Jokes. Anyway, a 418 00:27:54,240 --> 00:27:58,720 Speaker 1: cyber attack forced JBS Foods to shut down operations in 419 00:27:58,840 --> 00:28:03,960 Speaker 1: multiple countries, including the UK, the United States, Australia, Canada, 420 00:28:04,040 --> 00:28:07,320 Speaker 1: and more. The attack hit the IT systems of 421 00:28:07,359 --> 00:28:10,199 Speaker 1: the company, and at the time of this recording, I 422 00:28:10,280 --> 00:28:14,159 Speaker 1: don't have specific details about the nature of that cyber attack. 423 00:28:15,000 --> 00:28:18,080 Speaker 1: If I had to guess, and again, this is just 424 00:28:18,160 --> 00:28:22,920 Speaker 1: a guess, I would say it's very likely another ransomware attack, 425 00:28:23,000 --> 00:28:26,480 Speaker 1: similar to what we saw with Colonial Pipeline earlier this year. 426 00:28:27,320 --> 00:28:31,639 Speaker 1: If that is the case, then JBS Foods could, in 427 00:28:31,720 --> 00:28:34,280 Speaker 1: theory, be weighing the option of whether or not to 428 00:28:34,359 --> 00:28:38,480 Speaker 1: pay off a ransom. If so, I 429 00:28:38,560 --> 00:28:42,840 Speaker 1: still maintain that paying off ransoms is always a bad idea, 430 00:28:43,080 --> 00:28:46,880 Speaker 1: because it consistently fuels more attacks in the future. The 431 00:28:46,920 --> 00:28:50,240 Speaker 1: more times hackers get paid off, the more they see 432 00:28:50,280 --> 00:28:53,360 Speaker 1: that this is profitable, and they'll do it even more.
433 00:28:54,160 --> 00:28:57,840 Speaker 1: The company is definitely working to restore functionality to its 434 00:28:57,840 --> 00:29:01,640 Speaker 1: systems, and JBS Foods says that it has no evidence 435 00:29:01,720 --> 00:29:06,680 Speaker 1: that this attack compromised any data relating to employees, customers, 436 00:29:06,880 --> 00:29:11,960 Speaker 1: or suppliers, but that processing transactions might take a while 437 00:29:12,000 --> 00:29:15,640 Speaker 1: because the company has to restore functionality. So we'll keep 438 00:29:15,640 --> 00:29:18,800 Speaker 1: an eye on this story. And finally, up in space, 439 00:29:18,840 --> 00:29:23,120 Speaker 1: the International Space Station's robot arm suffered some damage recently, 440 00:29:23,360 --> 00:29:25,680 Speaker 1: and at first I was kind of hoping to read 441 00:29:25,720 --> 00:29:28,640 Speaker 1: about how the ISS got into a robot 442 00:29:28,840 --> 00:29:32,960 Speaker 1: arm wrestling competition with Sylvester Stallone, and that this was 443 00:29:33,080 --> 00:29:37,000 Speaker 1: finally my eagerly anticipated sequel to the hit film Over 444 00:29:37,080 --> 00:29:39,560 Speaker 1: the Top. And I think that this one could be 445 00:29:39,560 --> 00:29:42,959 Speaker 1: called Way Over the Top, and now Stallone is like 446 00:29:43,040 --> 00:29:46,640 Speaker 1: a space trucker who likes to arm wrestle. But I'm 447 00:29:46,680 --> 00:29:48,840 Speaker 1: told that none of this is true, and I should 448 00:29:48,840 --> 00:29:51,960 Speaker 1: probably just not talk about that anymore.
But what is 449 00:29:52,000 --> 00:29:55,160 Speaker 1: true is that the arm did get damaged, and the 450 00:29:55,240 --> 00:29:58,480 Speaker 1: real reason it got damaged was because of space debris, 451 00:29:59,200 --> 00:30:01,920 Speaker 1: which is a real issue, and a growing one as 452 00:30:01,960 --> 00:30:04,520 Speaker 1: we send more stuff up into space. And because we lack 453 00:30:05,120 --> 00:30:10,280 Speaker 1: a cohesive approach to getting that stuff down once 454 00:30:10,360 --> 00:30:13,840 Speaker 1: it ends its useful life cycle, it's going to 455 00:30:13,920 --> 00:30:18,200 Speaker 1: get worse. Exactly when this happened is hard to say, 456 00:30:18,200 --> 00:30:21,880 Speaker 1: but NASA states that the Canadarm2, which has 457 00:30:21,920 --> 00:30:24,960 Speaker 1: been part of the ISS since two thousand one, 458 00:30:25,200 --> 00:30:28,840 Speaker 1: has a puncture in its thermal blanket, so this is 459 00:30:28,920 --> 00:30:32,600 Speaker 1: essentially like insulation around the arm, and that the boom 460 00:30:32,720 --> 00:30:37,440 Speaker 1: underneath also suffered some damage.
As to when this happened, 461 00:30:37,920 --> 00:30:41,200 Speaker 1: I'm not actually sure, but the issue of space debris 462 00:30:41,280 --> 00:30:43,959 Speaker 1: is one that has been growing over the years without 463 00:30:44,040 --> 00:30:46,760 Speaker 1: much action on the part of terrestrial governments to create 464 00:30:46,800 --> 00:30:51,560 Speaker 1: a foundation for rules and processes to mitigate that issue. Or, 465 00:30:51,720 --> 00:30:55,240 Speaker 1: as Jack Wright Nelson from the National University of Singapore 466 00:30:55,320 --> 00:30:59,680 Speaker 1: Faculty of Law said to The Register, quote, the hole 467 00:30:59,800 --> 00:31:03,520 Speaker 1: in Canadarm2 is minuscule compared to the hole 468 00:31:03,600 --> 00:31:07,959 Speaker 1: in the international legal regime concerning space debris, end quote. 469 00:31:08,600 --> 00:31:13,120 Speaker 1: Couldn't have said it better myself, Mr. Nelson. All right, 470 00:31:13,840 --> 00:31:17,080 Speaker 1: that is it for the news for Tuesday, June one, 471 00:31:17,080 --> 00:31:21,040 Speaker 1: twenty twenty one. If you have any suggestions for topics I 472 00:31:21,040 --> 00:31:24,000 Speaker 1: should cover in future episodes of tech Stuff, please reach 473 00:31:24,040 --> 00:31:26,520 Speaker 1: out to me on Twitter. The handle we use is 474 00:31:26,680 --> 00:31:29,760 Speaker 1: TechStuffHSW, and I'll talk to you 475 00:31:29,800 --> 00:31:38,920 Speaker 1: again really soon. Tech Stuff is an I Heart 476 00:31:39,000 --> 00:31:42,760 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 477 00:31:42,800 --> 00:31:45,840 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 478 00:31:45,920 --> 00:31:47,280 Speaker 1: listen to your favorite shows.