Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is March twenty twenty-two, and I would like to begin this episode of TechStuff News with an apology. This goes out to all my Irish listeners out there, and just know that the following message is really for all my fellow Americans. Top of the podcast to ya, and happy St. Patrick's Day. Toy to toy to toy. All right, I got that out of the way. Again, my sincerest apologies. Also, you might hear a little bit of background noise in this episode; it might sound a little different. That's because I am actually in the office recording today. Had to be here for some other reasons. And hey, there's folks here. It's like I haven't seen people in so long. It's kind of nice. But let's get to the news.

Speaker 1: Bleeping Computer reports that Russia is facing a cloud crunch in the near future. According to the Russian news outlet Kommersant, the country could run out of cloud storage space within as little as two months. Now, this is largely because the major cloud storage providers are companies that have pulled out of Russia in the wake of that country's invasion of Ukraine. The news outlet also reported that at a meeting to discuss this issue, there was a proposal to convert domestic services within Russia over to cloud storage services, as well as potentially seizing the assets of companies that once operated within Russia and have since left. In other words, if, say, Amazon was operating a data center in Russia, then Russia might potentially seize those assets and use them for their own, kind of an eminent domain thing. Other options include cutting off some media streaming services and then reallocating those assets to cloud storage, which wouldn't necessarily be a big popular move for Russian citizens.
Speaker 1: There's also the chance that Russia will turn to Chinese cloud services, but that's not a guarantee because, well, it depends upon China's cooperation. Russia has recently been investing heavily in surveillance systems and facial recognition technology and smart city tech, and all of those rely upon a robust cloud storage foundation. So we'll have to see what Russia does, because that clock is ticking.

Speaker 1: Also, since we mentioned China earlier: the US government has sent a message to China Telecom, which, I'm sure you've gathered, is a telecommunications company from China. China Telecom currently provides services in America, but will have to stop that within sixty days due to this government order. At issue here is a concern that a telecommunications company potentially has access to a ton of information and represents a critical component of a country's infrastructure. And China and the US have had, let's say, a tenuous relationship. And also the Chinese Communist government is notoriously hands-on when it comes to, you know, companies in China, whether those are native companies or companies operating within China. So there is a legit fear that a Chinese company could act like a spy or a general surveillance tool on the US and pass information along to the Chinese government. There's also a worry that, should the relationship between the US and China get worse, a telecom company from China could end up causing massive problems in the US's communications infrastructure, you know, by cutting off operations or otherwise just interfering with day-to-day life. And we saw this happen previously. We saw a very similar approach with Huawei. That's a company that had provided a lot of networking components to telecom companies throughout the United States, and then the US government said get rid of all those, and we're kind of in that process now. A lot of the smaller ISPs have a little bit more time to get rid of those components and replace them with stuff from other companies.
Speaker 1: We've also seen similar pushback on TikTok, which of course is owned by a Chinese conglomerate called ByteDance. Anyway, China Telecom released a statement saying that the company plans to quote "pursue all available options while continuing to serve our customers" end quote. So we'll have to see where this goes.

Speaker 1: US Senator Elizabeth Warren and US House Representative Mondaire Jones have introduced legislation in their respective houses of Congress, and it's called the Prohibiting Anticompetitive Mergers Act, or PAMA. And while this isn't explicitly about tech companies, Warren has said that she was really keeping those companies in mind in particular while drafting the legislation. Essentially, if this is signed into law, the act would ban any massive mergers that were worth more than five billion dollars, or that would provide market shares beyond twenty-five percent for employers or thirty-three percent for sellers. So it's meant to encourage a more competitive marketplace and prevent companies from consolidating power. And it's also meant to beef up regulator powers when reviewing proposed mergers, and to expand on reasons that regulators might prevent such mergers should they represent any kind of anticompetitive practices. Not only that, but it would also allow for retroactive review and the reversal of acquisitions and mergers. So not only would this give regulators more authority moving forward, they could actually look back on pre-existing deals and force companies to divest themselves of certain acquisitions. So immediately I think about companies like Google and Meta, with Meta positively leaping to mind because that company has a long history of acquiring other social platforms and services that were seen as threats to Meta's claim on our free time. I mean, it's pretty widely known that the reason why Meta, which at the time was, you know, Facebook, purchased Instagram was because Instagram was taking up a lot of people's time and Facebook just couldn't compete. So rather than compete, they bought it.
Speaker 1: Now, this act has got a pretty broad reach, and it's one that I imagine is going to get a lot of resistance before it moves any further through the system. It's likely going to be an uphill battle to pass this bill in the Senate. And there are also no conservative co-sponsors on either version of this bill, so no Republicans have co-sponsored it. That's kind of an indication that we're probably going to see some resistance here. If somehow this does pass into law, it would obviously change things dramatically, not just for tech but for all industry. It's just that we'd probably see it very acutely in the tech space. For one thing, we would see stuff like Microsoft's planned acquisition of Activision Blizzard get nixed right away, because that deal is valued at nearly sixty-nine billion dollars. That's well over the five-billion-dollar threshold. But we'll have to wait and see where this goes from here. One other acquisition that the PAMA legislation would have prevented is Amazon's acquisition of MGM, the movie and television studio. That deal is worth more than eight billion smackaroos, so again above that five-billion threshold. However, the FTC did not make a public response about the deal; there was a period open for that to happen, and the FTC took no action. Also, in the EU, the antitrust regulators already approved the acquisition. So in the EU it got the green light, and it looks like Amazon now owns MGM. Now, this could be one of those deals that the proposed legislation would later reverse, right? Like, if that proposed PAMA Act passes, then presumably, even with this deal going through now, later on regulators could force Amazon to divest itself of MGM. But again, I would be surprised if that PAMA bill actually does become law.
Speaker 1: The EU Commission said that there was very limited overlap between Amazon and MGM, essentially saying that the acquisition would not be anticompetitive, because it's not like they were actually reducing the number of companies that are in the same market. The acquisition will mean that Amazon gets a big boost in content, though not all the content ever created by MGM. That story gets super complicated, because different parties have acquired different sections of MGM's library over the years. But it would include stuff like the James Bond franchise, which is obviously very valuable. I might do a series about the history and evolution of MGM in the future. It's pretty complicated, but I've done similar episodes about Warner Brothers and some other studios, and MGM obviously is another important one. Anyway, this is a continuation of a trend of consolidation in media, and now Amazon Prime Video will have tons of MGM content on it, while Discovery and WarnerMedia are in the process of merging and bringing all their content together. So maybe you could argue that this specific example isn't inherently anticompetitive, but the general trend, I think, perhaps is. Okay, we've got a lot more stories to go through, but before we get to that, let's take a quick break.

Speaker 1: Now to talk about an acquisition that's on a smaller scale. Google is buying a company called Raxium, which produces microLEDs for displays. LEDs are light-emitting diodes, and the microLED space is kind of positioned to compete against the predominant standard-bearer, which is OLED technology. Reportedly, Google's interest in microLEDs is related to the development of augmented reality technology. Google has a long history of dabbling in AR, the Google Glass project being a very early example of that.
Speaker 1: But presumably whatever Google has in mind is going to be a more robust and more immersive approach to AR in the future, and it appears that this microLED tech will play a part in that, in some form of display most likely. And Apple has done similar things; it actually acquired a different microLED company. In fact, I think that happened way back in twenty fourteen. As to when we might actually see products from Google that incorporate this technology, that's probably going to take a while. And considering Google's track record with hardware, it might take even longer for folks to feel comfortable adopting it, because Google has a long history of introducing products only to end support for those products, like, a year or two later. It's hard to get behind the idea of buying hardware from Google simply because there is this long history of Google saying, that didn't work, let's just stop supporting it. And then you've got this very expensive paperweight that, you know, over time can do less and less because of that lack of support.

Speaker 1: Over in Europe, a French cloud services company called OVHcloud has filed a complaint alleging that Microsoft is in violation of antitrust measures in the EU. Now, according to the complaint, Microsoft charges a different rate for its Office suite of products if you use a different cloud platform than Microsoft's own Azure. So, in other words, if you have a customer that wants to use, say, OVHcloud for its cloud platform, but still needs to rely on Microsoft Office products for productivity software, then it would have to pay more for that Office software suite than it would if it just went all in with Microsoft. OVHcloud says this practice is anticompetitive, and that it's meant to discourage customers from seeking out any alternatives to Microsoft. And on the one hand, you might say, you know, bundled services have kind of always been a thing, and typically you pay less when you bundle things together.
Speaker 1: But on the other hand, you have to keep in mind that there are some companies that only provide a subset of those services, like a cloud platform. So, you know, OVHcloud doesn't necessarily have a productivity suite, so they can't compete apples to apples with Microsoft. And so you might really love OVHcloud's product, you might love their customer service, you might think theirs is the superior cloud platform, but you know you probably also need a productivity suite. So you as a customer might be more inclined to just go all in with Microsoft because it's cheaper and easier. And that doesn't mean that Microsoft is providing a better experience. It's just, you know, it's giving you an incentive. Anyway, OVHcloud has filed the complaint, but there's no word as of yet on what happens next, or whether Microsoft has even received a formal notice of the complaint yet.

Speaker 1: Over in California, state lawmakers have proposed new legislation that would give parents the opportunity to sue social networking platforms for getting their kids addicted to social media. And this is a pretty complicated issue, and I can actually view it from a couple of different perspectives. So one perspective is that I could see someone saying it's really the responsibility of parents and schools to monitor kids, and to teach them, and to help them arm themselves with knowledge and prevent them from getting addicted in the first place. So why would we hold companies accountable for things that parents and schools should be doing? And there's some merit to that argument, I would say. But another view that I can see, and I also find merit in, is that companies like, say, Meta devote an awful lot of time to coming up with ways to entice people to come to, and then stay on, their platforms. In fact, almost all of Meta's business hinges on attracting and retaining users for as much of their time as is possible.
Speaker 1: In other words, companies like Meta have built their business around creating addictive experiences. So from that perspective, Meta and other companies like it absolutely bear some of the responsibility for kids getting addicted to social media. So the reality is a complex one, and I think there's actually plenty of accountability to go around. I don't think we can assign it all to just one party. Anyway, this proposed law would allow parents to file class action lawsuits against companies that provided services that their children subsequently became addicted to, and the fine would be about a thousand dollars per child, or twenty-five thousand dollars per child in the case of civil penalty cases. The legislation does allow for some safe harbor protections if companies meet certain requirements that indicate they are taking quote "basic steps to avoid addicting children" end quote. So, in other words, if a company could show that it's at least doing the bare minimum to try and prevent kids from being hooked on their services, then they should be in the clear. Also, companies that generate less than one hundred million dollars in revenue per year would be excluded, and that I find really interesting. I mean, I get the risk of big tech, because, I mean, we've all heard the stories. Like, we've seen the internal documents from Meta, and, you know, the fact is that companies like Meta really are a major issue, a major concern. However, to me, it's a little strange to say: if you happen to create experiences that are meant to be addictive to children, but you make less than a hundred million dollars a year, it's okay. That doesn't sit well with me. But maybe I'm just interpreting this incorrectly.

Speaker 1: Three years after getting into the VR space, Facebook, or I really should say Meta, says it's going to roll out parental supervision controls for VR experiences.
Speaker 1: Now, that's definitely important, because we've already seen several reports about how abusive folks in VR environments can be, amazingly awful to other people, and it can be a really traumatic experience. There's a legit concern for the health and safety of children in VR spaces like Meta's Horizon Worlds, where again we've already seen evidence of predatory behaviors, something you definitely don't want kids to be exposed to. Now, I should add that Meta has stated that the Quest, its main VR product, is intended for users who are thirteen or older, but that doesn't mean younger kids aren't still using it. In fact, there's no real way to prevent that from happening. And heck, thirteen, I would argue, is still pretty freaking young, and the potential harm that we're talking about can be significant. So I think that rolling out these parental controls is an important step. I also think it's well overdue. I have an issue with Meta in general, because I feel like the company treats the general public like guinea pigs. It will roll out services on a really broad scale, and then look to see if they're harmful or not, and then make changes. I feel like it's a little reckless to go that direction. I feel there needs to be maybe more QA and R&D work before you start rolling stuff out to actual human beings out there. But maybe I'm just being too conservative in that case. Anyway, the parental controls will include alerts whenever a kid has made a purchase within VR apps, and will also include a dashboard listing off all the apps that the kid has access to, as well as how much time they're spending in those VR environments, and parents will presumably have some control over all of that. Meta is bringing similar control features and educational material to its other platforms as well, including Instagram.
Speaker 1: The company has pretty clearly been on the defensive since Frances Haugen brought forward concerning internal documents indicating Meta's research into its products' effects on younger users, which seemed to indicate that, you know, they're fairly negative. And we've also been concerned, obviously, about the company's intention to build out products that are specifically targeting younger demographics. So I feel like this is Meta's approach to addressing those concerns. Hopefully those methods that Meta takes will be sufficient. But of course we've got more bad news for Meta, and in this case it's that Congress is calling on Meta to answer for a failure that could harm kids. That failure centers on Facebook Marketplace, the platform's online commerce tool where users can buy and sell various stuff, and at the heart of the matter is the issue of recalled products. The United States Congress says that Meta has failed to prevent the sale of recalled products that could be harmful, and have been harmful, to children, and said that the failure is a quote "remarkable dereliction of duty by your company on behalf of your users" end quote. And this is not the first time that Congress has sent a letter to Facebook urging the company to take action on this very subject. In fact, lawmakers have been saying this to Meta leadership for a few years now, and according to those lawmakers, the company has appeared to do little, if anything, about it. USA Today investigated the issue and found instances of several recalled products appearing on Facebook Marketplace, and these were products that were associated with the deaths of more than one hundred children collectively, which is truly awful to think about. And they're still occasionally being bought and sold on the Facebook Marketplace platform. And this is despite the availability of tools, text and image recognition, that could identify those kinds of listings and at least allow Facebook to block them.
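To make the text-recognition side of that concrete, here's a minimal sketch of the kind of listing filter being described. Everything here is hypothetical: the product names are invented placeholders, and a real system would pull recall data from something like the CPSC's recall database and pair it with image recognition. This is just to show why matching listings against recalled product names is technically feasible.

```python
import re

# Hypothetical recalled product names; a real system would pull these
# from an official recall database, not a hard-coded list.
RECALLED_PRODUCTS = [
    "acme infant inclined sleeper",
    "acme foldable baby gate",
]

def normalize(text: str) -> str:
    """Lowercase and strip punctuation so minor listing variations still match."""
    return re.sub(r"[^a-z0-9 ]+", " ", text.lower()).strip()

def is_recalled(listing_title: str) -> bool:
    """Flag a listing when every word of a recalled product name appears
    in the title. Crude, but it shows the basic idea of a text filter."""
    title_words = set(normalize(listing_title).split())
    return any(set(product.split()) <= title_words
               for product in RECALLED_PRODUCTS)

print(is_recalled("ACME Infant Inclined Sleeper - barely used!"))  # True
print(is_recalled("Wooden rocking chair"))                         # False
```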
Speaker 1: So, in other words, it is entirely possible for Meta to log and subsequently remove these posts, and yet the company fails to do so. However, let's talk about the law for a second. So currently in the United States, it is illegal to sell recalled products. Like, if there's been a toy that's been recalled because it turns out it's toxic and kids could get sick if they were playing with it, it is illegal to sell that. However, Facebook isn't actually selling anything in Facebook Marketplace. It is an online platform. It's facilitating sales, so it gives people a place where they can post and shop for stuff, but Facebook itself isn't the merchant. So, just as websites shouldn't be held responsible for the stuff that users post to those websites, Facebook shouldn't be held liable for illegal products sold on Marketplace, because again, it's not like Facebook's the vendor. That's as the law stands now. Now, there are some lawmakers in the US who are looking to draft legislation that would require online stores to take greater steps to prevent illegal sales in the future. I imagine that Facebook and lots of other online store companies will take a few steps to address issues like this before that goes much further, because, generally speaking, we typically see businesses and industries tackle these issues themselves before regulators get involved.

Speaker 1: And wrapping up on our let's-dump-on-Meta segment, let's have a somewhat goofy news item. Instagram issued a twenty-four-hour ban on Kanye West because something he posted violated Instagram's policies. Now, the specific post wasn't singled out; however, the general consensus is that it was likely to do with a post Kanye made about Trevor Noah, the host of The Daily Show.
Speaker 1: Trevor Noah had made some humorous commentary about the ongoing drama between Kanye West, Kanye West's ex-wife Kim Kardashian, and Kim Kardashian's boyfriend Pete Davidson. And y'all, I'm old and I'm out of touch, so I am not the best person to give a full rundown on all the drama that's gone on here. I'm aware of it, but I haven't really been following it. However, I do think that, based on my limited knowledge of Kanye West, this was a very Kanye thing to have happened. All right, we've got more news items to get through. Before we get to that, let's take another quick break.

Speaker 1: We're back. The site Fast Company has a really interesting article. It has the headline "To Fight Disinformation, Follow the Money and the Ads," and it was written by Rob Pegoraro. It's about an organization called Check My Ads, and it was founded by Nandini Jammi and Claire Atkin. The focus of this organization is to defund disinformation, and the two co-founders made the case that disinformation campaigns are funded, that they make money, not just by posting on social networks. Like, that's not really where the money comes in for these organizations; if that were all they depended upon for revenue, they would fade away, because they don't make enough that way. Instead, it's attracting people to owned-and-operated websites that heavily feature advertising, you know, web ads. That's the basic revenue plan we see across the web. And in many of these cases, companies that use certain ad exchanges, that is, a company that brokers ads, right, like it takes in orders from clients and then places those ads on websites, well, certain ad exchanges might end up running ads against content that their clients would rather not be associated with. There have been lots of cases of this.
Speaker 1: In fact, the Sleeping Giants organization, which is related in a way, at least it's connected to Check My Ads, has long been pointing out how ads for certain companies have appeared on, let's say, controversial sites, with the question of, hey, are you okay with your ad being posted against this article? And that has often resulted in various companies pulling their ads from those sites and telling ad exchanges, hey, we don't want you to place our ads on sites like this one. So that's kind of what Check My Ads is all about: helping identify where these instances are happening, in an effort to defund those websites and take away the money that allows these disinformation campaigns to exist in the first place. And the whole article is well worth a read. The co-founders make a plea to business owners. They tell business owners, hey, you need to be really good stewards of your brands. You need to be aware of where your advertising is going, and you need to take a more proactive approach to being sure that the sites where your ads are showing up align with the vision of your company. So, interesting article; I recommend you check it out.

Speaker 1: Hey, do you log into Netflix using someone else's account? Because I know that happens a lot. I've actually got friends who have casually mentioned that they use a login that belongs to someone else in order to get on Netflix. Obviously, Netflix would really prefer people not do that. Netflix is always going to want everyone to have their own subscription, and now it sounds like Netflix is looking at a way to actively discourage that behavior. Namely, Netflix is going to try and identify accounts that are shared across multiple users and then charge those accounts more for a Netflix subscription.
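Netflix hasn't said how that identification actually works, so here's a purely hypothetical sketch of one plausible approach: flag accounts whose sign-ins cluster into more distinct "households" than a threshold allows. The event data, the household keys, and the threshold are all invented for illustration; they are not Netflix's method.

```python
from collections import defaultdict

# Hypothetical sign-in events: (account_id, household_key). In practice a
# household key might be derived from IP subnet plus device fingerprint;
# these records and the threshold are invented for illustration only.
signins = [
    ("acct1", "home-wifi"),
    ("acct1", "home-wifi"),
    ("acct1", "college-dorm"),
    ("acct1", "parents-house"),
    ("acct2", "home-wifi"),
]

MAX_HOUSEHOLDS = 2  # accounts seen in more households than this get flagged

def flag_shared_accounts(events):
    """Group sign-ins per account and flag accounts that appear to span
    too many distinct households."""
    households = defaultdict(set)
    for account, household in events:
        households[account].add(household)
    return [a for a, h in households.items() if len(h) > MAX_HOUSEHOLDS]

print(flag_shared_accounts(signins))  # ['acct1']
```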
Speaker 1: So if you are borrowing login information, you might find that that login password has mysteriously changed, because it might turn out that your parents, or your old college roommate, or whomever, doesn't want to have to pay more for their Netflix account just so your freeloading butt can watch the next season of Love Island or whatever. I don't know; I don't even watch Netflix anymore. But at the moment, Netflix is just doing a limited test of this policy in Costa Rica, Chile, and Peru, and the company has also stated that it will let account holders allow the folks sharing the account to quote "transfer profile information either to a new account or an Extra Member sub account, thus keeping the viewing history, My List, and personalized recommendations" end quote. Presumably Netflix will monitor the effects of this policy in the test countries before tweaking it and then rolling it out worldwide. But this is just a heads-up if you happen to rely on a shared account to access Netflix. Also, I imagine other streaming services are going to watch this closely, and I would not be shocked to see similar policies on other services follow suit.

Speaker 1: At long last, the James Webb Space Telescope has taken the first fully focused image of a star, in what was essentially a test to make sure that the telescope is working properly. And it is, and that's not a trivial thing. I mean, way back when the Hubble Space Telescope first came online, NASA was somewhat dismayed to discover that a manufacturing error meant the telescope was out of focus, and there was no way to fix it without sending people up there. So NASA sent up missions of astronauts to install some additional equipment on the Hubble and address those errors. That's something that's not possible with the James Webb Space Telescope, because it's at a much further-out orbit. It's beyond the Moon, and we can't get there.
Speaker 1: So the test image was of a star that has a name that involves so many numbers, there's no point in me saying the name of the star here, because honestly it would sound like I was making a joke. You'd say, oh, he's just rattling off random numbers at this point. So it doesn't have an easy name like Bob. Anyway, the shutter speed on the imaging equipment was beyond slow; the exposure lasted thirty-five minutes. So imagine for a moment that you have to sit perfectly still for thirty-five minutes while you're taking your next selfie. But that long exposure time allows the telescope to collect as much light as it can in order to image incredibly faint objects out in space. The test is a great indication that the telescope will work as it was designed. Mind you, though, it will still take several weeks of minor adjustments and tweaking to tune the telescope so that it can get down to some serious work, cosmologically speaking. Considering all the delays and challenges around this particular space telescope, it's pretty astounding to see it working like this, and it really stands as a triumph of science and engineering.

Speaker 1: Finally, I'll have to do a full episode about this next news item, but I just wanted to share a quick reference to it. So IBM, NVIDIA, and various researchers published a paper with the wonderful title "BaM: A Case for Enabling Fine-grain High Throughput GPU-Orchestrated Access to Storage." And in this case, BaM doesn't refer to Emeril Lagasse throwing garlic on something. It stands for Big Accelerator Memory, and the goal is to remove something that can otherwise throttle the machine learning process, which is the connection between processors and storage. The BaM system proposes a design that connects GPUs, or graphics processing units, straight to SSDs, or solid-state drives, so the SSDs serve as memory and storage.
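To picture the difference BaM is after, here's a toy Python model of the two access patterns. The paper's actual system involves GPU threads driving NVMe queues directly, which plain Python obviously can't do; this sketch only contrasts coarse-grain staging (read the whole dataset before the accelerator can touch any of it) with fine-grain access (fetch only the blocks the compute actually needs). The file, block size, and block list are all made up for illustration.

```python
import os
import tempfile

BLOCK = 4096        # bytes per block, standing in for an SSD page
NUM_BLOCKS = 1024   # a 4 MiB throwaway "dataset"

# Build a throwaway dataset file standing in for the SSD.
path = os.path.join(tempfile.mkdtemp(), "dataset.bin")
with open(path, "wb") as f:
    f.write(os.urandom(BLOCK * NUM_BLOCKS))

needed = [3, 17, 511]  # blocks the compute threads actually want

# Conventional, CPU-orchestrated path: stage the entire file into memory
# first, then hand it over to the accelerator.
with open(path, "rb") as f:
    staged = f.read()
coarse_bytes = len(staged)

# BaM-style fine-grain access: each request fetches only the block it
# needs, directly from storage, with no bulk staging step.
fine_bytes = 0
with open(path, "rb") as f:
    for b in needed:
        f.seek(b * BLOCK)
        fine_bytes += len(f.read(BLOCK))

print(f"coarse staging read {coarse_bytes} bytes")     # 4,194,304
print(f"fine-grain reads fetched {fine_bytes} bytes")  # 12,288
```

When the working set is a tiny, unpredictable slice of a huge dataset, which is common in ML workloads, the staging path wastes most of its bandwidth, and that's the traffic jam the paper targets.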
Speaker 1: And because GPUs typically emphasize parallel processing applications, they are well suited for machine learning applications. And keep in mind that machine learning is all about training computer systems to do something, and it can really be just about anything. For example, you might use very large data sets to train a computer system to look for signs that criminal organizations are attempting to launder money using cryptocurrency. That could be one use of machine learning. It doesn't have to be that weighty; you could train a system to become better at natural language recognition and be able to interact with people more naturally. Historically, the path between processors and storage has created kind of a traffic jam in machine learning applications, so this proposed architecture could alleviate that. And like I said, I'll have to read up on this further and maybe do a full episode about it in the future. In the meantime, if you have any suggestions for future episodes of TechStuff, I welcome you to reach out and let me know about your ideas. The best way to do that is over on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.