Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? I'm a little out of breath because, as you might have heard if you listened to yesterday's episode, I, despite all of my precautionary behaviors, apparently caught COVID, and that's all sorts of fun. But the show, as they say, must keep going. I think. I don't know, I'm on painkillers. Let's talk about the news for today, Tuesday, February 8, 2022, and let's start off with some updates to a few stories that I have talked about in recent episodes. First up, the Nvidia bid for the semiconductor company ARM is officially off the table. Nvidia had been pursuing a sixty-six billion dollar acquisition of ARM, the British semiconductor company, but encountered numerous regulatory obstacles along the way. Now, I mentioned last week that it looked like this deal was falling apart, and now we can definitively say that the deal fell apart.
Relatively recently, we've seen regulators in regions like the European Union, the United States, and the United Kingdom become more wary of large acquisitions and mergers, particularly in the tech world, though not exclusively in tech, and the proposed acquisition appears to be a victim of that increased scrutiny. I'm sure this comes as something of a relief to companies like Microsoft and Qualcomm, both of which depend upon ARM-based chips, and neither of which would want to see Nvidia get control of that part of the supply chain. As for ARM, its current owner, the Asian conglomerate SoftBank, plans to prepare ARM for an initial public offering later in the year, that is, to spin it off as a publicly traded company.
Now we might see another tug of war happen on that front, because in the UK a lot of officials are really eager to see ARM, a British company, listed on the UK's stock market, but SoftBank seems inclined to instead list ARM on the New York Stock Exchange, where it might see a slightly better initial share price, so SoftBank can make back some of the money it spent acquiring ARM in the first place. Politics and business are fun.

Continuing to update stories: earlier this month, news broke that the United States Internal Revenue Service, or IRS, everyone's favorite department in the US, was planning to use a third-party company to authenticate people trying to access some online IRS services, and that it would be doing so through facial recognition technology. The plan would have required people to submit a video selfie, using a camera either on their phone or on a webcam, to verify their identity before they would be able to use certain features on the IRS website.
Critics objected to this practice pretty quickly, pointing out that facial recognition technology has often proven to be unreliable, particularly for anyone who doesn't happen to be male and white. There's a pervasive issue with bias in facial recognition, and some products have more of a problem with this than others. In addition, critics argued that the requirement would put an unfair burden on people who don't have access to smartphones or webcams, and thus create a deeper digital divide between the haves and the have-nots, and because, you know, everyone's supposed to file taxes, that's a real problem. And then there's the concern for privacy and security that comes along with having a private, third-party company work so closely with a government service like the Internal Revenue Service. That seems to raise a few more concerns. And now the IRS has walked back its decision, saying it's going to transition away from using this facial recognition technology, which had already started to roll out, and that it will, quote, "develop and bring online an additional authentication process that does not involve facial recognition," end quote.
Not too long ago, I dedicated an episode of TechStuff to talking about Peloton, the company best known for its connected exercise bikes. It also does treadmills, and you take online spin classes from the comfort of your own basement, I mean, home. Peloton has had a pretty dramatic fall from grace over the last few months, with the company's sales slowing dramatically after a big spike early on in the pandemic. There are reportedly warehouses filled with Peloton products that just haven't sold, and there are also some PR issues that were exacerbated by a couple of fictional characters suffering fictional heart attacks after working out on the bikes. Because that's the kind of world we live in, I guess, where a fictional character's death on a show can actually put a stigma on a company. Anyway, John Foley, the founder of Peloton and now former CEO, announced that he was stepping down from his leadership role.
Well, Foley has led Peloton in that role ever since he came up with the idea more than a decade ago, and he will instead become the executive chair of the board of directors, so still very much, you know, part of the company. And a man named Barry McCarthy, who had served as chief financial officer of Spotify, another company that's been in the hot seat recently, will become the new president and CEO of Peloton. Peloton's current president, William Lynch, is also stepping down. Foley and Lynch aren't the only two people at Peloton who will be updating their resumes. The Wall Street Journal reports that the company expects to cut about twenty percent of its workforce; that's around two thousand eight hundred jobs. There have also been calls from activist shareholders to sell the company in order to get a little money back for those shareholders, because, of course, Peloton's share price has dropped more than fifty percent this year. Although I saw a headline just before I started recording that suggests a lot of movement on that front, so it may be different by the time you hear this.
Now, I just mentioned Spotify, so let's talk about that for a moment. First of all, Spotify and iHeartRadio, or iHeartMedia, are competitors, and I work for iHeartMedia. I'm saying that up front because context is important, and also to say I'm not speaking on behalf of my employer in any way. I honestly don't know what anyone official at iHeart thinks about what's going on at Spotify, and there's certainly no official company position that I can point to. So this is just me here. All right, with that out of the way: Spotify has been trying to walk a tightrope over the last few weeks, all thanks to Joe Rogan. Spotify famously signed Joe Rogan to an exclusive deal for his podcast for the princely sum of one hundred million dollars. It's a heck of a deal. So to get Rogan's podcast, you had to use Spotify. Anyway, a couple of weeks ago, several musicians began to pull their musical catalogs off of Spotify because of Rogan's tendency to spread COVID misinformation on his show. The musicians expressed concern about the dangers this posed, considering Rogan's immense popularity.
So Spotify's response was essentially: you know, it's Joe Rogan's show. We just license the show from him. We don't take a hand in shaping the content; we're not editors. Who he chooses to have on his show and what he talks about are all up to him. So we're just the platform, and as long as he doesn't violate our rules, the content stays. Those rules, by the way, weren't public facing. There was no way to see whether content violated the rules if you were outside Spotify, so you couldn't judge that. The company subsequently published those rules, so now they are public facing, but at the time they weren't. And then the artist India.Arie pulled her music off the platform because of Rogan's history of using racial slurs on his show. She actually shared a video montage of him doing so over the course of numerous episodes, and this time Spotify stepped in and had behind-the-scenes discussions with Rogan on the matter, and subsequently Rogan pulled down more than a hundred of his past episodes.
Now, The Verge published an article about this titled "Spotify is more confused about Joe Rogan than ever," and The Verge pointed out that the racial slurs, while horrible, don't actually seem to violate Spotify's rules as they are written. It would take some loose interpretation of the rules to say, oh, this covers racial slurs as well. Not that racial slurs aren't bad, they're terrible, but rather that it didn't seem like a direct violation of what Spotify's rules were. So The Verge poses the question: why can Spotify pressure Rogan to pull down those episodes in that case, but do nothing in the case of COVID misinformation? And this is a really tough situation. On the one hand, creators value having freedom and authority over their own work. I get to choose what topics I cover for my show and how I cover them. iHeart does not pressure me to do otherwise. You know, I might once in a while get a request like, hey, would you do a themed episode on such and such?
But I can actually say yes or no to that, and the topic for any given day is one that I just chose myself, and it's my own way of expressing my thoughts on those topics. On the other hand, spreading misinformation, promoting harmful portrayals of people of color, or giving white supremacists a platform, which Rogan has done in the past, undeniably causes harm. And so you're left asking: is it better to place tighter restrictions on content and affect the creator's process, or is it better to have the creator be free to send out whatever message they want, even if that message is harmful? Now, as a creator, I actually think tighter restrictions are the better alternative. I mean, I value my ability to say stuff, but I don't think that ability is more important than the safety and dignity of other people. I think that's way more important than, oh God, you're not letting me say this word that I shouldn't say. That's my own opinion, just my own personal opinion, and again, not meant to be any kind of official stance on the matter.
All right, we've got some more news stories to cover, but before we get to that, let's take a quick break.

According to Euronews, authorities in the EU are looking at the concept of the metaverse with regard to what, if any, regulations will need to be in place to protect citizens in what some call the future of the internet. In fact, Margrethe Vestager, and I apologize for butchering that name, she's an EU commissioner from Denmark, went so far as to say, quote, "The metaverse is here already, so of course we start analyzing what will be the role for a regulator, what is the role for our legislature," end quote. I actually take issue with her statement that the metaverse is here already, because no one has really defined what the metaverse is or will be. There are a lot of kind of vague ideas and proposals of what the metaverse will actually mean.
We don't yet have something that we can definitively point to and say, that is a metaverse. There are some examples of stuff that have elements of some of those ideas; I mean, Minecraft, Roblox, and even Second Life come to mind as having some aspects of what people refer to when they're talking about metaverse concepts, but none of those seem to actually encompass everything that is brought to mind when that vague term, metaverse, is mentioned. However, there are definitely a lot of companies rushing into the metaverse space, convinced that it is in fact going to be the future of the internet. I'm still skeptical about that personally, but I have a long history of being wrong about this kind of stuff, and besides, I'm getting older and grouchier every day, so that could be affecting my perception. Anyway, it's clear that EU officials are looking ahead and trying to anticipate what sort of regulations will need to be in place in order to ensure that the metaverses, whatever those turn out to be, play by the EU's rules about citizen data, privacy, and security.
The EU has been very forward on those fronts, pushing for stronger and stronger regulations. And turning from the metaverse to Meta, you know, the company formerly known as Facebook, we have a few more stories. For one, Meta has said it might actually shut down operations in Europe for Facebook and Instagram; that is, the folks in Europe may one day find they can no longer access those platforms. Now why is that? Well, it all has to do with those data privacy laws in the EU I just referenced. See, Facebook currently transfers data back and forth between the EU and the United States, or really, between the United States and everywhere else. Like, if you lived in Africa, that data would be making its way back to the United States for processing and analyzing. And so there are these transatlantic data transfers that happen so that Facebook can, quote unquote, "offer services" to European users. And by offer services, I suspect we're not just talking about features on Facebook and Instagram, but stuff like, you know, targeted advertising; that's part of what the data is being used for.
But the EU's data protection laws are closing off the avenues that Facebook can legally use to transfer data from the European Union to the United States, largely because there are concerns that such data transfers could end up being mined by organizations like the National Security Agency, or NSA, here in the United States, and that EU citizens should not be subjected to that, especially without their consent. So that's what's at the heart of this. The EU has started to shut down some of the avenues that Facebook would use to send data back and forth, which would mean Facebook would have to silo information in the European Union; it would have to set up operations in the EU specifically to handle all that data in order to keep its services essentially the same as they are now, rather than just shipping that data over to the United States. So Meta is essentially saying: hey, if we can't transfer data back to the United States so that we can make use of it, we can't operate in the EU, so we're going to shut down.
Now, I should also add that reps at Meta have said the company doesn't actually have plans to shut down operations in the EU. Instead, they're sending the message that there needs to be some sort of official structure in place so that the company can continue on with business as usual. Meanwhile, reps in the EU are essentially saying: you need us more than we need you. And in fact, Meta pulls a meaningful share of its ad revenue from Europe; about half of all revenue comes from North America, and the remainder gets divvied up around the rest of the world. So the United States and Canada are the most important regions for Facebook when it comes to revenue, but the EU is still significant. If Meta were not to comply, it could face some pretty significant fines in Europe. And at the moment, the matter is still working its way through regulatory processes, so it may turn out this whole kerfuffle dies without much happening, but we'll have to keep an eye on it. Okay.
Another story with Meta is that Peter Thiel, the venture capitalist who has long held a seat on Meta's board of directors, is stepping down from the board. Thiel himself is a controversial figure, having become a prominent supporter of conservative politicians, including former US President Donald Trump. Now, I say controversial because Trump, of course, got into hot water on Facebook and other online platforms by repeatedly violating the policies of those platforms, including Facebook's, and that necessitated his removal from those platforms. In fact, I think Facebook was the first one to do it. So, you know, with Peter Thiel being a supporter of Donald Trump, there was this question about whether his influence at the board level would cause Facebook to make bad decisions regarding its content moderation policies, and there were critics who were calling upon CEO Mark Zuckerberg to cut ties with Thiel, but Zuckerberg didn't do that.
Anyway, Thiel now appears to be interested in getting into politics himself, so he has stepped down from Facebook's board, which would absolutely be necessary before he could run for any kind of office without, you know, tons of people jumping on his case for having conflicts of interest. So he's been a very important figure in Facebook's history, and one who has had an increasingly prominent role in supporting politics in the United States. As for Meta's version of the metaverse, the company has recently made a change to its virtual reality social space called Horizon. It's got a couple of different Horizon products out there, like Horizon Worlds, and the change now creates a virtual personal space perimeter around each avatar. And you might wonder why. Well, it's because people can be really awful, particularly in online spaces. We've already seen early users of Horizon come forward with complaints that they were being harassed in virtual spaces, some of them saying that others were attempting to virtually grope them.
And you know, that might sound odd or maybe even funny to you, but when you think about it, it really is disturbing. First, you have no idea what someone else has been through in their life, and a virtual action like that could be triggering. If someone actually has been the target of sexual harassment or assault in their past, a virtual representation of that is incredibly distressing. Beyond that, a good virtual experience is really immersive. I mean, you know, there are therapists out there who use virtual reality to help treat people who have various phobias, as a kind of immersion therapy where the person who has the phobia can be exposed to a virtual representation of whatever it is that triggers the fear and get more accustomed to it, without having to actually go and experience it in person firsthand. They have the safety net of it being a virtual experience. But I can tell you, people have had their bodies react just as if they were in the real situation. So we know that these virtual experiences can have real psychological effects on us.
So if 323 00:20:08,880 --> 00:20:12,199 Speaker 1: you are the victim of a virtual assault, it can have a 324 00:20:12,240 --> 00:20:17,440 Speaker 1: real psychological effect, even though, you know, you're ostensibly safely 325 00:20:17,520 --> 00:20:21,000 Speaker 1: at home in, say, your game room or office or whatever, 326 00:20:22,080 --> 00:20:26,160 Speaker 1: and nothing quote unquote real is happening to you. 327 00:20:26,320 --> 00:20:29,160 Speaker 1: But yeah, having people blatantly deny you your personal space 328 00:20:29,240 --> 00:20:34,440 Speaker 1: is just awful. So this new system in Horizon creates 329 00:20:34,440 --> 00:20:38,840 Speaker 1: a two-foot-radius perimeter around every avatar by default, 330 00:20:39,520 --> 00:20:42,320 Speaker 1: and that is meant to set the stage for behavioral 331 00:20:42,480 --> 00:20:46,560 Speaker 1: norms within the metaverse. Now, keep in mind, this is 332 00:20:46,640 --> 00:20:50,480 Speaker 1: just one way people could harass one another in virtual space, 333 00:20:50,600 --> 00:20:53,520 Speaker 1: right? To get all up into someone else's personal space, 334 00:20:53,560 --> 00:20:57,080 Speaker 1: that's one way that you could really harass somebody, but 335 00:20:57,400 --> 00:21:00,199 Speaker 1: there are lots of other ones, and Meta executives have 336 00:21:00,280 --> 00:21:04,920 Speaker 1: already indicated that moderating the metaverse to counteract toxicity will 337 00:21:04,960 --> 00:21:08,560 Speaker 1: be difficult, and actually, according to Andrew Bosworth of Meta, 338 00:21:09,160 --> 00:21:11,800 Speaker 1: if you're talking about a metaverse at scale, it will be 339 00:21:11,840 --> 00:21:17,720 Speaker 1: practically impossible, which does not sound particularly fun to me. Okay, 340 00:21:18,000 --> 00:21:20,520 Speaker 1: we've got a few more news stories to cover before 341 00:21:20,560 --> 00:21:30,879 Speaker 1: we close out, but let's take another quick break.
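The personal-space perimeter described here boils down to a distance check between avatars. A minimal sketch follows; the coordinates, function name, and flat two-dimensional layout are illustrative assumptions, not anything from Meta's actual system:

```python
import math

PERSONAL_RADIUS_FT = 2.0  # the default two-foot radius mentioned above


def violates_boundary(avatar_a, avatar_b):
    """Return True if avatar_b is inside avatar_a's personal-space perimeter.

    Positions are (x, y) tuples in feet. Hypothetical layout for illustration.
    """
    return math.dist(avatar_a, avatar_b) < PERSONAL_RADIUS_FT


# With a 2 ft radius around each avatar, two avatars end up kept
# roughly 4 ft apart before either one's boundary is crossed.
print(violates_boundary((0.0, 0.0), (1.5, 0.0)))  # True: inside 2 ft
print(violates_boundary((0.0, 0.0), (2.5, 0.0)))  # False: outside
```

A real system would also have to decide what to do on a violation, such as stopping the intruding avatar's movement at the boundary rather than merely flagging it.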
Let 342 00:21:30,960 --> 00:21:34,040 Speaker 1: us turn our attention now to Apple. The company is 343 00:21:34,119 --> 00:21:37,720 Speaker 1: also coming under fire recently, as eight state treasurers in 344 00:21:37,720 --> 00:21:42,200 Speaker 1: the United States have contacted the Securities and Exchange Commission, 345 00:21:42,400 --> 00:21:46,680 Speaker 1: or SEC, over allegations that Apple has been forcing employees 346 00:21:46,720 --> 00:21:50,840 Speaker 1: to sign non-disclosure agreements, more commonly referred to as 347 00:21:50,880 --> 00:21:56,399 Speaker 1: NDAs, and using them as quote unquote concealment clauses. So, 348 00:21:56,440 --> 00:22:00,399 Speaker 1: in other words, the allegation is that Apple coerces 349 00:22:00,520 --> 00:22:04,920 Speaker 1: employees to sign an agreement that says the employee will 350 00:22:04,960 --> 00:22:08,920 Speaker 1: not reveal the existence of unlawful acts committed by people 351 00:22:08,960 --> 00:22:12,399 Speaker 1: within the company or facilitated by the company itself, that 352 00:22:12,520 --> 00:22:16,000 Speaker 1: you are not allowed to talk about that outside of 353 00:22:16,359 --> 00:22:21,080 Speaker 1: specific contexts within the company itself. You could not 354 00:22:21,160 --> 00:22:23,320 Speaker 1: go to a lawyer, you could not go to the press. 355 00:22:23,880 --> 00:22:26,840 Speaker 1: You weren't supposed to talk about it to anyone. And the 356 00:22:26,920 --> 00:22:31,399 Speaker 1: letter from these state treasurers urges the SEC to 357 00:22:31,480 --> 00:22:36,000 Speaker 1: form rules that expressly forbid companies from using NDAs in 358 00:22:36,080 --> 00:22:40,920 Speaker 1: order to silence employees and prevent reports of discrimination, harassment, 359 00:22:41,000 --> 00:22:45,240 Speaker 1: and other illegal activities from being exposed outside the company.
Further, 360 00:22:45,880 --> 00:22:49,840 Speaker 1: the treasurers are accusing Apple of not just using NDAs 361 00:22:49,880 --> 00:22:52,960 Speaker 1: in this way, but also then lying about that practice, 362 00:22:53,240 --> 00:22:57,840 Speaker 1: because Apple has said that it does not require employees 363 00:22:57,960 --> 00:23:01,840 Speaker 1: to agree to concealment clauses, and so these allegations are 364 00:23:01,840 --> 00:23:06,240 Speaker 1: stating that that's categorically false, and that Apple then misrepresented 365 00:23:06,280 --> 00:23:12,360 Speaker 1: that when talking to official US regulators. The treasurers are 366 00:23:12,359 --> 00:23:15,720 Speaker 1: now calling for an investigation and potentially for action against Apple 367 00:23:15,760 --> 00:23:20,240 Speaker 1: should those allegations prove to be true. So that's still, 368 00:23:20,800 --> 00:23:23,359 Speaker 1: you know, again, working its way through that process. We 369 00:23:23,400 --> 00:23:26,840 Speaker 1: don't have anything hard to report on that yet, but 370 00:23:27,200 --> 00:23:29,920 Speaker 1: it does seem to be an escalation of what has 371 00:23:29,960 --> 00:23:34,119 Speaker 1: been called the AppleToo movement, t-o-o, like MeToo. 372 00:23:35,720 --> 00:23:39,560 Speaker 1: Over in Europe, Apple is facing more fines. The Authority 373 00:23:39,680 --> 00:23:43,240 Speaker 1: for Consumers and Markets in the Netherlands has issued 374 00:23:43,280 --> 00:23:46,480 Speaker 1: a fine for five million euros for the third week 375 00:23:46,560 --> 00:23:49,240 Speaker 1: in a row, stating that Apple has failed to comply 376 00:23:49,359 --> 00:23:52,640 Speaker 1: with an order to allow dating apps to offer alternative 377 00:23:52,640 --> 00:23:57,560 Speaker 1: payment systems to Apple's own in-app system.
This ties 378 00:23:57,640 --> 00:24:00,800 Speaker 1: in with a global story of Apple facing pressure to 379 00:24:00,880 --> 00:24:05,199 Speaker 1: relinquish some control of the in-app experience in iOS. So, 380 00:24:05,240 --> 00:24:08,479 Speaker 1: in case you're not familiar with that, Apple's policy was 381 00:24:08,520 --> 00:24:12,760 Speaker 1: that for developers who created apps for iOS devices, 382 00:24:12,880 --> 00:24:16,159 Speaker 1: any of their apps that included in-app purchases would have 383 00:24:16,240 --> 00:24:19,400 Speaker 1: to use Apple's payment structure. So, in other words, let's 384 00:24:19,440 --> 00:24:23,080 Speaker 1: say I create a game for iOS devices, and within 385 00:24:23,119 --> 00:24:27,520 Speaker 1: the game, players can purchase different skins for their character 386 00:24:27,840 --> 00:24:30,600 Speaker 1: so that they have a different appearance. Well, I would 387 00:24:30,680 --> 00:24:34,040 Speaker 1: have to use Apple's in-app payment structure for that, 388 00:24:34,119 --> 00:24:37,480 Speaker 1: and that would give Apple a cut of each transaction, 389 00:24:37,720 --> 00:24:42,320 Speaker 1: which ranged from fifteen to thirty percent, depending upon the size of the 390 00:24:42,359 --> 00:24:45,959 Speaker 1: developer that was, you know, submitting the app. And 391 00:24:46,000 --> 00:24:50,520 Speaker 1: because Apple controls the entire ecosystem for iOS apps, you 392 00:24:50,600 --> 00:24:53,960 Speaker 1: either played by Apple's rules or your app would not 393 00:24:54,040 --> 00:24:57,240 Speaker 1: get carried by Apple.
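As a rough illustration of that cut, here is the split of a single in-app purchase under the commonly cited fifteen and thirty percent commission tiers. The function, the price, and the tier rule are illustrative assumptions, not quotes from Apple's actual fee schedule:

```python
def apple_cut(price, small_developer=False):
    """Illustrative split of an in-app purchase: 15% for smaller
    developers, 30% otherwise. Returns (platform_cut, developer_take).
    The tier rates are assumptions for illustration.
    """
    rate = 0.15 if small_developer else 0.30
    cut = round(price * rate, 2)
    return cut, round(price - cut, 2)


print(apple_cut(10.00))                        # (3.0, 7.0)
print(apple_cut(10.00, small_developer=True))  # (1.5, 8.5)
```

The point of the example is just how much the tier matters: on the same sale, the developer's take differs by fifteen percent of the price.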
And of course, you know, people 394 00:24:57,320 --> 00:25:01,640 Speaker 1: know that the folks who own iPhones are more prone 395 00:25:01,760 --> 00:25:06,000 Speaker 1: to in-app purchases than folks on other platforms. Like, even though 396 00:25:06,280 --> 00:25:09,800 Speaker 1: there are way more Android systems out there, the people 397 00:25:09,800 --> 00:25:12,840 Speaker 1: who spend the most money are those who have iOS systems. 398 00:25:13,080 --> 00:25:18,760 Speaker 1: So that would represent a pretty lucrative vein of revenue 399 00:25:18,760 --> 00:25:22,280 Speaker 1: for Apple. So Apple does not want to move away 400 00:25:22,320 --> 00:25:25,560 Speaker 1: from that, obviously, because that revenue is an important stream 401 00:25:25,640 --> 00:25:28,439 Speaker 1: for the company. So Apple has been resisting the push 402 00:25:28,840 --> 00:25:33,120 Speaker 1: in various regions. They've been appealing court decisions that have 403 00:25:33,480 --> 00:25:38,600 Speaker 1: ruled against Apple, commanding the company to allow alternative payment systems, 404 00:25:39,240 --> 00:25:41,840 Speaker 1: and the company has been trying to reverse this trend, 405 00:25:43,000 --> 00:25:47,240 Speaker 1: with even the CEO making personal calls to lawmakers, but 406 00:25:47,359 --> 00:25:50,719 Speaker 1: over in the EU, regulators are not budging, and they 407 00:25:50,760 --> 00:25:54,240 Speaker 1: continue to levy fines on Apple for not complying or 408 00:25:54,400 --> 00:25:57,720 Speaker 1: failing to show evidence that the company is complying. The 409 00:25:57,760 --> 00:26:01,400 Speaker 1: current bout of fines is, as I mentioned, five million euros a week, 410 00:26:01,720 --> 00:26:04,679 Speaker 1: with a maximum of fifty million euros, which is 411 00:26:04,720 --> 00:26:07,440 Speaker 1: a lot of money, but for Apple it's kind of not.
412 00:26:08,280 --> 00:26:11,320 Speaker 1: I mean, Apple's revenue in twenty twenty-one was a reported three 413 00:26:11,359 --> 00:26:16,240 Speaker 1: hundred sixty-five point eight billion dollars, and the profit 414 00:26:16,800 --> 00:26:21,000 Speaker 1: was nearly ninety-five billion dollars. So I'm not sure how 415 00:26:21,040 --> 00:26:23,520 Speaker 1: seriously Apple is taking this just yet, or if the 416 00:26:23,560 --> 00:26:25,960 Speaker 1: company is just more focused on fighting the legal battle 417 00:26:26,040 --> 00:26:30,680 Speaker 1: to push back against this trend. And in Apple future 418 00:26:30,720 --> 00:26:33,760 Speaker 1: news, the company has long been rumored to be working 419 00:26:33,760 --> 00:26:37,640 Speaker 1: on a vehicle design, sometimes called the Apple Car or, 420 00:26:37,760 --> 00:26:42,119 Speaker 1: cheekily, sometimes the iCar, and recently an Apple patent 421 00:26:42,200 --> 00:26:45,720 Speaker 1: revealed that the company intends to incorporate machine learning into 422 00:26:45,760 --> 00:26:49,159 Speaker 1: the design of the vehicle for the purposes of autonomous operation. 423 00:26:50,640 --> 00:26:54,600 Speaker 1: So essentially, the conclusion is that the processors of today 424 00:26:54,920 --> 00:26:58,879 Speaker 1: are just not up to the task of handling situations 425 00:26:58,920 --> 00:27:02,399 Speaker 1: that can arise while driving. They're not fast enough to 426 00:27:02,840 --> 00:27:07,080 Speaker 1: react properly for every kind of driving situation. You know, 427 00:27:07,080 --> 00:27:10,320 Speaker 1: it's one thing to program for, say, collision detection or, 428 00:27:10,800 --> 00:27:13,359 Speaker 1: you know, lane assist and that kind of stuff, but 429 00:27:13,440 --> 00:27:16,720 Speaker 1: it's another thing entirely to deal with all the possible 430 00:27:16,760 --> 00:27:19,200 Speaker 1: scenarios that can pop up when you're on the road.
431 00:27:20,080 --> 00:27:22,520 Speaker 1: If you are a driver, chances are there has been 432 00:27:22,520 --> 00:27:25,840 Speaker 1: a situation at one point or another that you've never 433 00:27:25,960 --> 00:27:30,000 Speaker 1: encountered before. Humans can typically react to those things in 434 00:27:30,040 --> 00:27:34,320 Speaker 1: an instinctive way that, you know, can be successful, but 435 00:27:34,400 --> 00:27:39,119 Speaker 1: a car that is following more, you know, strict programming 436 00:27:39,480 --> 00:27:42,199 Speaker 1: may not be able to, and that's a problem. So 437 00:27:42,320 --> 00:27:45,600 Speaker 1: machine learning could allow for a fleet of cars to 438 00:27:45,720 --> 00:27:50,800 Speaker 1: share their collective experiences with one another. So let's just imagine, 439 00:27:50,880 --> 00:27:53,000 Speaker 1: you know, this is a hypothetical. Imagine you've got a 440 00:27:53,080 --> 00:27:57,080 Speaker 1: thousand autonomous cars on the road, and most of those 441 00:27:57,119 --> 00:28:01,359 Speaker 1: cars are going to have relatively uneventful drives where nothing 442 00:28:01,400 --> 00:28:06,000 Speaker 1: particularly unexpected happens, so they're not learning anything. There 443 00:28:06,080 --> 00:28:08,600 Speaker 1: might be a few that have something that's a little 444 00:28:08,600 --> 00:28:11,199 Speaker 1: out of the ordinary, so they have the potential to 445 00:28:11,400 --> 00:28:14,120 Speaker 1: learn from those experiences, and you'll have a very 446 00:28:14,160 --> 00:28:18,880 Speaker 1: few outliers that will have truly unusual experiences. Those 447 00:28:18,920 --> 00:28:22,560 Speaker 1: experiences and the car's reaction to those experiences can then 448 00:28:22,600 --> 00:28:26,560 Speaker 1: build into the overall fleet's knowledge base.
So Car number 449 00:28:26,560 --> 00:28:29,480 Speaker 1: one can learn from the mistakes of Car one thousand, 450 00:28:30,520 --> 00:28:32,760 Speaker 1: and that means cars wouldn't be learning just from their 451 00:28:32,800 --> 00:28:36,520 Speaker 1: own mistakes, but from the mistakes and successes of all 452 00:28:36,560 --> 00:28:38,760 Speaker 1: the other cars in the fleet. Now, this gets a 453 00:28:38,800 --> 00:28:43,080 Speaker 1: little scary to think about because we're talking about large, heavy, 454 00:28:43,080 --> 00:28:46,720 Speaker 1: fast-moving vehicles here, and the word mistake is not 455 00:28:46,920 --> 00:28:50,080 Speaker 1: much fun when you're thinking about it in those terms, right? Like, 456 00:28:50,120 --> 00:28:53,880 Speaker 1: a mistake can be, you know, life-altering, it could 457 00:28:53,880 --> 00:28:57,320 Speaker 1: be fatal. But on the other hand, this represents a 458 00:28:57,360 --> 00:28:59,760 Speaker 1: way to build out how cars will react in different 459 00:28:59,760 --> 00:29:03,840 Speaker 1: situations that can evolve far faster than if you were 460 00:29:03,840 --> 00:29:07,000 Speaker 1: just programming each scenario independently, which would take you forever 461 00:29:07,040 --> 00:29:10,320 Speaker 1: because you would never be able to account for every 462 00:29:10,320 --> 00:29:15,440 Speaker 1: single possibility on the road. That's just not feasible.
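The fleet-sharing idea above can be sketched as a pooled knowledge base that every car reads from. This is a toy model under stated assumptions; the class, the scenario labels, and the "pick the most-successful action" rule are all hypothetical, not anything from Apple's patent:

```python
from collections import defaultdict


class Fleet:
    """Toy fleet knowledge base: each car reports (scenario, action,
    success) tuples, and any car can query the pooled experience.
    Purely illustrative; real systems would share learned model
    parameters, not raw event logs like this.
    """

    def __init__(self):
        self.knowledge = defaultdict(list)  # scenario -> [(action, success)]

    def report(self, scenario, action, success):
        self.knowledge[scenario].append((action, success))

    def best_action(self, scenario):
        outcomes = self.knowledge.get(scenario)
        if not outcomes:
            return None  # unseen scenario: no fleet experience yet
        # prefer the action with the most recorded successes
        actions = {a for a, _ in outcomes}
        return max(actions, key=lambda a: sum(s for act, s in outcomes if act == a))


fleet = Fleet()
fleet.report("debris on road", "brake", 1)   # Car 1000's rare experience
fleet.report("debris on road", "swerve", 0)
print(fleet.best_action("debris on road"))   # Car 1 benefits: brake
```

The payoff is exactly the point made above: Car 1 never saw the debris, but it still "knows" what worked for Car 1000.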
Now, 463 00:29:15,440 --> 00:29:17,760 Speaker 1: I should add that Apple is not the only company 464 00:29:17,840 --> 00:29:20,400 Speaker 1: that has looked into this kind of approach where you're 465 00:29:20,520 --> 00:29:24,320 Speaker 1: using machine learning in a fleet of vehicles to kind 466 00:29:24,360 --> 00:29:30,560 Speaker 1: of improve individual car responses, but the patent does give 467 00:29:30,600 --> 00:29:33,200 Speaker 1: us a rare glimpse into the ultra-secret project of 468 00:29:33,240 --> 00:29:38,200 Speaker 1: an Apple Car. Finally, some Australian researchers have bad news 469 00:29:38,240 --> 00:29:42,840 Speaker 1: for alien lovers out there. The researchers used a powerful 470 00:29:43,160 --> 00:29:46,360 Speaker 1: radio telescope array to focus on the galactic center of 471 00:29:46,400 --> 00:29:49,480 Speaker 1: the Milky Way and they listened out for any signals 472 00:29:49,480 --> 00:29:52,600 Speaker 1: that could indicate alien activity. In other words, they were 473 00:29:52,600 --> 00:29:55,600 Speaker 1: looking for stuff that doesn't fit into the natural radio 474 00:29:55,640 --> 00:30:00,000 Speaker 1: signals you would expect to find from the galactic center. 475 00:30:00,040 --> 00:30:03,240 Speaker 1: They listened for around seven hours, and the report is 476 00:30:03,280 --> 00:30:06,960 Speaker 1: that all is quiet on the galactic front. Their search 477 00:30:07,120 --> 00:30:09,080 Speaker 1: was within a region known to have at least a 478 00:30:09,160 --> 00:30:13,160 Speaker 1: hundred forty-four exoplanets in it. An exoplanet 479 00:30:13,280 --> 00:30:17,040 Speaker 1: is a planet that exists outside of our own Solar system. 480 00:30:17,080 --> 00:30:21,600 Speaker 1: Now note, an exoplanet is not necessarily a planet that 481 00:30:21,720 --> 00:30:25,680 Speaker 1: exists within the so-called Goldilocks zone.
Those are 482 00:30:25,720 --> 00:30:29,400 Speaker 1: planets that are known to be at a distance that's 483 00:30:29,480 --> 00:30:33,240 Speaker 1: not too far nor too close from the host star 484 00:30:33,560 --> 00:30:38,880 Speaker 1: of that planet's system, and so could potentially support life, 485 00:30:38,920 --> 00:30:41,360 Speaker 1: at least as we know it here on Earth. I 486 00:30:41,400 --> 00:30:43,040 Speaker 1: mean, a lot of other factors would have to be 487 00:30:43,080 --> 00:30:46,000 Speaker 1: present too for that to be true. But one of 488 00:30:46,000 --> 00:30:47,600 Speaker 1: them is that, well, the planet has to be the 489 00:30:47,680 --> 00:30:49,600 Speaker 1: right distance from the star, else it's going to be, 490 00:30:50,200 --> 00:30:52,960 Speaker 1: you know, too hot and too irradiated to support life 491 00:30:53,000 --> 00:30:55,640 Speaker 1: as we know it, or too cold and too dark 492 00:30:55,720 --> 00:30:57,320 Speaker 1: to support life as we know it. It has to be 493 00:30:57,440 --> 00:31:01,560 Speaker 1: just right. But anyway, these researchers have looked around in 494 00:31:01,640 --> 00:31:04,680 Speaker 1: different sectors of the galaxy over the last decade in 495 00:31:04,800 --> 00:31:08,960 Speaker 1: occasional glimpses, since time on radio telescopes is really precious, 496 00:31:09,440 --> 00:31:12,960 Speaker 1: so it's not like these radio telescopes are just scanning 497 00:31:13,400 --> 00:31:15,760 Speaker 1: the galaxy for signs of life. They're doing all sorts 498 00:31:15,800 --> 00:31:18,680 Speaker 1: of important scientific work. So it's only been here and 499 00:31:18,720 --> 00:31:20,960 Speaker 1: there that the researchers have been able to make use 500 00:31:21,000 --> 00:31:23,520 Speaker 1: of radio telescopes for this. But so far they have 501 00:31:23,600 --> 00:31:27,400 Speaker 1: come up with bupkis.
Now does that mean we're 502 00:31:27,480 --> 00:31:31,560 Speaker 1: all alone out here in the Milky Way? Well, not necessarily. 503 00:31:31,640 --> 00:31:34,520 Speaker 1: The astronomers have to listen in specific ranges of 504 00:31:34,600 --> 00:31:38,960 Speaker 1: radio frequencies, which means, you know, potentially you could have 505 00:31:39,000 --> 00:31:43,520 Speaker 1: communications in different radio frequency ranges and we wouldn't pick 506 00:31:43,560 --> 00:31:45,560 Speaker 1: it up because we weren't tuned in. It's like being 507 00:31:45,560 --> 00:31:48,040 Speaker 1: tuned to the wrong radio station. Like, you're not going 508 00:31:48,080 --> 00:31:49,600 Speaker 1: to hear the song you want because you're on the 509 00:31:49,600 --> 00:31:52,560 Speaker 1: wrong station. Kind of like that. But you know, we're 510 00:31:52,560 --> 00:31:56,200 Speaker 1: talking about massive ranges of radio frequencies, and then there 511 00:31:56,200 --> 00:31:59,400 Speaker 1: are tons of other variables to consider, a lot of 512 00:31:59,400 --> 00:32:02,760 Speaker 1: them popularized by the Drake equation as proposed 513 00:32:02,800 --> 00:32:07,720 Speaker 1: by Dr. Frank Drake.
That equation frames the variables that 514 00:32:07,840 --> 00:32:09,920 Speaker 1: have to line up in order for there to actually 515 00:32:09,960 --> 00:32:13,680 Speaker 1: be a radio-communicative species out there apart from our own, 516 00:32:13,680 --> 00:32:16,920 Speaker 1: I mean, and it includes stuff like you have to 517 00:32:17,120 --> 00:32:20,560 Speaker 1: figure out the rate at which stars form within our galaxy, 518 00:32:21,240 --> 00:32:23,920 Speaker 1: the fraction of the stars that are out there that 519 00:32:24,040 --> 00:32:29,120 Speaker 1: have planets orbiting them, then the average number of orbiting 520 00:32:29,200 --> 00:32:33,640 Speaker 1: planets per star that could potentially support life. Like, does 521 00:32:33,720 --> 00:32:37,600 Speaker 1: the average star that has planets have one, two, 522 00:32:38,520 --> 00:32:42,440 Speaker 1: point five planets that can support life, depending upon, you know, 523 00:32:42,560 --> 00:32:45,080 Speaker 1: the number of stars that have orbiting planets and such? 524 00:32:45,560 --> 00:32:47,440 Speaker 1: Then you have to figure out the fraction of those 525 00:32:47,440 --> 00:32:50,080 Speaker 1: planets that actually develop life on them, not just 526 00:32:50,560 --> 00:32:54,120 Speaker 1: capable of supporting life, but where life actually evolved. Then the 527 00:32:54,120 --> 00:32:57,479 Speaker 1: fraction of those planets where the life evolved into at 528 00:32:57,520 --> 00:33:02,280 Speaker 1: least one intelligent species, the fraction of those planets where 529 00:33:02,320 --> 00:33:05,760 Speaker 1: the intelligent life then develops some form of communication that 530 00:33:05,840 --> 00:33:09,000 Speaker 1: we would be able to detect, and then the length 531 00:33:09,000 --> 00:33:12,000 Speaker 1: of time for which such civilizations exist before they're no 532 00:33:12,040 --> 00:33:15,959 Speaker 1: longer able to communicate.
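Those factors are the terms of the Drake equation, N = R* × fp × ne × fl × fi × fc × L. A quick sketch of how they multiply through, where every parameter value is an illustrative guess rather than a measurement:

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: estimated number of currently detectable,
    communicating civilizations in the galaxy.
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime


n = drake(
    r_star=1.5,       # new stars formed per year in the Milky Way (guess)
    f_p=0.9,          # fraction of stars with planets (guess)
    n_e=0.5,          # potentially habitable planets per such star (guess)
    f_l=0.1,          # fraction of those that develop life (guess)
    f_i=0.01,         # fraction of life-bearing planets with intelligence (guess)
    f_c=0.1,          # fraction that develop detectable communication (guess)
    lifetime=10_000,  # years a civilization stays detectable (guess)
)
print(n)  # fewer than one detectable civilization under these guesses
```

The structure is the useful part: swap in more optimistic guesses for any factor and the estimate swings by orders of magnitude, which is exactly why the answer changes dramatically as we learn new information about the galaxy.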
So, in other words, like, you 533 00:33:16,000 --> 00:33:19,120 Speaker 1: could have an intelligent species evolve to the point of 534 00:33:19,280 --> 00:33:22,920 Speaker 1: being able to communicate with radio waves, but then maybe 535 00:33:23,600 --> 00:33:26,720 Speaker 1: after a certain amount of time, that species wipes itself out. 536 00:33:27,120 --> 00:33:30,120 Speaker 1: It would be easy to imagine, considering the amount of 537 00:33:30,560 --> 00:33:34,160 Speaker 1: conflict that we see here on Earth, that it's possible 538 00:33:34,160 --> 00:33:37,000 Speaker 1: that that is something that is not unique to our planet. 539 00:33:37,280 --> 00:33:39,800 Speaker 1: You know, a lot of our science fiction deals 540 00:33:39,840 --> 00:33:43,320 Speaker 1: with alien races that don't have that kind of internal 541 00:33:43,360 --> 00:33:46,480 Speaker 1: conflict within their own home worlds. We don't know if 542 00:33:46,520 --> 00:33:48,200 Speaker 1: that's a thing. Well, we don't even know if 543 00:33:48,280 --> 00:33:53,360 Speaker 1: alien races exist, let alone, you know, how harmonious they 544 00:33:53,360 --> 00:33:57,520 Speaker 1: are within their own species. So we don't really know 545 00:33:57,600 --> 00:33:59,760 Speaker 1: the values of all those variables I just mentioned, by 546 00:33:59,760 --> 00:34:03,240 Speaker 1: the way. The best we can do is make various guesses, 547 00:34:03,280 --> 00:34:06,320 Speaker 1: and those guesses change dramatically as we learn new information 548 00:34:06,360 --> 00:34:10,080 Speaker 1: about our galaxy. What I think those variables do that 549 00:34:10,200 --> 00:34:12,160 Speaker 1: is useful is give us a way to kind 550 00:34:12,200 --> 00:34:15,480 Speaker 1: of conceptualize what we're up against when it comes to 551 00:34:15,480 --> 00:34:19,040 Speaker 1: figuring out if there is anyone to phone home to 552 00:34:19,400 --> 00:34:23,800 Speaker 1: out there.
So far, sadly, it seems like the aliens 553 00:34:23,840 --> 00:34:27,359 Speaker 1: are maintaining radio silence. Maybe all of our calls are 554 00:34:27,360 --> 00:34:31,319 Speaker 1: going to voicemail. But you know, again, this 555 00:34:31,440 --> 00:34:34,839 Speaker 1: doesn't mean that there is no intelligent alien life within 556 00:34:34,840 --> 00:34:38,120 Speaker 1: our galaxy, just that we have not found a way 557 00:34:38,120 --> 00:34:42,040 Speaker 1: of detecting it if it does exist. Personally, I 558 00:34:42,080 --> 00:34:45,719 Speaker 1: think it's entirely possible there could be. I'm sure that 559 00:34:45,760 --> 00:34:48,520 Speaker 1: there has to be life somewhere else. I don't think 560 00:34:48,520 --> 00:34:52,720 Speaker 1: we're that special. I think that, statistically speaking, 561 00:34:53,400 --> 00:34:57,640 Speaker 1: it's almost certain that there is life on other planets 562 00:34:58,480 --> 00:35:01,399 Speaker 1: within our universe, certainly, if not within our galaxy. I mean, 563 00:35:01,480 --> 00:35:04,080 Speaker 1: just the odds seem to suggest that that has to 564 00:35:04,080 --> 00:35:06,719 Speaker 1: be the case. That some of that life may have 565 00:35:06,800 --> 00:35:10,360 Speaker 1: evolved into intelligence I think is also pretty likely.
But 566 00:35:10,400 --> 00:35:14,879 Speaker 1: we're also talking about such vast distances here that our 567 00:35:14,920 --> 00:35:17,840 Speaker 1: ability to pick up on it, and the time frame 568 00:35:17,960 --> 00:35:19,840 Speaker 1: in which we could pick up on it, are limited. Because, 569 00:35:19,880 --> 00:35:23,960 Speaker 1: remember, the further out something is, the more into the 570 00:35:24,120 --> 00:35:27,319 Speaker 1: past we are looking when we observe it, because light 571 00:35:27,400 --> 00:35:31,440 Speaker 1: takes years to travel to us, and radio communication 572 00:35:31,520 --> 00:35:35,319 Speaker 1: is traveling at, effectively, you know, the speed of light. 573 00:35:35,480 --> 00:35:37,200 Speaker 1: That means that we're looking back in the past, the 574 00:35:37,200 --> 00:35:39,840 Speaker 1: further out we look, and so it could mean that 575 00:35:39,880 --> 00:35:43,280 Speaker 1: we're looking at a time before the intelligent life started 576 00:35:43,280 --> 00:35:45,839 Speaker 1: to communicate via radio waves. Maybe it's doing it now, 577 00:35:46,080 --> 00:35:49,239 Speaker 1: but we won't know that for thousands of years. So 578 00:35:49,320 --> 00:35:51,480 Speaker 1: there's all these different variables to take into account. So 579 00:35:51,600 --> 00:35:55,000 Speaker 1: do not lose hope, alien lovers. Just know that we 580 00:35:55,120 --> 00:36:01,560 Speaker 1: haven't found the smoking flying saucer yet. That's it for 581 00:36:01,640 --> 00:36:05,320 Speaker 1: this episode of tech Stuff, the news for Tuesday, February 582 00:36:05,440 --> 00:36:09,640 Speaker 1: eighth, twenty twenty-two. I hope you are all well. If you 583 00:36:09,680 --> 00:36:12,320 Speaker 1: have suggestions for topics I should cover in future episodes 584 00:36:12,320 --> 00:36:15,160 Speaker 1: of tech Stuff, please reach out to me. The best 585 00:36:15,160 --> 00:36:17,440 Speaker 1: way is on Twitter.
The handle for the show is 586 00:36:17,560 --> 00:36:21,160 Speaker 1: tech Stuff H S W, and I'll talk to you again 587 00:36:22,000 --> 00:36:31,440 Speaker 1: really soon. Tech Stuff is an I Heart Radio production. For 588 00:36:31,560 --> 00:36:34,520 Speaker 1: more podcasts from I Heart Radio, visit the I Heart 589 00:36:34,560 --> 00:36:37,759 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 590 00:36:37,800 --> 00:36:38,480 Speaker 1: favorite shows.