1 00:00:04,440 --> 00:00:12,479 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and 2 00:00:12,640 --> 00:00:16,120 Speaker 1: welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm 3 00:00:16,120 --> 00:00:19,360 Speaker 1: an executive producer with iHeart Podcasts, and how the tech 4 00:00:19,400 --> 00:00:22,239 Speaker 1: are you? It's time for the tech news for Thursday, 5 00:00:22,360 --> 00:00:28,040 Speaker 1: November sixteenth, twenty twenty three. First up, over in the EU, 6 00:00:28,640 --> 00:00:34,199 Speaker 1: TikTok is protesting its designation as a gatekeeper. So this 7 00:00:34,360 --> 00:00:39,760 Speaker 1: relates to the EU's Digital Markets Act. That act establishes 8 00:00:39,880 --> 00:00:44,200 Speaker 1: this designation of gatekeeper for companies that meet several criteria, 9 00:00:44,840 --> 00:00:47,640 Speaker 1: and if they do meet those criteria and the EU 10 00:00:47,720 --> 00:00:52,120 Speaker 1: determines they're gatekeepers, it means these companies have obligations to 11 00:00:52,200 --> 00:00:56,320 Speaker 1: comply with a specific set of rules and restrictions that 12 00:00:56,400 --> 00:00:59,520 Speaker 1: the EU created for these very powerful companies. So first up, 13 00:01:00,120 --> 00:01:04,360 Speaker 1: what is a gatekeeper? Well, it's any company that serves 14 00:01:04,520 --> 00:01:10,080 Speaker 1: as a gateway between consumers and businesses through core platform services. 15 00:01:10,440 --> 00:01:13,280 Speaker 1: TikTok, in this case, would be a core platform service. 16 00:01:13,720 --> 00:01:17,160 Speaker 1: But okay, beyond that, the companies that merit the designation 17 00:01:17,280 --> 00:01:21,040 Speaker 1: gatekeeper have to be really important players that hold significant 18 00:01:21,080 --> 00:01:26,000 Speaker 1: power in their respective markets. So think about companies that 19 00:01:26,080 --> 00:01:30,200 Speaker 1: have effectively a stranglehold on some specific element of the 20 00:01:30,240 --> 00:01:34,880 Speaker 1: digital world. Google web search would be a clear contender, right? 21 00:01:35,319 --> 00:01:39,920 Speaker 1: Facebook, or Meta really, would have a lock on social networks. 22 00:01:40,560 --> 00:01:43,600 Speaker 1: So, to continue: to be considered a gatekeeper, 23 00:01:43,959 --> 00:01:46,600 Speaker 1: there are some actual financial thresholds that are important too. 24 00:01:46,920 --> 00:01:49,000 Speaker 1: A company has to have a turnover of at least 25 00:01:49,040 --> 00:01:53,120 Speaker 1: eight billion dollars for the last three financial years, or 26 00:01:53,720 --> 00:01:56,840 Speaker 1: have an average market capitalization of at least seventy nine 27 00:01:56,880 --> 00:02:00,840 Speaker 1: point five billion dollars in the last financial year. The 28 00:02:00,880 --> 00:02:04,560 Speaker 1: reason for odd numbers like seventy nine point five is 29 00:02:04,600 --> 00:02:07,240 Speaker 1: because, obviously, in the EU this is all in euros, 30 00:02:07,480 --> 00:02:11,840 Speaker 1: not in dollars. The company has to also provide services 31 00:02:11,880 --> 00:02:15,400 Speaker 1: to at least three member states within the EU, so 32 00:02:15,440 --> 00:02:18,360 Speaker 1: if it's only operating in one, it cannot be considered 33 00:02:18,400 --> 00:02:22,040 Speaker 1: a gatekeeper. If it is not making those kinds of 34 00:02:22,080 --> 00:02:25,639 Speaker 1: financial returns, then it can't be a gatekeeper.
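To make those criteria concrete, here is a minimal sketch in Python of how the quantitative test combines: either financial bar clears the first hurdle, but the member-state requirement must hold as well. This is an illustration, not official EU tooling, and the euro thresholds (7.5 billion and 75 billion) are assumed from the DMA figures behind the episode's dollar conversions.

```python
# Minimal sketch of the DMA's quantitative gatekeeper test as described
# in the episode -- not official EU tooling. Euro thresholds are assumed
# from the DMA figures behind the dollar conversions quoted above.
from dataclasses import dataclass

TURNOVER_THRESHOLD_EUR = 7.5e9    # ~ $8 billion, each of the last three financial years
MARKET_CAP_THRESHOLD_EUR = 75e9   # ~ $79.5 billion, average over the last financial year
MIN_MEMBER_STATES = 3             # must provide services in at least three EU member states

@dataclass
class Company:
    turnover_last_three_years_eur: list[float]  # most recent three financial years
    avg_market_cap_eur: float
    member_states_served: int

def meets_quantitative_test(c: Company) -> bool:
    # Either financial bar clears the first hurdle...
    financial = (
        all(t >= TURNOVER_THRESHOLD_EUR for t in c.turnover_last_three_years_eur)
        or c.avg_market_cap_eur >= MARKET_CAP_THRESHOLD_EUR
    )
    # ...but the company must also operate in enough member states.
    return financial and c.member_states_served >= MIN_MEMBER_STATES
```

So a company operating in only one member state fails the test no matter how large its revenue, which matches the example above.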
It also 35 00:02:25,720 --> 00:02:29,680 Speaker 1: must be in an entrenched and durable position. Now, that 36 00:02:29,800 --> 00:02:33,600 Speaker 1: means that these companies are really well established. They're not 37 00:02:33,720 --> 00:02:36,600 Speaker 1: likely to get displaced by a competitor anytime in the 38 00:02:36,600 --> 00:02:40,760 Speaker 1: foreseeable future. Now, to be clear, the EU is not 39 00:02:40,800 --> 00:02:45,480 Speaker 1: actually saying that TikTok is the gatekeeper here. TikTok is one 40 00:02:45,520 --> 00:02:50,400 Speaker 1: of those core platform services, not the gatekeeper itself. Instead, its 41 00:02:50,480 --> 00:02:54,680 Speaker 1: parent company, ByteDance, is what the EU has identified 42 00:02:54,800 --> 00:02:59,640 Speaker 1: as a gatekeeper. So TikTok is saying, au contraire, mon frère, 43 00:03:00,280 --> 00:03:03,639 Speaker 1: we're largely independent of ByteDance. We don't have that much 44 00:03:03,680 --> 00:03:08,080 Speaker 1: to do with them. We operate very much independently of ByteDance, 45 00:03:08,560 --> 00:03:11,120 Speaker 1: and when you take that into consideration, we don't qualify 46 00:03:11,120 --> 00:03:14,600 Speaker 1: as a gatekeeper. So that means we shouldn't have to 47 00:03:15,040 --> 00:03:19,280 Speaker 1: comply with the rules that gatekeepers have to follow. See, 48 00:03:19,280 --> 00:03:22,320 Speaker 1: these rules include lots of stuff. One example is that 49 00:03:22,880 --> 00:03:26,120 Speaker 1: a gatekeeper has to be able to give users access 50 00:03:26,600 --> 00:03:30,440 Speaker 1: to the data that they are generating through using the platform. 51 00:03:31,160 --> 00:03:33,240 Speaker 1: So if a user says, hey, I want to have 52 00:03:33,280 --> 00:03:35,600 Speaker 1: access to all the data you have on me based 53 00:03:35,640 --> 00:03:40,000 Speaker 1: upon how I'm using your product, they have to do that. 54 00:03:40,000 --> 00:03:43,480 Speaker 1: That's just one example. So clearly, any company that has 55 00:03:43,600 --> 00:03:45,520 Speaker 1: gatekeeper status is going to have to put in a 56 00:03:45,520 --> 00:03:48,880 Speaker 1: lot of work to comply with these rules, and furthermore, 57 00:03:49,840 --> 00:03:52,200 Speaker 1: complying with those rules will also likely have a big 58 00:03:52,200 --> 00:03:54,800 Speaker 1: impact on how these companies can do business in the 59 00:03:54,880 --> 00:03:58,240 Speaker 1: EU in the first place. Right? Those rules might end 60 00:03:58,360 --> 00:04:01,280 Speaker 1: up saying that things that are normal business practices for 61 00:04:01,320 --> 00:04:05,160 Speaker 1: these companies are no longer viable, and suddenly you have 62 00:04:05,240 --> 00:04:09,200 Speaker 1: this very lucrative form of revenue generation cut off from you. 63 00:04:09,320 --> 00:04:12,640 Speaker 1: So it's understandable that TikTok would want to find a 64 00:04:12,960 --> 00:04:19,400 Speaker 1: get-out-of-the-Digital-Markets-Act-free card. Interestingly, Meta 65 00:04:19,440 --> 00:04:24,040 Speaker 1: has also challenged its status as a gatekeeper. Considering how 66 00:04:24,120 --> 00:04:29,520 Speaker 1: frequently Zuckerberg has bragged about, you know, having billions of users, 67 00:04:30,279 --> 00:04:34,919 Speaker 1: it seems weird and difficult to argue that Meta somehow 68 00:04:35,120 --> 00:04:41,200 Speaker 1: doesn't qualify for gatekeeper status, but they're trying anyway.
I'm 69 00:04:41,200 --> 00:04:43,960 Speaker 1: not sure they're going to find much success with that route. 70 00:04:44,360 --> 00:04:50,520 Speaker 1: But the other companies that have been designated gatekeepers include Amazon, Google, Microsoft, 71 00:04:50,640 --> 00:04:53,800 Speaker 1: and Apple. So the deadline for these companies to submit 72 00:04:53,839 --> 00:04:56,640 Speaker 1: a challenge to the classification is, let me see, let 73 00:04:56,640 --> 00:05:01,240 Speaker 1: me look here... today. Actually, today's the last day, so we'll 74 00:05:01,279 --> 00:05:05,080 Speaker 1: see if the EU reconsiders. I suspect that's not going 75 00:05:05,160 --> 00:05:07,599 Speaker 1: to be the case, but we'll see. Meanwhile, here in 76 00:05:07,600 --> 00:05:12,000 Speaker 1: the United States, Judge Yvonne Gonzalez Rogers ruled that Section 77 00:05:12,120 --> 00:05:15,839 Speaker 1: two thirty protections cannot be used as a blanket force 78 00:05:15,920 --> 00:05:20,200 Speaker 1: field defense. So, as a reminder, Section two thirty states 79 00:05:20,200 --> 00:05:23,440 Speaker 1: that an Internet platform cannot be treated as a publisher 80 00:05:23,760 --> 00:05:27,279 Speaker 1: and then held responsible for the content that users are 81 00:05:27,320 --> 00:05:32,240 Speaker 1: posting to that platform. So, for example, if someone posts 82 00:05:32,320 --> 00:05:36,400 Speaker 1: a video on YouTube that is slandering an innocent person, 83 00:05:36,880 --> 00:05:41,400 Speaker 1: YouTube is not responsible for that video. They're not liable 84 00:05:41,400 --> 00:05:44,520 Speaker 1: for the slander itself. Section two thirty would protect 85 00:05:44,520 --> 00:05:48,240 Speaker 1: them from that. There are some limitations to Section two thirty, 86 00:05:48,279 --> 00:05:51,440 Speaker 1: so things can get a little bit fuzzy. Like, you know, 87 00:05:51,520 --> 00:05:55,080 Speaker 1: if a platform shows that it has not tried to 88 00:05:55,240 --> 00:06:01,120 Speaker 1: reasonably respond to issues arising from content created by users, then 89 00:06:01,160 --> 00:06:03,400 Speaker 1: it can lose some of the elements of Section two 90 00:06:03,440 --> 00:06:07,440 Speaker 1: thirty protection. But this particular ruling takes a totally different 91 00:06:07,480 --> 00:06:11,400 Speaker 1: perspective on the problem. So what's at the heart of the 92 00:06:11,440 --> 00:06:17,240 Speaker 1: decision isn't the content that's posted to a platform. Instead, 93 00:06:17,680 --> 00:06:22,760 Speaker 1: it's actually directed at the platform's design and operation. So 94 00:06:23,480 --> 00:06:28,279 Speaker 1: the argument is, if the platform's design is faulty and 95 00:06:28,480 --> 00:06:34,960 Speaker 1: allows for the posting and the proliferation of illegal content 96 00:06:35,040 --> 00:06:39,120 Speaker 1: on the platform, that is not covered by Section two thirty. 97 00:06:40,080 --> 00:06:43,680 Speaker 1: So it's not that the platform is responsible for publishing 98 00:06:44,160 --> 00:06:50,599 Speaker 1: the material, but that its measures to prevent abuse are 99 00:06:50,880 --> 00:06:54,839 Speaker 1: not sufficient, and therefore that's a design flaw, and that 100 00:06:54,960 --> 00:06:57,560 Speaker 1: is not covered or protected by Section two thirty. So 101 00:06:57,640 --> 00:07:00,320 Speaker 1: let me give an example.
Let's say a platform creates 102 00:07:00,360 --> 00:07:04,400 Speaker 1: an AI-powered tool that is meant to detect instances 103 00:07:04,880 --> 00:07:08,880 Speaker 1: of illegal material posted to the platform, but this tool 104 00:07:08,920 --> 00:07:13,560 Speaker 1: fails to work reliably and some stuff gets past the tool. Well, 105 00:07:13,640 --> 00:07:17,119 Speaker 1: Judge Rogers's ruling says that the platform cannot hide behind 106 00:07:17,120 --> 00:07:20,280 Speaker 1: Section two thirty to protect itself, because the root of 107 00:07:20,320 --> 00:07:24,920 Speaker 1: the complaint is that the tool meant to protect against 108 00:07:25,240 --> 00:07:30,720 Speaker 1: the illegal material doesn't work. It's not about the publication 109 00:07:30,760 --> 00:07:33,560 Speaker 1: of the material. It's about this tool that's meant to 110 00:07:33,880 --> 00:07:37,840 Speaker 1: prevent that stuff. So it's all about contextualization. The ruling 111 00:07:37,880 --> 00:07:40,960 Speaker 1: means that this huge lawsuit that thirty states in the 112 00:07:41,040 --> 00:07:45,040 Speaker 1: US have brought against social networking companies can actually continue 113 00:07:45,240 --> 00:07:48,720 Speaker 1: to the discovery phase. So it doesn't mean that this 114 00:07:48,760 --> 00:07:52,160 Speaker 1: particular lawsuit is going to turn out one way versus another. 115 00:07:52,560 --> 00:07:55,160 Speaker 1: That's still to be determined in the courts. It just 116 00:07:55,240 --> 00:07:59,800 Speaker 1: means that the lawsuit can proceed to the next stage 117 00:08:00,080 --> 00:08:03,720 Speaker 1: in the court system. And it also means the 118 00:08:03,760 --> 00:08:08,480 Speaker 1: protections offered by Section two thirty have boundaries. They're 119 00:08:08,480 --> 00:08:11,480 Speaker 1: not endless. It is an important precedent, and it's one 120 00:08:11,480 --> 00:08:14,040 Speaker 1: that's sure to concern a lot of big tech companies 121 00:08:14,520 --> 00:08:18,480 Speaker 1: out in the space. Meanwhile, our march toward being engulfed 122 00:08:18,520 --> 00:08:22,520 Speaker 1: in AI-generated content continues. Google is experimenting with several 123 00:08:22,520 --> 00:08:25,920 Speaker 1: AI-powered features over on YouTube. One of them will 124 00:08:25,920 --> 00:08:28,560 Speaker 1: take a tune that you hum into a microphone and 125 00:08:28,600 --> 00:08:31,360 Speaker 1: then use that to generate a full music track for you. 126 00:08:32,000 --> 00:08:35,319 Speaker 1: Another tool has been rolled out to a small test 127 00:08:35,360 --> 00:08:39,280 Speaker 1: group of creators. It is called Dream Track, and this 128 00:08:39,360 --> 00:08:42,840 Speaker 1: tool lets the creator write a prompt to generate a 129 00:08:42,880 --> 00:08:45,520 Speaker 1: thirty-second piece of music in the style of one 130 00:08:45,679 --> 00:08:50,440 Speaker 1: of nine musical artists. So why only nine? Well, Google 131 00:08:50,840 --> 00:08:53,960 Speaker 1: is trying to do this in a respectful and responsible 132 00:08:54,000 --> 00:08:57,760 Speaker 1: way and is working with artists on this tool. So 133 00:08:58,120 --> 00:09:01,240 Speaker 1: each artist represented in the tool worked with Google and 134 00:09:01,280 --> 00:09:06,480 Speaker 1: gave their permission for this to actually work. So you 135 00:09:06,520 --> 00:09:08,959 Speaker 1: can't just use this tool to copy any artist out there.
136 00:09:09,000 --> 00:09:11,080 Speaker 1: It has to be someone who has given their express 137 00:09:11,120 --> 00:09:14,000 Speaker 1: permission for the tool to be able to do this. 138 00:09:14,360 --> 00:09:15,480 Speaker 1: And what it can do is take a 139 00:09:15,520 --> 00:09:19,400 Speaker 1: pretty generic prompt. You might say, like, make me a 140 00:09:19,440 --> 00:09:26,480 Speaker 1: song about driving fast along a coastal highway, and it 141 00:09:26,520 --> 00:09:28,640 Speaker 1: will take that prompt, and then it will do everything 142 00:09:28,800 --> 00:09:33,319 Speaker 1: from generating the music to actually writing lyrics, to creating 143 00:09:33,880 --> 00:09:37,520 Speaker 1: vocals that sound like the artist you selected, and produce 144 00:09:37,559 --> 00:09:40,079 Speaker 1: the whole thing for you. Now, according to The Verge, 145 00:09:40,320 --> 00:09:42,840 Speaker 1: YouTube plans to roll out this feature specifically as a 146 00:09:42,840 --> 00:09:47,120 Speaker 1: way to augment YouTube Shorts. That's YouTube's take on stuff 147 00:09:47,120 --> 00:09:49,600 Speaker 1: like TikTok videos and Reels and that kind of thing. 148 00:09:50,000 --> 00:09:53,199 Speaker 1: Google's DeepMind provides the horsepower in the background 149 00:09:53,200 --> 00:09:57,440 Speaker 1: through an AI music-generation model called Lyria. And unlike 150 00:09:57,480 --> 00:09:59,920 Speaker 1: with a lot of other generative AI applications, I don't 151 00:10:00,120 --> 00:10:02,920 Speaker 1: feel that icky about this one. I like that Google 152 00:10:02,960 --> 00:10:06,760 Speaker 1: reached out to actual artists first to get their buy-in 153 00:10:06,840 --> 00:10:10,439 Speaker 1: on this. I like that it's limited in its application; 154 00:10:10,520 --> 00:10:14,440 Speaker 1: it's not designed to just write any and all music. And I 155 00:10:14,480 --> 00:10:18,360 Speaker 1: also like that, you know, it's going to give 156 00:10:18,520 --> 00:10:21,840 Speaker 1: creators a chance to do things like make a backing 157 00:10:21,920 --> 00:10:25,600 Speaker 1: track for their video without having to, you know, take 158 00:10:25,679 --> 00:10:28,880 Speaker 1: up music themselves or potentially, you know, try and lift someone 159 00:10:28,880 --> 00:10:31,720 Speaker 1: else's work and hope they don't get a copyright strike. 160 00:10:31,960 --> 00:10:34,000 Speaker 1: I like that as well. So I think this is 161 00:10:34,880 --> 00:10:38,040 Speaker 1: a reasonable use of generative AI, at least so far. 162 00:10:38,720 --> 00:10:41,880 Speaker 1: On a related note, Google has also introduced an audio 163 00:10:42,000 --> 00:10:46,079 Speaker 1: watermark for AI-generated audio tracks. It's called SynthID, 164 00:10:46,760 --> 00:10:50,200 Speaker 1: and while it's an audio watermark, it isn't audible. Or rather, 165 00:10:50,240 --> 00:10:54,560 Speaker 1: it shouldn't be detectable, in Google's words, by human ears. 166 00:10:54,800 --> 00:10:57,360 Speaker 1: So theoretically, we puny humans won't be able to tell 167 00:10:57,400 --> 00:10:59,200 Speaker 1: that it's there. But if you were to plop the 168 00:10:59,240 --> 00:11:02,440 Speaker 1: track into some audio editing software, you should be able 169 00:11:02,440 --> 00:11:07,679 Speaker 1: to find the audio watermark somewhere in the track.
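As a rough illustration of how an inaudible-but-machine-detectable watermark can work in principle, here is a toy spread-spectrum sketch in Python. To be clear, this is an assumed stand-in, not SynthID's actual method, which Google hasn't published in detail: a faint pseudorandom pattern derived from a secret key is added to the signal, and detection is just correlating against that pattern.

```python
import numpy as np

# Toy spread-spectrum watermark -- an assumed illustration of the general
# idea, NOT SynthID's actual (unpublished) method.

def embed(audio: np.ndarray, key: int, strength: float = 0.01) -> np.ndarray:
    """Add a faint pseudorandom pattern derived from a secret key."""
    pattern = np.random.default_rng(key).standard_normal(audio.shape)
    return audio + strength * pattern

def detect(audio: np.ndarray, key: int) -> float:
    """Correlate against the key's pattern; scores near `strength` mean present."""
    pattern = np.random.default_rng(key).standard_normal(audio.shape)
    return float(pattern @ audio) / audio.size

# Demo on ten seconds of quiet synthetic "audio" at 48 kHz:
clean = 0.1 * np.random.default_rng(0).standard_normal(480_000)
print(detect(embed(clean, key=1234), key=1234))  # ~0.01 -> watermark found
print(detect(clean, key=1234))                   # ~0.00 -> no watermark
```

Because the mark is spread thinly across the whole signal, a detector that knows the key can still find it after modest processing, which is roughly why resilience to compression or mixing is plausible.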
That's 170 00:11:07,679 --> 00:11:11,000 Speaker 1: supposed to hold true even if someone runs the AI-generated 171 00:11:11,120 --> 00:11:15,080 Speaker 1: audio through compression, or they change the speed of 172 00:11:15,120 --> 00:11:17,400 Speaker 1: the track, or even if they add in other audio, 173 00:11:17,640 --> 00:11:20,280 Speaker 1: like if they mix it with something else. It is 174 00:11:20,280 --> 00:11:22,840 Speaker 1: not bulletproof. Google reps say that if someone had enough 175 00:11:22,840 --> 00:11:26,480 Speaker 1: determination and they pushed audio manipulation far enough, they could 176 00:11:26,480 --> 00:11:30,040 Speaker 1: obfuscate the watermark. But as we feel our way 177 00:11:30,080 --> 00:11:32,280 Speaker 1: toward how we can best make use of generative AI, 178 00:11:32,400 --> 00:11:35,760 Speaker 1: so that we can enjoy its benefits without also having 179 00:11:35,800 --> 00:11:40,400 Speaker 1: to endure massively negative consequences, technology like this can 180 00:11:40,440 --> 00:11:44,560 Speaker 1: help us get to that destination. All right, we're gonna 181 00:11:44,600 --> 00:11:46,360 Speaker 1: take a quick break. When we come back, I've got 182 00:11:46,400 --> 00:11:58,000 Speaker 1: some more news items to talk about. We're back. So 183 00:11:58,200 --> 00:12:01,000 Speaker 1: imagine that you're in a ransomware gang, and you and 184 00:12:01,040 --> 00:12:03,640 Speaker 1: your fellow hackers have targeted a business, and you get 185 00:12:03,640 --> 00:12:07,240 Speaker 1: into that business's systems, and you steal a whole 186 00:12:07,240 --> 00:12:09,680 Speaker 1: bunch of data, and then you tell the target, hey, 187 00:12:09,760 --> 00:12:12,319 Speaker 1: I've got your data. If you do not pay out 188 00:12:12,320 --> 00:12:14,640 Speaker 1: a ransom, we're going to share this information with the 189 00:12:14,640 --> 00:12:16,719 Speaker 1: rest of the world. And then let's say that you 190 00:12:16,760 --> 00:12:20,440 Speaker 1: are not satisfied with how quickly your target is moving. 191 00:12:20,480 --> 00:12:23,719 Speaker 1: Your target hasn't responded fast enough. So you could go 192 00:12:23,800 --> 00:12:26,040 Speaker 1: ahead and release all the information, but then you're not 193 00:12:26,080 --> 00:12:28,440 Speaker 1: going to get anything from your victim, right? It's just 194 00:12:28,480 --> 00:12:30,120 Speaker 1: going to be like, well, we did it and we 195 00:12:30,200 --> 00:12:33,240 Speaker 1: released the information, but we didn't get any money for our efforts. 196 00:12:33,960 --> 00:12:36,240 Speaker 1: How do you get your target to pay up? Well, 197 00:12:36,760 --> 00:12:39,720 Speaker 1: how about you tell the authorities that they failed to 198 00:12:39,720 --> 00:12:43,160 Speaker 1: disclose a data breach? Because that's what the ALPHV ransomware 199 00:12:43,160 --> 00:12:46,080 Speaker 1: group did to a company called MeridianLink.
So the 200 00:12:46,160 --> 00:12:49,440 Speaker 1: hacker group alerted the US Securities and Exchange Commission, or 201 00:12:49,559 --> 00:12:53,720 Speaker 1: SEC, that they had hit MeridianLink with a ransomware attack, 202 00:12:54,400 --> 00:12:58,480 Speaker 1: and that MeridianLink failed to disclose this attack. And not 203 00:12:58,559 --> 00:13:01,240 Speaker 1: too long ago, the SEC passed a rule that says 204 00:13:01,640 --> 00:13:05,680 Speaker 1: companies have four days to report ransomware attacks that represent 205 00:13:05,760 --> 00:13:10,160 Speaker 1: a significant breach. So this was literally the hackers tattling 206 00:13:10,679 --> 00:13:13,680 Speaker 1: on their victim in an effort to pressure MeridianLink 207 00:13:13,720 --> 00:13:16,760 Speaker 1: to cough up the ransom. Never mind that the hackers 208 00:13:16,840 --> 00:13:18,600 Speaker 1: might have been a bit premature to do this, because 209 00:13:18,640 --> 00:13:21,680 Speaker 1: those rules, while they have been drafted, don't actually go 210 00:13:21,720 --> 00:13:25,800 Speaker 1: into effect until December fifteenth, so there is no legal 211 00:13:25,840 --> 00:13:29,880 Speaker 1: obligation yet for MeridianLink to have revealed this. Plus, 212 00:13:30,360 --> 00:13:33,480 Speaker 1: MeridianLink could argue that the breach doesn't actually amount 213 00:13:33,520 --> 00:13:35,839 Speaker 1: to being significant in the first place, so it 214 00:13:35,880 --> 00:13:39,679 Speaker 1: wouldn't have, you know, triggered the rules even if they 215 00:13:39,720 --> 00:13:42,040 Speaker 1: had been in effect at the time. But I just 216 00:13:42,040 --> 00:13:44,240 Speaker 1: thought it was interesting that the thieves are snitching on 217 00:13:44,280 --> 00:13:48,600 Speaker 1: the victims to the authorities. It's pretty crazy. Tesla and 218 00:13:48,640 --> 00:13:51,800 Speaker 1: worker unions in Sweden are in a pretty big fight 219 00:13:51,880 --> 00:13:55,400 Speaker 1: right now. So in Sweden, the labor market is really 220 00:13:55,440 --> 00:13:57,559 Speaker 1: different from other parts of the world. You know, a 221 00:13:57,640 --> 00:14:01,040 Speaker 1: lot of countries have things like minimum wage requirements and 222 00:14:01,200 --> 00:14:05,480 Speaker 1: working hours requirements, but in Sweden these decisions come down 223 00:14:05,600 --> 00:14:10,240 Speaker 1: to negotiations between worker groups, so essentially unions, and the various employers 224 00:14:10,240 --> 00:14:13,160 Speaker 1: in Sweden. So the two parties come together and they 225 00:14:13,240 --> 00:14:16,520 Speaker 1: negotiate the terms until both sides are satisfied, and then 226 00:14:16,840 --> 00:14:20,880 Speaker 1: those are the rules. But Tesla hasn't been doing that. 227 00:14:20,920 --> 00:14:23,240 Speaker 1: The company has refused to come to the table to 228 00:14:23,280 --> 00:14:28,440 Speaker 1: negotiate with Swedish workers, and so several different groups representing 229 00:14:28,680 --> 00:14:32,120 Speaker 1: different types of jobs have all decided they aren't going 230 00:14:32,160 --> 00:14:35,040 Speaker 1: to work with Tesla anymore. That includes dock 231 00:14:35,120 --> 00:14:39,720 Speaker 1: workers, who will stop unloading Tesla cargo in Swedish ports. 232 00:14:39,840 --> 00:14:42,800 Speaker 1: So Tesla might ship stuff to Sweden, but no one's 233 00:14:42,800 --> 00:14:46,320 Speaker 1: going to be unloading those ships.
Electricians will not work 234 00:14:46,320 --> 00:14:51,040 Speaker 1: on Tesla charging stations in Sweden. Cleaning staff will stop 235 00:14:51,040 --> 00:14:55,480 Speaker 1: showing up to clean showrooms. Several unions have expressed solidarity, 236 00:14:55,680 --> 00:14:58,680 Speaker 1: extending the protests beyond the first circle of employees affected 237 00:14:58,680 --> 00:15:02,560 Speaker 1: by Tesla's refusal to negotiate. The company is facing increasing 238 00:15:02,600 --> 00:15:05,240 Speaker 1: resistance in Sweden. Things are just going to get worse 239 00:15:05,360 --> 00:15:10,360 Speaker 1: unless Tesla makes moves to actually negotiate with unions. Even 240 00:15:10,360 --> 00:15:13,040 Speaker 1: the Swedish post office is going to stop delivering mail 241 00:15:13,080 --> 00:15:17,480 Speaker 1: to Tesla addresses starting next week. Elon Musk has in 242 00:15:17,520 --> 00:15:20,080 Speaker 1: the past shown great disdain for unions and for 243 00:15:20,160 --> 00:15:23,960 Speaker 1: worker organization in general. So I can't say that Tesla's 244 00:15:23,960 --> 00:15:27,080 Speaker 1: failure to engage in this process has surprised me. But 245 00:15:27,160 --> 00:15:30,520 Speaker 1: it will still surprise me if that remains the case, 246 00:15:30,840 --> 00:15:33,120 Speaker 1: because I don't get the impression that Tesla's really in 247 00:15:33,120 --> 00:15:35,920 Speaker 1: a position where they can just write off an entire country. 248 00:15:36,720 --> 00:15:39,960 Speaker 1: In space news, SpaceX has received approval from the Federal 249 00:15:40,000 --> 00:15:43,760 Speaker 1: Aviation Administration here in America to perform a second test 250 00:15:43,760 --> 00:15:47,520 Speaker 1: flight of the Starship Super Heavy lifting vehicle, which, as 251 00:15:47,560 --> 00:15:51,880 Speaker 1: I record this, is currently scheduled for tomorrow, November seventeenth. 252 00:15:52,320 --> 00:15:55,400 Speaker 1: Earlier this year, back in April, SpaceX attempted a test 253 00:15:55,440 --> 00:15:58,480 Speaker 1: launch of this vehicle, and it did not go well. 254 00:15:59,160 --> 00:16:02,200 Speaker 1: About four minutes into the test launch, the Starship burst 255 00:16:02,240 --> 00:16:05,400 Speaker 1: into flames, and then it self-destructed. The failure caused 256 00:16:05,400 --> 00:16:08,000 Speaker 1: a great deal of damage to the surrounding area, including 257 00:16:08,040 --> 00:16:10,720 Speaker 1: one case where debris from the explosion apparently hit a 258 00:16:10,840 --> 00:16:14,120 Speaker 1: vehicle here on the ground. An investigation followed, and the 259 00:16:14,160 --> 00:16:17,720 Speaker 1: FAA created a pretty long list of issues that SpaceX 260 00:16:17,720 --> 00:16:21,280 Speaker 1: would have to address before the FAA would grant permission 261 00:16:21,320 --> 00:16:24,000 Speaker 1: to conduct another test. But apparently all of that is 262 00:16:24,040 --> 00:16:27,480 Speaker 1: now resolved, and if tomorrow's test goes as planned, the 263 00:16:27,640 --> 00:16:31,360 Speaker 1: Starship should lift off, it should fly for around ninety minutes, 264 00:16:31,400 --> 00:16:35,240 Speaker 1: and then eventually it should descend vertically into the Pacific Ocean. 265 00:16:35,520 --> 00:16:37,000 Speaker 1: So we'll have to keep our eyes out to see 266 00:16:37,040 --> 00:16:40,200 Speaker 1: if that in fact happens.
In New York City, the 267 00:16:40,240 --> 00:16:44,400 Speaker 1: company Joby Aviation held a demonstration of its electric vertical 268 00:16:44,440 --> 00:16:49,360 Speaker 1: takeoff and landing aircraft, or eVTOL. It kind of looks like 269 00:16:49,480 --> 00:16:52,320 Speaker 1: an oversized remote-control drone, just big enough for you to 270 00:16:52,560 --> 00:16:54,560 Speaker 1: ride in, except for the fact that it has six 271 00:16:54,600 --> 00:16:57,880 Speaker 1: propellers, whereas most consumer drones have four. This is an 272 00:16:57,920 --> 00:17:01,560 Speaker 1: example of an electric 273 00:17:01,680 --> 00:17:05,040 Speaker 1: flying taxi concept that's been around in New York for 274 00:17:05,080 --> 00:17:09,200 Speaker 1: a while, and primarily the focus has been on transporting 275 00:17:09,240 --> 00:17:13,919 Speaker 1: people from locations like, say, downtown Manhattan to an airport, 276 00:17:14,480 --> 00:17:18,119 Speaker 1: or vice versa, from an airport to, say, downtown Manhattan. 277 00:17:18,680 --> 00:17:20,320 Speaker 1: And this is a trip that, if you were taking it 278 00:17:20,760 --> 00:17:23,560 Speaker 1: by ground transportation, might take you an hour or 279 00:17:23,600 --> 00:17:26,880 Speaker 1: sometimes longer to get there, but by air it could 280 00:17:26,880 --> 00:17:30,080 Speaker 1: take you less than ten minutes. New York's Mayor Eric 281 00:17:30,119 --> 00:17:33,240 Speaker 1: Adams was present for this demonstration, which saw a Joby 282 00:17:33,640 --> 00:17:36,320 Speaker 1: eVTOL vehicle lift off the ground and fly around 283 00:17:36,359 --> 00:17:39,240 Speaker 1: the area. The plan is to have Joby Aviation cleared 284 00:17:39,240 --> 00:17:42,679 Speaker 1: for commercial operation by twenty twenty five, but the company 285 00:17:42,680 --> 00:17:45,160 Speaker 1: still has to meet a few more FAA requirements before 286 00:17:45,160 --> 00:17:47,600 Speaker 1: it can earn a license to operate in New York City. 287 00:17:48,080 --> 00:17:51,080 Speaker 1: Joby Aviation is just one company that's actually competing for 288 00:17:51,119 --> 00:17:53,960 Speaker 1: this potential future. There are others that are also looking 289 00:17:54,000 --> 00:17:57,359 Speaker 1: to take that spot in New York. Now, I'm not 290 00:17:57,400 --> 00:18:00,160 Speaker 1: gonna lie. I do think it's neat, but I'm 291 00:18:00,160 --> 00:18:04,160 Speaker 1: not entirely convinced it actually meets a real need. I mean, sure, 292 00:18:04,200 --> 00:18:07,159 Speaker 1: getting to the airport more easily and quickly is great, 293 00:18:07,920 --> 00:18:10,520 Speaker 1: but the capacity of these aircraft is pretty limited, so 294 00:18:10,640 --> 00:18:14,119 Speaker 1: I don't actually see it making a significant impact on 295 00:18:14,240 --> 00:18:17,479 Speaker 1: larger issues like traffic, even traffic to the airports. I mean, 296 00:18:17,520 --> 00:18:19,480 Speaker 1: it's a very small percentage of people who are going 297 00:18:19,520 --> 00:18:22,080 Speaker 1: to be able to take advantage of this, at least 298 00:18:22,400 --> 00:18:25,440 Speaker 1: unless you've got, like, maybe a huge fleet of these things. 299 00:18:25,440 --> 00:18:27,040 Speaker 1: And then you have the issue of, well, where are 300 00:18:27,080 --> 00:18:30,760 Speaker 1: they taking off and landing? Where are you storing these things? 301 00:18:30,800 --> 00:18:32,879 Speaker 1: Like, you know, there are other issues that come up.
302 00:18:32,920 --> 00:18:34,560 Speaker 1: If you're like, well, we've got a big enough fleet 303 00:18:34,600 --> 00:18:37,320 Speaker 1: to make an impact on traffic, well, then you have 304 00:18:37,359 --> 00:18:40,080 Speaker 1: other problems you have to solve. I still think it's cool. 305 00:18:40,280 --> 00:18:43,879 Speaker 1: I'm just not convinced that it really solves a problem. 306 00:18:44,400 --> 00:18:46,880 Speaker 1: I've got a couple of article recommendations for y'all before 307 00:18:46,920 --> 00:18:50,280 Speaker 1: I sign off today. First up is Will Sattelberg's article 308 00:18:50,400 --> 00:18:54,840 Speaker 1: for Android Police. It is titled "Android isn't cool with teenagers. 309 00:18:55,200 --> 00:18:59,000 Speaker 1: That's a big problem." So Sattelberg points out how, particularly 310 00:18:59,040 --> 00:19:02,440 Speaker 1: here in the United States, younger folks prefer iOS devices 311 00:19:02,480 --> 00:19:05,560 Speaker 1: to Android, and that's really bad news for Google, because, 312 00:19:05,600 --> 00:19:07,840 Speaker 1: you know, young folks who love Apple tend to grow 313 00:19:07,920 --> 00:19:11,680 Speaker 1: up to be adults who love Apple, and then Google 314 00:19:11,920 --> 00:19:16,880 Speaker 1: is left seeing an entire generation moving away from, or never even 315 00:19:16,920 --> 00:19:21,200 Speaker 1: adopting, Android devices. Plus, Sattelberg touches on the whole crazy 316 00:19:21,200 --> 00:19:24,679 Speaker 1: status-symbol thing here in the US with iPhones, particularly 317 00:19:24,720 --> 00:19:28,560 Speaker 1: with iMessage, where if there's, like, a group message 318 00:19:29,480 --> 00:19:34,479 Speaker 1: going on and one person has a green bubble of 319 00:19:34,560 --> 00:19:38,840 Speaker 1: text instead of blue, that person is then ostracized or 320 00:19:38,920 --> 00:19:41,720 Speaker 1: ridiculed because they aren't cool enough to have an iPhone, 321 00:19:42,359 --> 00:19:45,720 Speaker 1: which is super lame, I think. Personally, I would 322 00:19:45,720 --> 00:19:48,119 Speaker 1: just roll my eyes at that kind of stuff. 323 00:19:48,320 --> 00:19:50,760 Speaker 1: But then I also understand, like, that's kind of how 324 00:19:50,840 --> 00:19:53,480 Speaker 1: kids operate. I was always oblivious to that sort of thing. 325 00:19:53,640 --> 00:19:56,600 Speaker 1: I was never aware enough to pick up on the 326 00:19:56,600 --> 00:19:59,159 Speaker 1: fact that I was being ostracized. I'm sure I was. 327 00:19:59,560 --> 00:20:02,760 Speaker 1: I just didn't notice. But, like, when I was growing up, 328 00:20:02,800 --> 00:20:05,199 Speaker 1: it was all things like designer clothes or whatever, or 329 00:20:05,240 --> 00:20:08,040 Speaker 1: certain labels. Certain brand labels were big and others were 330 00:20:08,320 --> 00:20:13,520 Speaker 1: considered garbage. And I never figured that out, and I 331 00:20:13,600 --> 00:20:17,520 Speaker 1: turned out... I mean, I'm okay. Anyway. The other article 332 00:20:17,560 --> 00:20:20,880 Speaker 1: I have to recommend is Jon Brodkin's piece for Ars Technica. 333 00:20:21,040 --> 00:20:24,960 Speaker 1: It is titled "Cable lobby and Ted Cruz are disappointed 334 00:20:25,000 --> 00:20:29,080 Speaker 1: as FCC bans digital discrimination." Brodkin does a really good 335 00:20:29,160 --> 00:20:32,240 Speaker 1: job breaking down the actual issue and the various perspectives.
336 00:20:32,760 --> 00:20:35,440 Speaker 1: It's not clear cut. Like, I wouldn't say it's super 337 00:20:35,840 --> 00:20:38,359 Speaker 1: clear cut from any perspective. I would also say a 338 00:20:38,400 --> 00:20:43,160 Speaker 1: lot of perspectives are selectively ignoring certain things in order 339 00:20:43,200 --> 00:20:47,240 Speaker 1: to make a point, which is disingenuous, I would argue. 340 00:20:47,400 --> 00:20:49,199 Speaker 1: But anyway, the article does a really good job of 341 00:20:49,240 --> 00:20:52,320 Speaker 1: breaking all that down. Okay, that's it. I hope you 342 00:20:52,359 --> 00:20:55,880 Speaker 1: are all well, and I'll talk to you again really soon. 343 00:21:02,040 --> 00:21:06,680 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 344 00:21:07,000 --> 00:21:10,720 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 345 00:21:10,760 --> 00:21:11,840 Speaker 1: to your favorite shows.