Speaker 1: Welcome to TechStuff, a production from I Heart Radio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with I Heart Radio, and how the tech are you? It's time for the tech news for February twenty second, twenty twenty two, otherwise known as the date where you just start hitting the number two until it's March. Let's get to it.

Speaker 1: So, back during the Trump administration here in the United States, we saw the US government pressure the social network platform TikTok to sever ties with its Chinese parent company ByteDance, citing a concern that TikTok could be siphoning personal data from US citizens and sending that information back to China. A claim that TikTok denied at the time, and there were a lot of other issues tied up with this as well; it was part of this larger trade war between the US and China. But it turns out that a pair of studies kind of justifies those data concerns. A group of cybersecurity experts conducted studies that were looking into TikTok's data collection practices and concluded that TikTok has been allowing ByteDance to have full access to user data on the platform. The security experts sent their findings to the news website The Wrap, and The Wrap then did a very responsible thing from a journalism standpoint: they sought out other experts to independently confirm the findings, which they did. TikTok, as of this recording, has not yet responded to those reports. Further, the research suggests that TikTok engineers have found a way to sidestep code audits by both Google and Apple, which could also mean that TikTok could potentially change the code without folks really knowing about it, which brings up all sorts of issues. And it suggests that TikTok has found some loopholes it's exploiting while other platforms like, say, Facebook, continue to be buffeted by recent changes to iOS's privacy and data tracking settings.
Speaker 1: You might remember that was a big part of why Meta, the parent company to Facebook, had such a rough year last year. They lost out on an estimated ten billion dollars in revenue because Apple changed its data tracking and privacy settings. Now, according to those security experts, the TikTok application can even query things on whatever device it's installed on, so it's kind of similar to the data tracking issue, where it's not just tracking your activity within the app itself, but other stuff you're doing on that same device. So essentially, TikTok has been able to do what Facebook used to do but no longer can do, at least if you opt out. My guess is that China isn't so much spying on individual US citizens to build out, you know, dossiers on everyone, but rather, you know, selling that data to third parties that could use the information for all sorts of stuff, advertising being the obvious one, but not necessarily the only one. Anyway, it'll be interesting to see if Google and Apple respond to this, or how, or if TikTok will.

Speaker 1: Now, let's flip over to Instagram, where the app has quietly made changes to a feature that allows users to set a daily time limit for their use of Instagram. So previously you could go into your settings and set your daily limit to as low as ten minutes a day, and that would make it easier to be aware of how much time you were actually spending on the platform and avoid getting sucked into hours of just scrolling. This is an opt-in feature, so it's not like it's on by default, but it was there. But now the ten minute and fifteen minute time limit options are both gone. The minimum time limit that you can set now is thirty minutes. Now, I am not shocked to hear this. After all, Meta, again, Instagram's parent company, had that earnings call that revealed some pretty rough news to investors.
Speaker 1: Besides that ten billion dollars in lost revenue, or estimated lost revenue, it's really hard to, you know, guarantee revenue that you lost because you didn't make it. But anyway, the former iron grip that Facebook and Instagram had on the average person's online time is starting to slip. So perhaps one way the company is trying to address this is to loosen the restrictions a little bit for that daily time limit option, saying, hey, you know, thirty minutes isn't all that bad, that should probably be your upper limit, and have that set as the lowest option. Now, some folks who have used the feature will find that they will be spending more time on the platform before they bump up against their daily limit. Now, to be totally fair, this could just as easily be an example where Meta just saw that very few people were opting in for ten or fifteen minute time limits, because that is a very short amount of time. So it's possible that the company just said, nobody's using these settings, it's such a tiny thing, it doesn't make sense to support it, and they just dropped those options. You might as well, you know, axe those, especially if you feel like you can't afford to discourage people from being on your platform anyway. I suspect this is just one of a billion different things that the company plans to do to try and reverse some concerning trends, namely that some platforms like Facebook have seen a decline in users or in engagement or both, and that the company has had a little bit of trouble attracting younger users to its platforms, which means that, you know, the user base will shrink over time as it ages out.

Speaker 1: ProPublica and The New York Times investigated hundreds of accounts on Twitter that appeared to be part of a Chinese propaganda campaign during the Olympics, which prompted Twitter to ban three thousand accounts. And the campaign was doing pretty much what you'd expect when you hear the word propaganda.
Speaker 1: It was trying to present an overly positive portrayal of China and the Olympic Games, while simultaneously ignoring or discounting stories related to human rights abuses in China, of which there are many. Twitter reps said that the platform banned those accounts because they were violating the company's policy against platform manipulation. So essentially, the issue here was that these accounts were all trying to feed into massive conversations about the Olympics and China and to steer those conversations in a specific positive way. And you might be familiar with the term sock puppet account. So way back in the early days of the Internet, I would spot these on occasion, either in chat rooms or on forums. This is where a user creates a separate account, like they have an account with whatever it is, but they create a separate account in order to achieve some sort of goal. It's possible that they make the sock puppet, which is called that because it is under the control of another user who's also in the same space. The sock puppet might act like a hype man for the primary user, like, oh yeah, this guy is so great. Or sometimes the sock puppet stood in as an easily defeatable opponent, right, because if you're controlling both sides of the argument, you can easily make the other side lose. So it would be a straw man for the primary user to tear down and look like a hero. The Twitter situation is similar, except instead of one fake account used to push a narrative, we're talking about thousands. They were all pushing to shape this conversation online in a specific way, and Twitter said, yeah, you can't do that, that's manipulating the platform for your own purposes, that's against the rules. And so it got to banning. A remarkable three thousand accounts there; I wonder how many more are out there.
Speaker 1: Over in the EU, a consumer advocacy group called the BEUC has called for the Digital Markets Act, or DMA, which is slowly taking shape in the EU and is set to become, you know, law before too long, to include the ability for individual users to sue big tech companies for violations. So right now, as it is currently formed, the DMA would only allow business users to sue companies that violate the rules under the DMA, but the BEUC wants that extended to all EU citizens, whether collectively in class action suits or individually. Now, the DMA sets out the major rules that big tech companies have to follow within the EU, and specifically, it's aiming at gatekeepers. The legislation defines gatekeepers as large online platforms that quote have a strong economic position, significant impact on the internal market, and are active in multiple EU countries. They have a strong intermediation position, meaning that they link a large user base to a large number of businesses, and have an entrenched and durable position in the market, meaning that it is stable over time, end quote. So companies like Google, Apple, Meta, Amazon, and Microsoft would qualify. The regulations guarantee certain rights to third parties, such as, you know, if you are running your business on top of another company's platform, the other company whose platform it is can't, uh, promote their own products over yours. So the easiest example to point out is Amazon, right, because Amazon has its own brands and its own partners, and the company has stood accused many times of promoting its own products over those of competing vendors who are using the Amazon marketplace as a place to sell their stuff. And Amazon's landed in hot water all over the world for this, for promoting its own products over those of the customers that are using the platform for their own online marketplace.
Speaker 1: And so the DMA in the EU would make this illegal, and it would mean that an individual would be able to sue, say, Amazon, if they violated, or if they believed that Amazon violated, the DMA. Anyway, there's no guarantee yet that we're going to see the DMA expand to allow individuals to pursue claims against big tech the way that businesses could when it's finally finalized, so we'll have to keep an eye out and see what happens. All right, we've got some more stories to go into, but before we do that, let's take a quick break.

Speaker 1: You know, not too long ago, I did an update about the hyperloop, which originally started off, at least in the public consciousness, as a proposal that Elon Musk made to revolutionize short-range travel, like travel that spans hundreds of miles but not thousands of miles, using enclosed trains. Typically the enclosed system would have a lot of the air pumped out to cut down on air resistance, and you're usually talking about some sort of maglev or other levitation-style train; Musk's version used air bearings. And I talked about how various companies had come up trying to make this a reality, including one that became known first as Hyperloop Technologies, then Hyperloop One, then Virgin Hyperloop One, and now just Virgin Hyperloop. And the story was that recently Virgin Hyperloop had made the decision to pivot away from designing passenger-style systems and focus more on cargo. And to that end, there's more news: the company has now laid off more than one hundred employees as part of this refocusing on cargo versus passengers, which is, you know, particularly sad because it wasn't that long ago, it was just last summer, that the company was showing off its passenger capabilities. But no, it looks like that's not going to be, at least in the near future, for Hyperloop.
Speaker 1: It may be that they'll eventually get back into that, but for now it is off the table. So just a quick update. Those hundred or so folks, I think it's a hundred and eleven actually, who lost their jobs, I hope they all land on their feet. But yeah, pretty disappointing if you were hoping for a revolutionary change in transportation. Then again, there are other hyperloop companies out there, you know.

Speaker 1: And I also mentioned in a different episode how the IRS in the United States recently reversed its decision to require people to use video selfies and a facial recognition system run by a third-party ID company in order to access certain IRS features online. So if you are a US citizen and you want to create an IRS account, well, you can still do that video selfie and facial recognition route if you like. You are not required to, but you can use that. If you don't want to use that, then what you have to do is sit for a live virtual interview to verify your identity. Now, that sounds to me like a pretty labor-intensive solution. I don't know if the IRS will be providing these interviews themselves, or, you know, if they'll be working with another company, but yeah, it sounds like it's still gonna be pretty intense to get that IRS account established.

Speaker 1: Last week, a cargo ship carrying hundreds of luxury cars caught fire as it was heading across the Atlantic from Europe to North America. The cargo included cars from Porsche, more than a thousand of them, and Bentley, nearly one hundred ninety of those, I believe. And the cause of the fire has not yet been determined, or at least not yet announced as I'm recording this, but it appears that batteries in some of the electric vehicles in the cargo played a part in making the fire more difficult to extinguish.
Speaker 1: So we don't know if the batteries actually caused the fire, like if there was a short circuit, if one of the batteries was, you know, damaged in transportation or something and that led to it, or if they just exacerbated a fire that was caused by some other source. But they definitely made the fire worse. Fortunately, all twenty-two crew members of the ship, called the Felicity Ace, were rescued, so there were no losses aboard, which is the most important thing. You know, we can replace stuff, we can't replace people. But yeah, I'm sure all the luxury car lovers out there are shedding a tear as they think of the various Porsches and Bentleys and such burning to a crisp. Very sad.

Speaker 1: The United States Copyright Office has answered a burning question in science fiction: can a robot hold a copyright for a work that the robot produced? No, it can't. Actually, to be fair, that is too narrow a use case. Really, the Copyright Office has said no artificial intelligence can secure a copyright for a work that the AI itself creates. So all those interesting projects in AI that revolve around computer systems or robots creating art cannot get a copyright for any art that they produce. This is still really too narrow a view, because honestly, the Copyright Office essentially says that for something to qualify for a copyright, it has to be the product of human authorship. So, in other words, if it ain't made by a human, it don't get no copyright. If you had an infinite number of monkeys with an infinite number of typewriters and one of them created the best novel ever written, that novel would be in the public domain because the monkey ain't no human. The same is true for AI, it sounds like. And that does make for an interesting note: the US Copyright Office is one of the few entities I know of that has already determined that artificial intelligence cannot be considered equivalent to human intelligence.
Speaker 1: Now, I'm being a little flippant here, because this doesn't mean, one, that we're always going to see the Copyright Office take this interpretation. That could change over time, especially as AI gets more, you know, sophisticated. And two, AI as it stands is incredible, but still very much different from, and arguably inferior to, human intelligence when we take intelligence as a whole, not just the ability to solve problems. When you take things like creativity, innovation, out-of-the-box thinking, associative thinking, all that kind of stuff, things that humans are very good at, most machine systems are not very good at it. So in those respects you could argue machines are still way behind us. Anyway, if you make a computer program that composes music or poetry or whatever, just know that the program will not be able to secure a copyright on the work it generates. Though I guess, you know, depending upon how you word it, you might be able to get a copyright for the program itself. So that's something, right?

Speaker 1: And if your robot gets depressed that the US Copyright Office refuses to acknowledge its art, well, it can always go apply for a job at White Castle. Yes, the company has announced it will be installing a robot from Miso Robotics called Flippy 2 in one hundred White Castle locations. Now, if you're not familiar with White Castle, that's a fast food chain here in the United States. Harold and Kumar like to go there. In fact, they should do another Harold and Kumar movie and make the Flippy robots look like the Terminator. That would be awesome, a great stoner science fiction comedy movie. You're welcome, Hollywood. I'll take my check. But anyway, the Flippy robot mostly looks like a robotic arm mounted to a frame that can move between fry stations. It does not look like the Terminator, so it would take some, you know, poetic license to make the movie that I just pitched. But this will allow White Castle to automate some of the kitchen work.
Speaker 1: It is not the only fast food company that has been, you know, investing in AI and robotics in order to do this. I think if it means these companies are employing fewer humans, but that they're paying the humans they do employ more of a living wage, I'm all for it, although I of course worry that it will just mean they'll say, hey look, we got to cut all those salaries out of all those franchises, and now we get to keep more money for ourselves. I worry that that's how it's going to go, but, you know, that hasn't been written yet. Maybe I should hold out hope.

Speaker 1: That's it for today's tech news. Please let me know if you have any suggestions for future episodes of TechStuff, whether it's a technology, a company, a trend in tech, anything like that. Reach out to me on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an I Heart Radio production. For more podcasts from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, or wherever you listen to your favorite shows.