1 00:00:04,440 --> 00:00:12,280 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,280 --> 00:00:15,840 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,920 --> 00:00:19,320 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:19,360 --> 00:00:22,720 Speaker 1: are you? It's time for the tech news for Thursday, 5 00:00:22,800 --> 00:00:27,720 Speaker 1: July sixth, twenty twenty three. And a whole lot is 6 00:00:27,840 --> 00:00:32,199 Speaker 1: happening in the social network platform space right now, as 7 00:00:32,240 --> 00:00:35,519 Speaker 1: I'm sure many of you are aware. So last weekend, 8 00:00:36,000 --> 00:00:39,680 Speaker 1: Twitter changed quite a bit and it really upset a 9 00:00:39,720 --> 00:00:42,559 Speaker 1: lot of people. First up, folks noticed that unless you 10 00:00:42,600 --> 00:00:46,960 Speaker 1: were actually logged into Twitter, you couldn't see stuff that 11 00:00:47,000 --> 00:00:49,600 Speaker 1: had been posted on Twitter. Now, that was not always 12 00:00:49,640 --> 00:00:52,800 Speaker 1: the case. You could go and view things on Twitter 13 00:00:52,840 --> 00:00:55,800 Speaker 1: without logging in. My wife has done this numerous times. 14 00:00:55,800 --> 00:00:58,880 Speaker 1: She does not have a Twitter account but occasionally checks Twitter. 15 00:00:59,160 --> 00:01:01,840 Speaker 1: Now this has since been walked back, so now if 16 00:01:01,840 --> 00:01:03,960 Speaker 1: you don't have an account, you can view it again. 17 00:01:04,000 --> 00:01:08,040 Speaker 1: But for a while you couldn't. Beyond that, even if 18 00:01:08,080 --> 00:01:10,480 Speaker 1: you were logged in to an account, you would notice 19 00:01:10,480 --> 00:01:12,920 Speaker 1: that you would hit some view limits on the number 20 00:01:12,920 --> 00:01:17,120 Speaker 1: of tweets you could see. Initially, those limits were the 21 00:01:17,160 --> 00:01:19,640 Speaker 1: ability to see up to three hundred tweets per day, 22 00:01:19,720 --> 00:01:23,000 Speaker 1: if you were brand spanking new with an unverified account, 23 00:01:23,520 --> 00:01:26,520 Speaker 1: or rather not a Twitter Blue subscriber, I guess we 24 00:01:26,520 --> 00:01:31,360 Speaker 1: should say, since verification at this point means very little. Existing 25 00:01:31,800 --> 00:01:34,919 Speaker 1: quote unquote unverified accounts, so those who are not Twitter 26 00:01:35,000 --> 00:01:38,880 Speaker 1: Blue subscribers but who have been on Twitter for a while, 27 00:01:39,240 --> 00:01:42,160 Speaker 1: they could see up to six hundred tweets per day. 28 00:01:42,600 --> 00:01:45,240 Speaker 1: And then if you were a verified user, a Twitter 29 00:01:45,280 --> 00:01:48,640 Speaker 1: Blue subscriber, or maybe one of the celebrities that Musk 30 00:01:48,800 --> 00:01:55,160 Speaker 1: granted verification to, possibly as a trolling tactic, I don't know, anyway, 31 00:01:55,400 --> 00:01:58,360 Speaker 1: they could see up to six thousand tweets per day. 32 00:01:58,840 --> 00:02:04,600 Speaker 1: Elon Musk eventually explained his reasons for doing this.
He 33 00:02:04,680 --> 00:02:07,840 Speaker 1: said that Twitter was being bombarded by bots, and those 34 00:02:07,880 --> 00:02:11,519 Speaker 1: bots were scraping data from Twitter in efforts to build 35 00:02:11,560 --> 00:02:15,239 Speaker 1: out things like large language models for AI chat bots, 36 00:02:15,680 --> 00:02:17,760 Speaker 1: and that the platform had made the changes in an 37 00:02:17,760 --> 00:02:22,600 Speaker 1: effort to confound those bots, and they couldn't tell anyone 38 00:02:22,639 --> 00:02:26,840 Speaker 1: about it beforehand because then the companies behind the bots 39 00:02:26,840 --> 00:02:30,480 Speaker 1: would have, you know, changed their tactics or something, which 40 00:02:30,480 --> 00:02:34,000 Speaker 1: is weird to me because by not communicating it, all 41 00:02:34,000 --> 00:02:37,880 Speaker 1: you're doing is just delaying the amount of time it 42 00:02:37,919 --> 00:02:40,639 Speaker 1: takes them to adjust to the tactics. Like, if that's 43 00:02:40,680 --> 00:02:43,600 Speaker 1: your concern, that by announcing that you're going to do this, 44 00:02:44,280 --> 00:02:48,200 Speaker 1: you're giving people the chance to circumvent it, well, if 45 00:02:48,200 --> 00:02:50,920 Speaker 1: they can circumvent it, then shouldn't that 46 00:02:50,919 --> 00:02:53,840 Speaker 1: be happening right now? Right, that circumvention should be going 47 00:02:53,840 --> 00:02:55,840 Speaker 1: on right now. So it doesn't matter whether you say 48 00:02:55,840 --> 00:03:00,200 Speaker 1: it or not. I don't know, whatever. So it's an 49 00:03:00,280 --> 00:03:03,960 Speaker 1: unfortunate side effect that, you know, this attempt to confound 50 00:03:03,960 --> 00:03:08,360 Speaker 1: the bots affected everybody. Musk mentioned that the limits were temporary, 51 00:03:08,560 --> 00:03:11,440 Speaker 1: and I believe at the time that I'm recording this 52 00:03:11,600 --> 00:03:14,560 Speaker 1: right now, those limits have been increased to five hundred 53 00:03:14,560 --> 00:03:18,359 Speaker 1: tweets for brand new users, one thousand for unverified but 54 00:03:18,520 --> 00:03:23,320 Speaker 1: existing accounts, and ten thousand tweets for verified users. A 55 00:03:23,360 --> 00:03:25,480 Speaker 1: couple of other things we need to mention here. One 56 00:03:25,600 --> 00:03:28,760 Speaker 1: is that if this were a plan to confound bots, 57 00:03:28,800 --> 00:03:32,480 Speaker 1: it seems dumb, because a verified account is not that expensive, 58 00:03:32,600 --> 00:03:34,800 Speaker 1: not for a company that's looking to build out large 59 00:03:34,880 --> 00:03:39,480 Speaker 1: language models. They could end up making lots of accounts 60 00:03:39,560 --> 00:03:42,640 Speaker 1: and then paying the small fee to verify them or 61 00:03:42,680 --> 00:03:46,080 Speaker 1: to subscribe to Twitter Blue and then use those to 62 00:03:46,160 --> 00:03:50,119 Speaker 1: scrape data. I mean, I guess Twitter could ban those accounts, 63 00:03:50,120 --> 00:03:53,880 Speaker 1: but still it's not a big expense. Secondly, Twitter relies 64 00:03:53,880 --> 00:03:59,360 Speaker 1: on revenue from advertising.
Now I am no business genius, 65 00:03:59,520 --> 00:04:05,320 Speaker 1: but it seems to me that this move is discouraging 66 00:04:05,360 --> 00:04:09,480 Speaker 1: advertisers from continuing to work with Twitter, because if they 67 00:04:09,920 --> 00:04:12,280 Speaker 1: are being told, hey, fewer people are going to be 68 00:04:12,360 --> 00:04:16,000 Speaker 1: able to see your tweet because of these limits, they're 69 00:04:16,040 --> 00:04:18,120 Speaker 1: going to say, what are we paying for, then? We're 70 00:04:18,160 --> 00:04:20,960 Speaker 1: paying for people to see our ads. And if you're 71 00:04:21,040 --> 00:04:23,160 Speaker 1: limiting the number of tweets they can see, then that 72 00:04:23,240 --> 00:04:25,120 Speaker 1: means you're limiting the number of people who can see 73 00:04:25,120 --> 00:04:28,400 Speaker 1: our ads. Third, as we will talk about in a moment, 74 00:04:28,560 --> 00:04:31,920 Speaker 1: Meta launched its own Twitter competitor this week, so the 75 00:04:32,000 --> 00:04:37,200 Speaker 1: timing of this move seems almost comical, right? It seems 76 00:04:37,240 --> 00:04:40,400 Speaker 1: like Elon Musk is trying to convince people to jump 77 00:04:40,440 --> 00:04:45,279 Speaker 1: ship and move over to Meta's short messaging social platform, 78 00:04:45,400 --> 00:04:49,920 Speaker 1: Threads, and just leave Twitter, y'all. Maybe Elon Musk really 79 00:04:50,000 --> 00:04:53,520 Speaker 1: is a space brain business genius, like some people say, 80 00:04:53,800 --> 00:04:57,599 Speaker 1: but from my humble perspective, it sure looks like he's 81 00:04:57,600 --> 00:05:01,120 Speaker 1: made some dumb moves in a very short amount of time. 82 00:05:01,640 --> 00:05:04,800 Speaker 1: In addition to those decisions, Twitter also made another move 83 00:05:05,120 --> 00:05:08,640 Speaker 1: that has upset a lot of users, including me, and 84 00:05:08,720 --> 00:05:12,480 Speaker 1: that's a change to the social media dashboard app called 85 00:05:12,720 --> 00:05:15,360 Speaker 1: TweetDeck. Now, for those of y'all who have never 86 00:05:15,520 --> 00:05:20,120 Speaker 1: used TweetDeck, this tool has some really useful features, 87 00:05:20,160 --> 00:05:22,400 Speaker 1: and big among those is that you can use 88 00:05:22,440 --> 00:05:25,000 Speaker 1: TweetDeck to set up a view so you can monitor 89 00:05:25,120 --> 00:05:29,440 Speaker 1: multiple Twitter accounts from the same screen. So in one view, 90 00:05:29,640 --> 00:05:32,080 Speaker 1: I would be able to see my Twitter feed and 91 00:05:32,120 --> 00:05:36,360 Speaker 1: then all of the replies and mentions for me. I'd 92 00:05:36,360 --> 00:05:39,120 Speaker 1: be able to see the feed for the Twitter handle 93 00:05:39,160 --> 00:05:42,240 Speaker 1: for this show, which is technically TechStuffHSW. But 94 00:05:42,320 --> 00:05:45,000 Speaker 1: now that TweetDeck is locked behind a paywall, I'm 95 00:05:45,040 --> 00:05:47,040 Speaker 1: not sure how often I'll be logging in to check 96 00:05:47,040 --> 00:05:49,279 Speaker 1: on it. Plus, I could have a view for my 97 00:05:49,520 --> 00:05:54,320 Speaker 1: old show Forward Thinking, or my other podcast Large Nerdron Collider, etc. 98 00:05:55,000 --> 00:05:58,240 Speaker 1: TweetDeck made it really easy to see activity across 99 00:05:58,320 --> 00:06:03,680 Speaker 1: multiple accounts.
Media managers loved tools like this because they 100 00:06:03,680 --> 00:06:09,040 Speaker 1: could have one view and maintain a good idea of 101 00:06:09,080 --> 00:06:12,359 Speaker 1: what's going on across any accounts they happen to manage. 102 00:06:12,360 --> 00:06:14,839 Speaker 1: So if you are a manager who maybe has 103 00:06:14,960 --> 00:06:18,159 Speaker 1: multiple clients, and you like to have a single view 104 00:06:18,240 --> 00:06:20,279 Speaker 1: so you can see what's going on at any given time, 105 00:06:21,160 --> 00:06:25,240 Speaker 1: this becomes frustrating. TweetDeck started off as an independent 106 00:06:25,320 --> 00:06:28,680 Speaker 1: third party app, but in twenty eleven, Twitter purchased it, 107 00:06:29,320 --> 00:06:31,680 Speaker 1: and the key element in all of this is that 108 00:06:31,800 --> 00:06:35,920 Speaker 1: TweetDeck, until now, or really until later this year, 109 00:06:36,640 --> 00:06:40,839 Speaker 1: was free to use, and then this month, Twitter announced 110 00:06:40,839 --> 00:06:43,320 Speaker 1: that TweetDeck is shifting to become a Twitter Blue 111 00:06:43,360 --> 00:06:46,400 Speaker 1: exclusive feature. So when you take that in with the 112 00:06:46,440 --> 00:06:49,000 Speaker 1: other weird moves that happened over this past weekend, it 113 00:06:49,040 --> 00:06:53,120 Speaker 1: sure does feel like Musk's attempt to convince more people 114 00:06:53,400 --> 00:06:57,120 Speaker 1: to subscribe is what is driving these moves. Maybe it 115 00:06:57,160 --> 00:06:59,360 Speaker 1: has to do with bots in the case of limiting 116 00:06:59,800 --> 00:07:04,159 Speaker 1: people's ability to see posts, but it sounds more like, hey, 117 00:07:04,200 --> 00:07:05,640 Speaker 1: if you want to see a lot of posts, you 118 00:07:05,680 --> 00:07:09,400 Speaker 1: need to subscribe, and locking TweetDeck behind a Twitter 119 00:07:09,440 --> 00:07:13,640 Speaker 1: Blue subscription seems likewise an attempt to get people to 120 00:07:13,680 --> 00:07:16,400 Speaker 1: subscribe to Twitter Blue, and considering the shaky ground the 121 00:07:16,440 --> 00:07:20,880 Speaker 1: company is on with advertisers, I can see why maybe 122 00:07:20,960 --> 00:07:24,520 Speaker 1: Musk is hoping that subscriptions will pick up the slack 123 00:07:25,040 --> 00:07:29,320 Speaker 1: that's left by advertisers getting really concerned about spending their 124 00:07:29,320 --> 00:07:32,840 Speaker 1: money on Twitter. But it is really hard to convince 125 00:07:32,880 --> 00:07:35,440 Speaker 1: people to fork over a subscription fee for Twitter Blue's 126 00:07:35,480 --> 00:07:40,720 Speaker 1: features when you've also got, drum roll please, Meta Threads. 127 00:07:41,560 --> 00:07:46,559 Speaker 1: So Meta launched Threads yesterday. This is the company's answer 128 00:07:46,640 --> 00:07:50,200 Speaker 1: to stuff like Twitter and Mastodon and Bluesky, 129 00:07:50,360 --> 00:07:53,720 Speaker 1: which is currently in an invitation-only beta program. I 130 00:07:53,760 --> 00:07:57,440 Speaker 1: only just got access to Bluesky today because I 131 00:07:57,520 --> 00:08:00,640 Speaker 1: dragged my feet on requesting an invite, and I don't 132 00:08:00,640 --> 00:08:02,240 Speaker 1: know that I would have gotten one if I even 133 00:08:02,320 --> 00:08:05,640 Speaker 1: had requested an invite early on. It's kind of a 134 00:08:05,680 --> 00:08:09,160 Speaker 1: moot point.
I ended up getting an invitation code from 135 00:08:09,560 --> 00:08:12,600 Speaker 1: a friend of mine, so I'm on Bluesky. It's neat. 136 00:08:13,720 --> 00:08:18,120 Speaker 1: Not revolutionary, but neat. Anyway, Meta has positioned Threads to 137 00:08:18,160 --> 00:08:21,080 Speaker 1: be an extension of Instagram. That makes a lot of 138 00:08:21,160 --> 00:08:26,080 Speaker 1: sense because Instagram boasts more than two point three billion 139 00:08:26,760 --> 00:08:30,800 Speaker 1: active users. That's billion with a B. So what better 140 00:08:30,840 --> 00:08:34,240 Speaker 1: way to compete with Twitter than to leverage an already 141 00:08:35,000 --> 00:08:40,120 Speaker 1: existing enormous user base? Now, according to Engadget, it took 142 00:08:40,160 --> 00:08:44,120 Speaker 1: Threads about seven hours to reach ten million users, because 143 00:08:44,120 --> 00:08:46,440 Speaker 1: you do still have to create an account and you 144 00:08:46,480 --> 00:08:49,000 Speaker 1: do still have to download the app and stuff, but 145 00:08:49,080 --> 00:08:52,560 Speaker 1: it is connected to your Instagram account. This is what 146 00:08:52,679 --> 00:08:56,199 Speaker 1: gives Threads a huge leg up on would-be competitors 147 00:08:56,280 --> 00:08:59,000 Speaker 1: like Bluesky. Bluesky has to start from scratch. 148 00:08:59,679 --> 00:09:05,120 Speaker 1: Threads is already an extension of an enormously successful company. 149 00:09:05,520 --> 00:09:10,959 Speaker 1: So why would Meta bother with a Twitter competitor at all? Well, 150 00:09:10,960 --> 00:09:13,560 Speaker 1: the answer to that question is pretty much because Elon 151 00:09:13,640 --> 00:09:18,400 Speaker 1: Musk has created a huge opportunity. Musk's handling of Twitter 152 00:09:18,720 --> 00:09:21,600 Speaker 1: has alienated a good number of users. Now there are some 153 00:09:21,679 --> 00:09:24,440 Speaker 1: users who absolutely love what Twitter has become, and there 154 00:09:24,480 --> 00:09:28,480 Speaker 1: are a lot of others who have bemoaned the 155 00:09:28,559 --> 00:09:31,800 Speaker 1: changes that have happened since Musk has taken over the company. 156 00:09:32,640 --> 00:09:35,560 Speaker 1: And as I mentioned earlier, quite a few advertisers are 157 00:09:35,720 --> 00:09:40,840 Speaker 1: disenchanted with Twitter. So if Meta can establish a competitive service, 158 00:09:41,360 --> 00:09:44,520 Speaker 1: the company stands to gain by convincing those advertisers to 159 00:09:44,559 --> 00:09:47,840 Speaker 1: just go all in on a Meta-focused advertising approach. 160 00:09:48,280 --> 00:09:52,160 Speaker 1: I would not be surprised if salespeople at Meta are 161 00:09:52,160 --> 00:09:55,920 Speaker 1: positioning ad deals that include, you know, an ad campaign 162 00:09:56,040 --> 00:10:00,880 Speaker 1: with a presence on Facebook and Instagram and eventually on Threads. So, 163 00:10:00,960 --> 00:10:03,880 Speaker 1: in other words, Musk made some brash moves, and now 164 00:10:03,960 --> 00:10:08,200 Speaker 1: Zuckerberg wants to drink his milkshake, in There Will Be 165 00:10:08,240 --> 00:10:14,760 Speaker 1: Blood terminology. I actually reactivated my old Instagram account last 166 00:10:14,880 --> 00:10:17,800 Speaker 1: night just to create a Threads presence and to see 167 00:10:17,840 --> 00:10:22,880 Speaker 1: what it's like.
It hurt me emotionally to reconnect my 168 00:10:22,920 --> 00:10:25,480 Speaker 1: Instagram account because I just hadn't been on there in 169 00:10:25,520 --> 00:10:28,760 Speaker 1: more than a year. Anyway, right now, Threads is a 170 00:10:28,760 --> 00:10:31,400 Speaker 1: little bit of a chaotic mess. Users face kind of 171 00:10:31,440 --> 00:10:34,400 Speaker 1: a fire hose of messages from folks that they may 172 00:10:34,480 --> 00:10:38,079 Speaker 1: or may not follow on Instagram. And this is kind 173 00:10:38,120 --> 00:10:41,079 Speaker 1: of Threads' attempt to populate the feed with more than 174 00:10:41,240 --> 00:10:44,760 Speaker 1: just two or three messages. Because if you logged in 175 00:10:44,800 --> 00:10:46,880 Speaker 1: there and the only thing you saw were the people 176 00:10:46,920 --> 00:10:49,240 Speaker 1: you're connected to on Instagram and what they had to say, 177 00:10:49,280 --> 00:10:51,880 Speaker 1: you might not see anything at all. Right, maybe none 178 00:10:51,920 --> 00:10:53,480 Speaker 1: of your friends are on there, or none of the 179 00:10:53,640 --> 00:10:57,000 Speaker 1: accounts you follow are on there. This way, it creates 180 00:10:57,520 --> 00:11:03,600 Speaker 1: a reliable, you know, flow of information and hopefully, according 181 00:11:03,600 --> 00:11:07,160 Speaker 1: to Meta anyway, convinces people to stick with the service. 182 00:11:07,880 --> 00:11:10,319 Speaker 1: Now, it's early days. I expect we're going to see 183 00:11:10,360 --> 00:11:12,960 Speaker 1: Meta roll out more features in the not too distant future. 184 00:11:13,600 --> 00:11:19,880 Speaker 1: Right now, there are some minor oversights, like there's no 185 00:11:20,000 --> 00:11:23,040 Speaker 1: easy way to just look at the stuff that your 186 00:11:23,040 --> 00:11:26,640 Speaker 1: friends are posting, so you do get this fire hose approach. Also, 187 00:11:27,000 --> 00:11:29,960 Speaker 1: you can't do any direct messaging through Threads. You can 188 00:11:30,320 --> 00:11:34,480 Speaker 1: in Instagram, but not through Threads, and maybe that's going 189 00:11:34,559 --> 00:11:36,720 Speaker 1: to change. We'll have to see. Now, personally, I'm a 190 00:11:36,760 --> 00:11:40,280 Speaker 1: little reluctant to jump in wholeheartedly and to make Threads 191 00:11:40,360 --> 00:11:45,440 Speaker 1: my new means of communicating with the public, because, as 192 00:11:45,480 --> 00:11:49,400 Speaker 1: Twitter's former CEO Jack Dorsey pointed out, on Twitter no 193 00:11:49,520 --> 00:11:54,640 Speaker 1: less, Meta notoriously slurps up all the personal data it can, 194 00:11:54,800 --> 00:11:58,280 Speaker 1: including how long you're using its services, what other services 195 00:11:58,320 --> 00:12:03,400 Speaker 1: you use, what purchases you make online, and way, way 196 00:12:03,440 --> 00:12:07,480 Speaker 1: more than just that. So feeding that machine isn't something 197 00:12:07,559 --> 00:12:11,240 Speaker 1: I am personally excited about doing. But there is no 198 00:12:11,360 --> 00:12:16,160 Speaker 1: denying that Threads has already been a huge success. Word 199 00:12:16,200 --> 00:12:19,439 Speaker 1: of warning, by the way: if you make a Threads profile, 200 00:12:20,120 --> 00:12:24,040 Speaker 1: you can delete it, but only if you also delete 201 00:12:24,200 --> 00:12:28,600 Speaker 1: your Instagram account.
So once it's up, it's up, unless 202 00:12:28,679 --> 00:12:31,160 Speaker 1: you're ready to nuke it from orbit, because that's the 203 00:12:31,200 --> 00:12:34,480 Speaker 1: only way to be sure. Okay, we're going to take 204 00:12:34,520 --> 00:12:36,920 Speaker 1: a quick break to thank our sponsors. We'll be back 205 00:12:36,960 --> 00:12:50,800 Speaker 1: with more news in just a moment. We're back. In France, citizens, 206 00:12:51,080 --> 00:12:54,680 Speaker 1: primarily young people, have been holding protests, some of those 207 00:12:54,800 --> 00:12:59,320 Speaker 1: escalating into full-blown riots. This happened after French police 208 00:12:59,400 --> 00:13:03,319 Speaker 1: killed a teenager during a traffic stop 209 00:13:03,320 --> 00:13:06,360 Speaker 1: in a Parisian suburb last week. That's a terrible story 210 00:13:06,400 --> 00:13:10,240 Speaker 1: in and of itself. But the tech angle here is that 211 00:13:10,320 --> 00:13:15,319 Speaker 1: the President of France, Emmanuel Macron, has threatened to suspend 212 00:13:15,520 --> 00:13:19,120 Speaker 1: social networking platforms in France in response, or at least 213 00:13:19,640 --> 00:13:23,280 Speaker 1: raised the possibility of doing so, maybe not going so 214 00:13:23,360 --> 00:13:26,520 Speaker 1: far as threatening, but at least raising the question of 215 00:13:26,800 --> 00:13:30,200 Speaker 1: whether that should be an option. So the concern is 216 00:13:30,240 --> 00:13:33,520 Speaker 1: that the rioters, or protesters if you want to be 217 00:13:33,640 --> 00:13:36,800 Speaker 1: more critical of the government, are relying on social networks 218 00:13:36,840 --> 00:13:41,000 Speaker 1: to schedule and organize themselves, so by cutting off that avenue, 219 00:13:41,080 --> 00:13:47,199 Speaker 1: Macron could potentially attempt to restore order in times of 220 00:13:47,280 --> 00:13:52,559 Speaker 1: political unrest. This announcement, however, has spurred massive criticisms against Macron, 221 00:13:52,679 --> 00:13:55,400 Speaker 1: saying that it's the sort of move that an authoritarian 222 00:13:55,480 --> 00:13:58,520 Speaker 1: leader would make against citizens in an effort to suppress 223 00:13:58,559 --> 00:14:01,360 Speaker 1: the freedom of speech, not to mention the freedom of assembly, 224 00:14:01,840 --> 00:14:05,120 Speaker 1: and Macron's office has responded by saying that the idea 225 00:14:05,240 --> 00:14:09,840 Speaker 1: was never a widespread general blackout that would go on indefinitely. Instead, 226 00:14:10,840 --> 00:14:14,800 Speaker 1: this was really meant to initiate a conversation between leaders 227 00:14:14,840 --> 00:14:18,880 Speaker 1: of various territories and cities within France and talk about 228 00:14:18,880 --> 00:14:21,880 Speaker 1: the role of social networks during times of political unrest 229 00:14:21,920 --> 00:14:25,680 Speaker 1: and what moves, if any, would be reasonable during such 230 00:14:25,880 --> 00:14:30,320 Speaker 1: turbulent times. Now, we've seen lots of other countries lean 231 00:14:30,440 --> 00:14:34,040 Speaker 1: into silencing online communication in the name of security, and 232 00:14:34,080 --> 00:14:37,080 Speaker 1: it can often lead to governments going for that pause 233 00:14:37,160 --> 00:14:41,760 Speaker 1: button more and more frequently, with less and less justification.
234 00:14:41,880 --> 00:14:45,800 Speaker 1: So critics argue that all of this, all this discussion 235 00:14:45,840 --> 00:14:49,320 Speaker 1: about the role that social networks are playing in 236 00:14:49,400 --> 00:14:53,120 Speaker 1: planning out these protests or assembling for riots, 237 00:14:53,680 --> 00:14:57,960 Speaker 1: is really just a distraction from the actual underlying problem, 238 00:14:58,040 --> 00:15:02,440 Speaker 1: which is police violence against citizens, and until that is addressed, 239 00:15:03,400 --> 00:15:07,160 Speaker 1: the rest of this is kind of moot, because the 240 00:15:07,320 --> 00:15:10,760 Speaker 1: problem isn't just that people are getting together and rioting. 241 00:15:10,800 --> 00:15:13,720 Speaker 1: The problem is that they're doing this in response to 242 00:15:14,640 --> 00:15:20,720 Speaker 1: a perceived violation of justice. So that has been the 243 00:15:20,880 --> 00:15:25,400 Speaker 1: larger criticism that has been posed against the government as well, 244 00:15:25,440 --> 00:15:28,920 Speaker 1: that they're really not actually dedicating the attention they need 245 00:15:28,960 --> 00:15:32,120 Speaker 1: to the underlying issue. Getting back to Meta for a moment, 246 00:15:32,280 --> 00:15:36,840 Speaker 1: there's a war going on between Canadian leadership and Zuckerberg's company, 247 00:15:37,000 --> 00:15:39,520 Speaker 1: and this is in regard to a law the Canadian 248 00:15:39,520 --> 00:15:44,000 Speaker 1: government passed that requires platforms like Meta to pay Canadian 249 00:15:44,080 --> 00:15:48,240 Speaker 1: media for the news that folks post on platforms like Facebook. 250 00:15:48,800 --> 00:15:52,200 Speaker 1: We've seen a similar law enacted in Australia, 251 00:15:52,320 --> 00:15:55,200 Speaker 1: and the argument is that platforms like Meta end up 252 00:15:55,280 --> 00:15:59,440 Speaker 1: hurting publishers and they profit off of publishers' work without 253 00:15:59,480 --> 00:16:02,400 Speaker 1: sharing the revenue with the people who created the 254 00:16:02,440 --> 00:16:05,920 Speaker 1: work in the first place. Meta, in response, has restricted 255 00:16:06,000 --> 00:16:09,160 Speaker 1: news content from appearing on its platforms if you're viewing 256 00:16:09,200 --> 00:16:12,480 Speaker 1: them in Canada, and now the Canadian government is saying 257 00:16:12,480 --> 00:16:16,640 Speaker 1: it will pull all government advertising from Meta's platforms. Now, 258 00:16:17,640 --> 00:16:21,160 Speaker 1: that is not actually that huge of an escalation, because 259 00:16:21,720 --> 00:16:24,120 Speaker 1: my assumption is that the Canadian government is not a 260 00:16:24,320 --> 00:16:28,800 Speaker 1: major advertiser on Meta, especially when you're looking at global revenue, 261 00:16:28,800 --> 00:16:32,280 Speaker 1: when you're looking at more than one hundred billion dollars worldwide. 262 00:16:32,960 --> 00:16:35,480 Speaker 1: But this does send the message that Canada is not 263 00:16:35,680 --> 00:16:39,000 Speaker 1: backing down from the legislation.
Also, the government hopes that 264 00:16:39,040 --> 00:16:42,080 Speaker 1: other companies in Canada will follow suit and pull their 265 00:16:42,120 --> 00:16:46,280 Speaker 1: advertising off of Meta's platforms, and that other countries that 266 00:16:46,360 --> 00:16:49,280 Speaker 1: are in the process of passing similar laws to protect 267 00:16:49,360 --> 00:16:52,960 Speaker 1: local media will continue to do so. And if that happens, 268 00:16:53,360 --> 00:16:56,080 Speaker 1: it's possible leverage will change and Meta will have to 269 00:16:56,080 --> 00:16:59,680 Speaker 1: make some concessions or risk losing out on significant revenue. 270 00:17:00,160 --> 00:17:02,560 Speaker 1: So you know, the argument is share a little bit 271 00:17:02,680 --> 00:17:05,800 Speaker 1: now or lose out a lot later on. Though you 272 00:17:05,840 --> 00:17:09,200 Speaker 1: could argue that Meta is concerned about other countries all 273 00:17:09,240 --> 00:17:12,280 Speaker 1: doing this and then collectively they take a huge bite 274 00:17:12,280 --> 00:17:16,199 Speaker 1: out of Meta's revenue. So we'll have to see. TikTok's 275 00:17:16,280 --> 00:17:19,400 Speaker 1: lawyers have petitioned a US judge to block a Montana 276 00:17:19,440 --> 00:17:23,720 Speaker 1: ban on TikTok before the ban takes effect on January 277 00:17:23,760 --> 00:17:27,280 Speaker 1: first of next year. TikTok argues that this ban constitutes 278 00:17:27,320 --> 00:17:30,760 Speaker 1: a violation of First Amendment rights, both for TikTok users 279 00:17:30,920 --> 00:17:33,560 Speaker 1: and for the company itself. This is where I remind 280 00:17:33,600 --> 00:17:37,040 Speaker 1: myself and everybody else that here in the United States, corporations 281 00:17:37,080 --> 00:17:41,920 Speaker 1: are legally people, for lots of reasons, actually, many 282 00:17:41,960 --> 00:17:44,439 Speaker 1: of which really do make a lot of sense, but 283 00:17:44,600 --> 00:17:48,399 Speaker 1: that's a discussion for another time. The Montana ban calls 284 00:17:48,440 --> 00:17:52,080 Speaker 1: for fines of up to ten grand per violation if 285 00:17:52,119 --> 00:17:55,480 Speaker 1: TikTok allows content to appear in the state of Montana. 286 00:17:55,520 --> 00:17:58,719 Speaker 1: In other words, if citizens are allowed to access 287 00:17:58,760 --> 00:18:02,159 Speaker 1: TikTok within Montana, TikTok could be hit by ten grand 288 00:18:02,200 --> 00:18:05,720 Speaker 1: worth of fines each time. According to TikTok, more 289 00:18:05,760 --> 00:18:09,160 Speaker 1: than a third of Montana's citizens are actually on TikTok, 290 00:18:09,240 --> 00:18:11,800 Speaker 1: so you know, that's a lot of the state's population 291 00:18:11,920 --> 00:18:13,920 Speaker 1: to cut off from the service, though in the grand 292 00:18:13,920 --> 00:18:16,080 Speaker 1: scheme of things, we're talking about just three hundred and 293 00:18:16,080 --> 00:18:19,200 Speaker 1: eighty thousand people here, because Montana is not a heavily 294 00:18:19,240 --> 00:18:22,520 Speaker 1: populated state. How this law would actually be enforced is 295 00:18:22,600 --> 00:18:26,200 Speaker 1: a pretty darn good question, and how TikTok is supposed 296 00:18:26,240 --> 00:18:29,080 Speaker 1: to handle it is another one. And how the state 297 00:18:29,119 --> 00:18:32,720 Speaker 1: can prevent citizens from accessing TikTok through VPNs is yet 298 00:18:32,760 --> 00:18:36,320 Speaker 1: another good question.
This all stems from the ongoing concern 299 00:18:36,359 --> 00:18:39,879 Speaker 1: that TikTok serves as a data siphon that takes precious 300 00:18:40,040 --> 00:18:43,199 Speaker 1: US citizen information and funnels it to China. That's an 301 00:18:43,240 --> 00:18:47,080 Speaker 1: accusation that the company has repeatedly denied. But then there 302 00:18:47,119 --> 00:18:50,440 Speaker 1: have been lots of people who have contradicted those denials, 303 00:18:50,480 --> 00:18:54,520 Speaker 1: including people who formerly worked for TikTok or its parent company. 304 00:18:55,080 --> 00:18:57,600 Speaker 1: So at this point I don't know who or what 305 00:18:57,840 --> 00:19:00,879 Speaker 1: to believe. But as I have said many times before, 306 00:19:01,280 --> 00:19:04,320 Speaker 1: in an era where our private information is collected by 307 00:19:04,440 --> 00:19:08,080 Speaker 1: numerous companies and then bought and sold on the open market, 308 00:19:08,440 --> 00:19:10,920 Speaker 1: taking out TikTok is kind of like if you were 309 00:19:11,000 --> 00:19:15,080 Speaker 1: to plug up a small hole in a dam, and 310 00:19:15,320 --> 00:19:17,800 Speaker 1: to your left and to your right are dozens of 311 00:19:17,800 --> 00:19:21,840 Speaker 1: other holes that you haven't even touched. Sure, you've plugged one, 312 00:19:22,400 --> 00:19:24,959 Speaker 1: but the problem is that this has been going on 313 00:19:25,000 --> 00:19:28,360 Speaker 1: for a while, and it's not due to a single source. 314 00:19:28,960 --> 00:19:32,520 Speaker 1: So yeah, I think the problem here is much bigger, 315 00:19:32,520 --> 00:19:35,760 Speaker 1: and the issue is we've been distracted by focusing just 316 00:19:35,840 --> 00:19:40,159 Speaker 1: on TikTok. France has passed a surveillance bill that has 317 00:19:40,200 --> 00:19:44,240 Speaker 1: civil liberties activists highly concerned. Going back to France again, 318 00:19:44,960 --> 00:19:49,480 Speaker 1: this bill allows authorities to essentially tap into citizens' devices 319 00:19:49,600 --> 00:19:54,320 Speaker 1: like laptops, smartphones, their cars, any other connected gadgets they 320 00:19:54,400 --> 00:19:58,720 Speaker 1: might rely upon, and to use those devices to track 321 00:19:58,800 --> 00:20:02,000 Speaker 1: them and to essentially spy on them. That could even 322 00:20:02,080 --> 00:20:06,000 Speaker 1: include activating a device's camera and microphone to record images 323 00:20:06,040 --> 00:20:10,040 Speaker 1: and sounds of targets. This law does have some limits. 324 00:20:10,040 --> 00:20:12,960 Speaker 1: It's not that the authorities can just, you know, turn 325 00:20:13,000 --> 00:20:16,119 Speaker 1: it on and spy on everybody all the time. First, 326 00:20:16,160 --> 00:20:18,640 Speaker 1: they have to secure permission from a judge before they 327 00:20:18,640 --> 00:20:23,000 Speaker 1: can begin surveilling a subject. And they are only supposed 328 00:20:23,040 --> 00:20:27,399 Speaker 1: to be able to use this tactic when quote justified 329 00:20:27,600 --> 00:20:31,280 Speaker 1: by the nature and seriousness of the crime end quote. 330 00:20:31,720 --> 00:20:36,040 Speaker 1: I don't know what the threshold is for a crime 331 00:20:36,080 --> 00:20:40,240 Speaker 1: to be serious enough to warrant this kind 332 00:20:40,280 --> 00:20:43,560 Speaker 1: of thing. I guess that's up to the individual judge's judgment.
333 00:20:45,040 --> 00:20:47,720 Speaker 1: But yeah, that's somewhat troubling, that there's not a more, 334 00:20:48,240 --> 00:20:53,879 Speaker 1: you know, concrete threshold. Further, if you are a journalist, 335 00:20:53,960 --> 00:20:56,800 Speaker 1: a judge, a lawyer, or a doctor, I have some 336 00:20:56,880 --> 00:20:59,400 Speaker 1: good news for you. You've got some protection, because those 337 00:20:59,520 --> 00:21:03,120 Speaker 1: jobs do not fall under legitimate targets according to the law. 338 00:21:03,200 --> 00:21:06,600 Speaker 1: This was an effort to prevent the law from being abused, 339 00:21:06,960 --> 00:21:10,199 Speaker 1: so that officials couldn't, say, keep an eye on 340 00:21:10,440 --> 00:21:13,960 Speaker 1: a pesky journalist who is working on a big exposé 341 00:21:14,000 --> 00:21:17,960 Speaker 1: about local government or something. The law is supposed to 342 00:21:18,000 --> 00:21:21,080 Speaker 1: protect people from that, if they fall into those categories. 343 00:21:21,080 --> 00:21:24,960 Speaker 1: Anyone outside of that, it's a different story. The law 344 00:21:25,000 --> 00:21:27,879 Speaker 1: also has a time limit on how long authorities are 345 00:21:27,920 --> 00:21:30,560 Speaker 1: allowed to spy on a target, which is a maximum 346 00:21:30,600 --> 00:21:34,520 Speaker 1: of six months. Half a year seems like quite a long time, 347 00:21:34,560 --> 00:21:37,280 Speaker 1: and I suppose at the end of that the authorities 348 00:21:37,320 --> 00:21:43,800 Speaker 1: could petition another judge to get another allowance to continue 349 00:21:43,800 --> 00:21:47,879 Speaker 1: spying on people. Activists point out this is already a 350 00:21:47,920 --> 00:21:52,520 Speaker 1: potentially catastrophic violation of citizen privacy and security, and furthermore, 351 00:21:52,680 --> 00:21:55,119 Speaker 1: it could be the start of a slippery slope into 352 00:21:55,280 --> 00:21:59,040 Speaker 1: more pervasive surveillance. As to how the government will actually 353 00:21:59,080 --> 00:22:03,119 Speaker 1: achieve these results, I presume it will require the cooperation 354 00:22:03,200 --> 00:22:06,840 Speaker 1: of device and telecommunications companies, because otherwise there's not like 355 00:22:06,880 --> 00:22:10,000 Speaker 1: a magic button that you can push that lets you, 356 00:22:10,000 --> 00:22:14,920 Speaker 1: you know, select a random citizen and then tap into 357 00:22:14,920 --> 00:22:20,440 Speaker 1: all their technology.
You would need some methodology to access 358 00:22:20,480 --> 00:22:23,399 Speaker 1: that stuff, which could mean either that you have 359 00:22:23,440 --> 00:22:26,160 Speaker 1: to get access to the actual gadget and then install 360 00:22:26,400 --> 00:22:29,040 Speaker 1: like an espionage app on there that would let you 361 00:22:29,080 --> 00:22:32,880 Speaker 1: do that, something akin to what the NSO Group out 362 00:22:32,920 --> 00:22:38,119 Speaker 1: of Israel has done, selling its Pegasus package to, 363 00:22:38,359 --> 00:22:42,879 Speaker 1: you know, iOS device infiltrators, that kind of thing, or 364 00:22:42,920 --> 00:22:48,880 Speaker 1: maybe directly working with the various telecommunications companies and gadget 365 00:22:48,920 --> 00:22:53,400 Speaker 1: companies to have them cooperate with this, which I would 366 00:22:53,400 --> 00:22:56,600 Speaker 1: imagine would be a tall ask, like it'd be a big, 367 00:22:57,119 --> 00:23:02,040 Speaker 1: big request to get those companies to agree, because consumers 368 00:23:02,080 --> 00:23:05,080 Speaker 1: don't like that kind of stuff, and if a consumer 369 00:23:05,119 --> 00:23:07,320 Speaker 1: finds out that a company is in cahoots with the 370 00:23:07,320 --> 00:23:10,440 Speaker 1: government to spy on people, that could be a real 371 00:23:10,600 --> 00:23:13,679 Speaker 1: PR blow to a company's image. So we'll have to 372 00:23:13,720 --> 00:23:17,720 Speaker 1: see how this develops, because I'm very curious how it 373 00:23:17,800 --> 00:23:24,240 Speaker 1: actually gets enforced, how it actually moves forward. But even 374 00:23:24,320 --> 00:23:27,240 Speaker 1: without those questions, it's a huge concern to me. I 375 00:23:27,280 --> 00:23:32,520 Speaker 1: think that this is a bad step for law and 376 00:23:33,080 --> 00:23:38,400 Speaker 1: for technology, and a step toward the continuation of the surveillance state. Okay, 377 00:23:38,760 --> 00:23:41,560 Speaker 1: we're going to take another quick break. When we come back, 378 00:23:41,600 --> 00:23:53,560 Speaker 1: I've got a few more stories to cover. Okay, we're 379 00:23:53,600 --> 00:23:57,119 Speaker 1: back and ready to wrap up this news episode. I've 380 00:23:57,160 --> 00:24:00,960 Speaker 1: got three more stories. First up: in New York City, 381 00:24:01,160 --> 00:24:04,120 Speaker 1: city government officials have passed a law that will require 382 00:24:04,240 --> 00:24:09,119 Speaker 1: hiring organizations to pay for an auditor, a third party 383 00:24:09,160 --> 00:24:12,960 Speaker 1: auditor, to come in and check any software that uses 384 00:24:13,040 --> 00:24:16,199 Speaker 1: automated processes for the purposes of hiring. So if you 385 00:24:16,359 --> 00:24:22,000 Speaker 1: are a company that either outsources your hiring to another company, 386 00:24:22,080 --> 00:24:25,960 Speaker 1: or you're using software that has automated systems to kind 387 00:24:25,960 --> 00:24:30,119 Speaker 1: of sort candidates so that you can more effectively decide 388 00:24:30,200 --> 00:24:35,680 Speaker 1: upon who you should hire, this applies to those companies. 389 00:24:35,720 --> 00:24:39,359 Speaker 1: So the goal here is to identify any unwanted bias 390 00:24:39,600 --> 00:24:44,520 Speaker 1: in the automated system that could disproportionately be disadvantageous to 391 00:24:44,560 --> 00:24:50,080 Speaker 1: certain populations.
Right, we have seen multiple instances of bias 392 00:24:50,119 --> 00:24:53,760 Speaker 1: incorporated into automated functions. Now, most of the time this 393 00:24:53,800 --> 00:24:58,080 Speaker 1: is unintentional. It's not that bias is built in on purpose, 394 00:24:58,800 --> 00:25:01,760 Speaker 1: but bias can be a part of automated systems and 395 00:25:01,800 --> 00:25:06,680 Speaker 1: it can still have a massive negative impact on affected individuals. 396 00:25:07,200 --> 00:25:11,520 Speaker 1: So as companies lean harder on AI and automation to 397 00:25:11,560 --> 00:25:15,040 Speaker 1: do things like sort through potential job candidates, it becomes 398 00:25:15,080 --> 00:25:19,080 Speaker 1: really important to identify possible problems in the software so 399 00:25:19,119 --> 00:25:21,720 Speaker 1: that legit job seekers aren't left out of the loop 400 00:25:21,840 --> 00:25:23,840 Speaker 1: through no fault of their own, just 401 00:25:23,880 --> 00:25:28,840 Speaker 1: discriminated against due to this, you know, systemic bias that's 402 00:25:28,880 --> 00:25:32,600 Speaker 1: built into the software. So companies that do not comply 403 00:25:33,240 --> 00:25:36,840 Speaker 1: with the bias audit law will face fines for violations. 404 00:25:36,840 --> 00:25:39,800 Speaker 1: But these fines are, I mean, they're pretty small. 405 00:25:40,960 --> 00:25:43,439 Speaker 1: They're like three hundred and seventy five bucks for the 406 00:25:43,440 --> 00:25:47,440 Speaker 1: first offense, which is, you know, nothing for most businesses. 407 00:25:48,400 --> 00:25:51,280 Speaker 1: Three hundred and fifty dollars for offense number two, 408 00:25:51,320 --> 00:25:53,720 Speaker 1: and then every time after that it's fifteen hundred bucks. 409 00:25:53,920 --> 00:25:56,679 Speaker 1: And by a violation, we're not talking about every instance 410 00:25:57,560 --> 00:26:01,679 Speaker 1: of the use of such software. Rather, we're talking about 411 00:26:01,720 --> 00:26:05,400 Speaker 1: each day a company uses an automated tool that has 412 00:26:05,440 --> 00:26:08,760 Speaker 1: not yet been audited, or has been audited and found 413 00:26:08,840 --> 00:26:12,399 Speaker 1: to have bias and it hasn't been addressed, or that 414 00:26:12,520 --> 00:26:16,040 Speaker 1: the company has failed to inform job applicants that part 415 00:26:16,119 --> 00:26:23,160 Speaker 1: of the process involves software with automated processes in it. 416 00:26:23,240 --> 00:26:26,160 Speaker 1: Anything like that would count as a violation for each day. 417 00:26:26,400 --> 00:26:29,359 Speaker 1: So a day ends up being a violation as opposed 418 00:26:29,400 --> 00:26:33,680 Speaker 1: to an instance. So it's interesting that New York City has 419 00:26:33,720 --> 00:26:37,399 Speaker 1: passed this. I hope to see similar laws passed in 420 00:26:37,480 --> 00:26:41,439 Speaker 1: other areas.
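[Editor's note: to make the per-day math concrete, here is a minimal sketch, assuming the figures as stated above (three hundred and seventy five dollars for the first violation, three hundred and fifty for the second, fifteen hundred for each one after that) and assuming each day of noncompliance counts as a single violation. The function name and the thirty-day scenario are illustrative, not drawn from the law's actual text.]

```python
# Hypothetical illustration of how per-day fines could accumulate under a
# bias audit law, using the figures as stated in this episode: $375 for
# the first violation, $350 for the second, $1,500 for every one after.
# Each day a company runs an unaudited (or audited-but-unremediated)
# hiring tool counts as one violation.

def total_fines(days_noncompliant: int) -> int:
    """Return the cumulative fine for a run of noncompliant days."""
    total = 0
    for day in range(1, days_noncompliant + 1):
        if day == 1:
            total += 375
        elif day == 2:
            total += 350
        else:
            total += 1500
    return total

# A 30-day stretch with an unaudited tool: 375 + 350 + 28 * 1500 = $42,725
print(total_fines(30))
```

[Small per day, but it compounds quickly, which appears to be the point of counting violations by the day rather than by the instance.]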
I think it is important to make certain 421 00:26:41,520 --> 00:26:46,040 Speaker 1: that as we start to lean heavier on AI, 422 00:26:46,119 --> 00:26:52,359 Speaker 1: we build in protections that can help prevent unintended consequences 423 00:26:52,400 --> 00:26:56,040 Speaker 1: from affecting people, because we've seen it happen time and 424 00:26:56,080 --> 00:27:00,760 Speaker 1: again already in other applications like facial recognition technology, for example. 425 00:27:01,080 --> 00:27:04,119 Speaker 1: So we want to make sure that those are not 426 00:27:04,800 --> 00:27:06,960 Speaker 1: present for these kinds of things. I mean, you 427 00:27:07,000 --> 00:27:11,679 Speaker 1: don't want to be out of work and then submitting 428 00:27:11,760 --> 00:27:15,040 Speaker 1: your resume and, you know, cover letter and all that 429 00:27:15,080 --> 00:27:17,600 Speaker 1: kind of stuff to a company and then find out 430 00:27:18,200 --> 00:27:21,680 Speaker 1: that you weren't even up for consideration because some automated 431 00:27:21,760 --> 00:27:25,639 Speaker 1: program had a bias in it that automatically dismissed you 432 00:27:25,760 --> 00:27:29,240 Speaker 1: from consideration. That would stink. You know, you might be 433 00:27:29,280 --> 00:27:31,440 Speaker 1: the perfect person for that job, it might be your 434 00:27:31,520 --> 00:27:34,480 Speaker 1: passion, and you never get a chance because some automated 435 00:27:34,520 --> 00:27:37,760 Speaker 1: piece of software had bias built into it. So we 436 00:27:37,920 --> 00:27:41,440 Speaker 1: definitely need laws like this, because I don't think there's 437 00:27:41,600 --> 00:27:45,440 Speaker 1: enough of a move within the companies that are making 438 00:27:45,480 --> 00:27:49,359 Speaker 1: this software to stop it at the source from the 439 00:27:49,440 --> 00:27:51,480 Speaker 1: very get-go. In fact, a lot of times they're 440 00:27:51,520 --> 00:27:53,480 Speaker 1: just not even aware of it. That's why you have 441 00:27:53,520 --> 00:27:56,080 Speaker 1: to have the third party auditor come in and really 442 00:27:56,160 --> 00:28:01,359 Speaker 1: check it out. Now, our penultimate story, second to last: 443 00:28:01,880 --> 00:28:05,480 Speaker 1: Stephen Winkelmann, the CEO, or Stephan, I guess I should 444 00:28:05,480 --> 00:28:10,440 Speaker 1: say, Stephan Winkelmann, the CEO of the famous car company Lamborghini, 445 00:28:11,440 --> 00:28:15,520 Speaker 1: has said that Lamborghini has now sold its last fully 446 00:28:15,920 --> 00:28:20,040 Speaker 1: gas-powered supercar. So it is the end of an era. 447 00:28:20,600 --> 00:28:25,440 Speaker 1: From this point forward, all vehicles sold by Lamborghini will 448 00:28:25,480 --> 00:28:28,880 Speaker 1: at the very least be hybrid cars, with all-electric 449 00:28:28,960 --> 00:28:32,840 Speaker 1: vehicles coming a few years down the road, so to speak. Now, 450 00:28:32,960 --> 00:28:37,320 Speaker 1: this does not mean Lamborghini has manufactured its last 451 00:28:37,359 --> 00:28:40,480 Speaker 1: gas-only vehicle, and that's because Lamborghini is a company that 452 00:28:40,520 --> 00:28:44,040 Speaker 1: takes pre-orders on cars and then builds them to order. 453 00:28:44,520 --> 00:28:47,520 Speaker 1: So this is like a luxury supercar experience. So while 454 00:28:47,600 --> 00:28:52,479 Speaker 1: Lamborghini has sold its final gas-drinking supercar, the 455 00:28:52,480 --> 00:28:55,800 Speaker 1: car hasn't been built yet.
That's still in the future. 456 00:28:56,400 --> 00:28:59,160 Speaker 1: It's still an interesting footnote in history that July twenty 457 00:28:59,280 --> 00:29:02,600 Speaker 1: twenty three was the month when Lamborghini sold its last 458 00:29:02,760 --> 00:29:10,280 Speaker 1: gas-only vehicle. It's a real mark of change. And finally, 459 00:29:10,720 --> 00:29:14,600 Speaker 1: after more than two months, NASA has re-established communications 460 00:29:14,600 --> 00:29:19,920 Speaker 1: with the Mars-based helicopter Ingenuity. This plucky little vehicle 461 00:29:20,000 --> 00:29:24,240 Speaker 1: relies upon the Mars rover Perseverance to provide the communication 462 00:29:24,400 --> 00:29:28,640 Speaker 1: link to NASA. But at the conclusion of its last flight, 463 00:29:28,880 --> 00:29:32,120 Speaker 1: which happened more than two months ago, Ingenuity plopped down 464 00:29:32,160 --> 00:29:36,680 Speaker 1: on Martian soil and a rocky hill blocked its line 465 00:29:36,720 --> 00:29:40,000 Speaker 1: of sight to Perseverance, and as such there was no 466 00:29:40,040 --> 00:29:42,720 Speaker 1: way for NASA to connect to the little helicopter, because 467 00:29:42,720 --> 00:29:46,160 Speaker 1: it needed that line of sight connection. Now Perseverance has 468 00:29:46,200 --> 00:29:49,000 Speaker 1: moved into a position that allows for communication, so NASA 469 00:29:49,040 --> 00:29:54,160 Speaker 1: can finally gather data from Ingenuity's fifty-second flight and 470 00:29:54,680 --> 00:29:57,560 Speaker 1: then check diagnostics to see if the helicopter is up 471 00:29:57,560 --> 00:30:00,959 Speaker 1: for another jaunt in a couple of weeks. Ingenuity has 472 00:30:01,000 --> 00:30:03,840 Speaker 1: been on Mars since early twenty twenty one. It made 473 00:30:03,840 --> 00:30:07,800 Speaker 1: its first flight on April nineteenth of twenty twenty one, 474 00:30:08,480 --> 00:30:10,959 Speaker 1: and originally the hope was to get five flights out 475 00:30:10,960 --> 00:30:14,240 Speaker 1: of the little thing. So this sucker has punched way 476 00:30:14,760 --> 00:30:17,840 Speaker 1: above its weight, having gone on more than fifty flights 477 00:30:17,880 --> 00:30:21,120 Speaker 1: at this point. Ingenuity has gathered images that have helped 478 00:30:21,200 --> 00:30:24,960 Speaker 1: NASA plot out the route for Perseverance, so that has 479 00:30:25,000 --> 00:30:29,080 Speaker 1: been Ingenuity's prime operational use: to kind of 480 00:30:29,120 --> 00:30:33,040 Speaker 1: gather intelligence so that engineers can figure out the best 481 00:30:33,080 --> 00:30:36,480 Speaker 1: way forward for Perseverance to be able to explore Mars's 482 00:30:36,520 --> 00:30:40,320 Speaker 1: surface while encountering as few obstacles as possible. So here's 483 00:30:40,360 --> 00:30:43,920 Speaker 1: hoping Ingenuity can keep taking to the skies for a 484 00:30:43,960 --> 00:30:47,880 Speaker 1: while longer yet. And that is it. That's the tech 485 00:30:47,960 --> 00:30:52,280 Speaker 1: news for Thursday, July sixth, twenty twenty three. I hope 486 00:30:52,360 --> 00:30:55,040 Speaker 1: you are all well and I'll talk to you again 487 00:30:55,840 --> 00:31:05,720 Speaker 1: really soon. Tech Stuff is an iHeartRadio production. For more 488 00:31:05,800 --> 00:31:10,520 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 489 00:31:10,560 --> 00:31:12,520 Speaker 1: wherever you listen to your favorite shows.