Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, August 9, 2022. First up: earlier today, or late yesterday, depending on where you were, Google had a bit of a hiccup as services briefly became unavailable around the world. Apparently a software update did not go as planned and took Google offline, and that prompted a lot of buzz online. The company restored services pretty quickly, but you know, the outage was enough to launch a billion jokes about how you're supposed to Google "why is Google down?" if Google is down. And I'm sure for people who go to Google to type out, you know, the full URL for a website inside the search field, that was quite distressing. Now, jokes aside, the outage, as brief as it was, actually served as another reminder of how intrinsic Google has become to the web experience for many of us out there.
Speaker 1: And that's kind of another way of saying that maybe we should be really careful when we're putting our eggs in various baskets. We might have put way too many in Google's. "Might have"? That's too weak. We have put too many eggs in Google's baskets. See also Meta. But also in Google news, the company is in a patent dispute with the speaker company Sonos. Now, these patent lawsuits are pretty common in the tech industry. I recently talked about how Apple and Ericsson are in a global struggle over patents, 5G patents in that case, and the two companies are trying to hash out a licensing agreement. You know, Apple doesn't want to pay so much, and Ericsson wants them to pay more. So that's going on. Well, in this case with Google and Sonos, Google is saying that Sonos has violated seven patents relating to Google's smart speaker technology, or smart assistant technology. At the heart of the matter is Sonos launching its own voice-activated smart assistant.
Speaker 1: You see, Google worked with Sonos on smart speakers to incorporate the Google Assistant feature into Sonos products. And I have to be careful how I say this stuff, because I actually have one of those speakers in this room, and if I say one of the activation phrases, then it'll start talking to me. Anyway, Google is essentially saying: we worked hard to incorporate our smart assistant with your products, and you paid very close attention to that, and then you copied our work to launch your own competing smart assistant product, and since we hold patents on that technology, we're going to sue you. Now, if for some reason you've been paying really close attention to patent disputes in tech for a couple of years, you might actually remember that in 2020 we saw kind of the flip of this. Sonos sued Google for a similar reason. Sonos was saying that Google had copied Sonos technology when Google was developing its own line of smart speakers. Oh, how the turns have tabled.
Speaker 1: Google lost that court case and had to change the design of its speakers in order to avoid facing potential bans around the world, or else, you know, license the technology from Sonos. A Sonos rep, by the way, has remained somewhat defiant in the face of this new lawsuit from Google, saying that Sonos has prevailed in every court case so far, but that's largely referring to the cases where Sonos had sued Google. In this case, Google is suing Sonos. The shoes, you see, are on other feet now, so we'll have to see how this plays out. Also, I know this is serious business, and we're talking about millions or billions of dollars at play here, but I can't help but get the feeling that these enormous corporations are essentially angry toddlers on a playground squabbling with one another when we cover these kinds of stories. In India, the government has suspended mobile internet service in two regions. Manipur, which is in the northeast of India and borders Myanmar, was scheduled to have internet service shut down for five days.
Speaker 1: The government was doing that because of the expectation of, quote, "tribal protests," end quote. The other region where we've seen internet service interrupted is Kashmir, which is in the north of India, very far away from Manipur. Kashmir borders Pakistan, Afghanistan, and China. Kashmir is also home to a population of Kashmiri Muslims. Now, that's relevant because of the observance of Muharram, which is, and by the way, my apologies for my pronunciation, I'm sure I'm butchering it, but that is a time of mourning. And the suspension of internet service coincides with a Muslim holy observance, which seems significant to me. Digital rights and human rights groups have long protested the Indian government's practice of shutting down communications. The government defends these decisions. The government says that this is a measure it takes in order to ensure public order, to quiet unrest, in other words. But the rights groups point out that these kinds of moves cut off entire populations from the rest of the world. And it's certainly the case that India is very quick to jump on the let's-shut-it-all-down course of action.
Speaker 1: And we've also seen the Indian government lay down a lot of pressure on companies to do stuff like crack down on the population's ability to communicate during times of unrest, while simultaneously allowing government officials unfettered access to platforms, even if those officials are actively spreading disinformation. So yeah, this is a complicated matter. Last week, Amazon announced its intention to purchase another company, and this one has privacy advocates very much concerned. The company would be iRobot. That's the company that brought us the Roomba. You know, the robot that sucks. I mean, it's a robot vacuum. Anyway, the deal is valued at one point seven billion dollars. And you might wonder why privacy advocates are worried in the first place. Well, it's because the Roomba, in an effort to do the best vacuum job it can, effectively maps out your home as it moves around. It figures out where the doors are, where the furniture is, where the walls are, all that kind of stuff.
Speaker 1: So, in other words, Amazon already wants to hear what you have to say by giving you, or convincing you to buy, their smart speakers. They want to know what you're watching using, like, an Amazon Fire device or Amazon Prime Video. And now they want to know where you put the ottoman in your living room by using your Roomba to spy on you. They've got a guy on the inside. Now, what would Amazon do with this kind of information? Well, Amazon is constantly making new products that are meant to interact on a smart level. And if the company knows how you've laid out your house, it can market more effectively to you, or sell that information to a third party that's interested in it. It can suggest stuff for you to upgrade your home, which in turn will gather even more data for Amazon. And it starts to look dystopian pretty darn quickly. Now, I should add that Amazon has simply announced the intention to acquire iRobot.
Speaker 1: The process still has to go through regulatory approval, and there's some expectation that the United States Federal Trade Commission, or FTC, is going to push back against this planned deal. The FTC has recently taken a more aggressive stance against big tech acquisitions. You know, recently the FTC indicated it's opposing Meta's planned purchase of the company behind Supernatural, which makes fitness games for virtual reality setups. GeekWire quotes a lawyer named Ethan Glass, who says he figures there's a chance that the FTC will conduct a quote-unquote deep investigation of the deal, and a twenty-five percent chance that the FTC will challenge the acquisition outright. As it stands, Amazon hopes to have this deal complete by August 4, 2023 at the latest, and has committed to paying iRobot more than ninety million dollars if regulators deny the acquisition. The Wall Street Journal has an article pointing out that an antitrust bill that's been taking shape in Congress is in danger of being abandoned. All right, so Congress is about to go into recess.
Speaker 1: So it's kind of like when you went to recess in elementary school, except instead of running outside to go play on the swings, Congress ambles outside and accepts numerous, numerous and enormous contributions from lobbying groups. Pardon my cynicism, but it's been a bad year. Anyway, it seems pretty clear that Congress will not speed up work on the antitrust bill before recessing. They are not going to hold a vote, and then Congress will reconvene later in the fall. But there's a limited number of days left for Congress to be in session, so there's this narrowing window of opportunity to get the antitrust bill in motion before it's kind of dead in the water. Now, the reason I'm even talking about an antitrust bill on this show is that, if passed, it would have massive consequences in the tech industry, particularly for the biggest companies in that industry. This is the law that would prevent tech companies from giving their own products and services preferential treatment over competitors.
Speaker 1: That would force Amazon to make some pretty big changes, because the company has frequently been accused of promoting its own products over others in search results. Big tech companies have been spending millions of dollars on ads trashing this bill, and so far the Senate has failed to schedule a vote on the matter, despite claims that there is bipartisan support for the measure. The tech companies have a list of warnings for anyone who supports the measure, claiming it will put the economy in danger, it will open up security vulnerabilities, and much more. Many analysts say that these warnings are at best weak arguments and at worst outright misinformation, but the bill's supporters have felt the need to respond to those arguments in an effort to just keep the whole thing alive. Personally, I hope this does get scheduled for the fall session and that we do see a vote, but I'm not betting on it.
Speaker 1: While there's been a growing awareness of the influence of big companies in general, and big tech companies in particular, it could be really hard to reverse all the momentum that's built up over the years. We've got more news stories to cover, but first, let's take this quick break.

Speaker 1: Last Friday, Meta introduced the BlenderBot 3 chatbot and asked users to have conversations with it. Now, the idea is that through conversations, BlenderBot 3 will learn how to interact with people more naturally, and will be able to employ, or at least appear to employ, stuff like empathy and understanding. And it will also very quickly start claiming that Trump is still president of the United States, and also make antisemitic remarks and promote conspiracy theories, because we can't have nice things. So within forty-eight hours of it being unleashed upon Facebook users, the bot began saying stuff like Mark Zuckerberg is not to be trusted and that he's creepy.
Speaker 1: Okay, so not everything it said is wrong. But it's kind of funny that Meta's own bot is saying things that are very critical of Meta slash Facebook slash Mark Zuckerberg. Journalists tested the chatbot, and they found that if they asked about Joe Biden, sometimes they were informed that Biden is a former vice president and that he ran for president, but that he ultimately lost. Sometimes. How about that, huh? Now, when Meta first announced the bot last week, a representative claimed that the team behind the bot had, quote, "conducted large-scale studies, co-organized workshops, and developed new techniques to create safeguards," end quote. But it's really a different kettle of fish when the chatbot actually goes live. And I have some sympathy for that team, because creating a tool that's designed to learn as the tool is being used brings along with it major risks. Even folks who don't actually believe conspiracy theories might mess with a bot just for funzies. You know, there's a type of juvenile glee that comes along with being sassy to a robot.
Speaker 1: So, yeah, it's a bot. It doesn't have feelings, right? You can't hurt its feelings. So you can act like a jerk to this bot and there are no consequences, right? Like, you're not actually hurting anyone, so you're not gonna feel guilty about it, or shame about what you've done. Except that the bot is learning, and these interactions will go on to inform future interactions. And that's where things can get really nasty really quickly. And I'm not sure what the solution is here, unless it's just to create bots that are not capable of learning through interactions, because there are always going to be people, some of them obviously horrible and some of them not, who will corrupt that kind of thing, maybe out of mischief, maybe out of malice. Here's the thing: it doesn't really matter what the motivation was, because the end result is similar in any case. So yeah, funny, funny ha ha. But it also really points out how kind of nasty we all are online, or at least how nasty we can be. Uh, yeah. So on the surface level, funny; as you dig down, ugly.
Speaker 1: Microsoft has issued a warning that Windows machines with the latest supported processors could be susceptible to data damage. This warning applies to machines running Windows 11 or Windows Server 2022, and again, it affects those that are running on the latest processors, and the data damage in this case is a risk of data loss. Now, this whole thing gets super technical, so I suggest those of you who have these kinds of machines, or perhaps oversee IT operations in a big organization, look into it further. Microsoft is urging users to install a Windows 11 or Windows Server 2022 preview update, which was released on June 23, or a security update that was released on July 12, in order to work around this problem. Here in the United States, the Senate recently voted in favor of the Inflation Reduction Act of 2022, which now goes to the House for a vote before it would then go to the President to be signed into law. The Act contains a lot of stuff meant to tackle inflation, but also other issues like climate change.
Speaker 1: To that end, one element in the bill allows for a tax credit for the purchase of new electric vehicles, but there are some parameters and restrictions involved, and as Ars Technica points out, one of those restrictions might actually mean that no one will be able to take advantage of that tax credit for a while. See, the credit has some usual stuff in it, like there's a limit on the price tag for new vehicles that are eligible for this tax cut, or this tax credit, I guess I should say. So if an electric vehicle sedan costs a certain amount or less, then it would qualify. But if, let's say, you're out there to buy an electric vehicle, but you want to buy the luxury one, that tax credit is not gonna apply, because the price tag is going to be too high. They're gonna say, like, well, yeah, you don't need the tax credit, you're buying a luxury vehicle. This is for people who are buying, you know, a sedan meant to get the family around, that kind of thing. Beyond that, the bill says that the tax credit will only apply if the batteries in the electric vehicle were mostly made in North America.
Speaker 1: That includes a requirement that the materials used to make the batteries were extracted and processed in North America, or in a country that has a free trade agreement with the United States. So that's where the hurdle is. Right now, China serves as the world's source for stuff like lithium, so it's possible that every electric vehicle available in the United States this year will not meet the criteria necessary to qualify for that tax credit. Now, on the other hand, the US government is pushing really hard to encourage more domestic production, and that's good for a lot of reasons. Places like China have horrendous human rights records, they have practiced mining and processing techniques that have caused massive environmental damage, and trade with China also represents a vulnerability in the event of a trade dispute, like we saw with the trade war under President Trump. Moving production to the United States, where there are stricter protections in place, both for employees and the environment, has clear benefits.
Speaker 1: It's an increased cost, you know, economically, but in all other aspects it's kind of a boost. It also improves national security, because you're no longer dependent upon a foreign source for that material. But we are not there yet. Like, we are nowhere close to being able to produce this stuff at scale. So citizens are potentially going to find it really frustrating that there's a law that theoretically would reward them for making the switch from internal combustion engine vehicles to electric ones, but technically they can't take advantage of it because of supply chain and production issues. Fun times. On the subject of electric vehicles, let's talk about Tesla for a second. The Department of Motor Vehicles in the state of California has a bone to pick with Tesla. The DMV says that Tesla has misrepresented features like Autopilot and Full Self-Driving mode.
Speaker 1: The DMV is saying Tesla's systems, which really fall under the category of automated driver assist features, come across in Tesla's, you know, wording as being more of a fully automated solution, and that the company has made what amounts to deceitful claims as to what the vehicles are actually capable of doing. The DMV has four complaints. Two of them are just the names, Autopilot and Full Self-Driving, and those of you who have been listening to my show for a while know I also take issue with those terms, because I feel that they set an expectation that's unrealistic and in some cases incredibly dangerous. Elon Musk, by the way, has dismissed this sort of concern. He says, essentially, that only an idiot takes a name as, you know, saying exactly what the thing does. And I guess, like, okay, but you're marketing that. Like, that's the whole marketing push. You're trying to send a concept and a message to people. You can't have it both ways. I mean, I get the whole Shakespeare "what's in a name? A rose by any other name would smell as sweet" thing. I understand. But I don't know.
If you have a system that's 324 00:19:26,920 --> 00:19:29,760 Speaker 1: not really autopilot and you call it Autopilot, I think 325 00:19:30,000 --> 00:19:33,040 Speaker 1: there's a complaint there. But that's me, 326 00:19:33,280 --> 00:19:36,400 Speaker 1: that's not a court. Maybe a court will find it differently. Anyway. 327 00:19:36,440 --> 00:19:38,800 Speaker 1: The DMV has two other complaints, not just 328 00:19:39,040 --> 00:19:42,560 Speaker 1: the names of these things, and those complaints have more 329 00:19:42,600 --> 00:19:46,119 Speaker 1: to do with how Tesla markets its Full Self-Driving 330 00:19:46,119 --> 00:19:49,440 Speaker 1: capability feature. Like, one of the two complaints is about 331 00:19:49,520 --> 00:19:53,280 Speaker 1: how there's a statement on Tesla's website that a driver 332 00:19:53,400 --> 00:19:56,360 Speaker 1: essentially has to quote get in and tell the car 333 00:19:56,440 --> 00:19:59,520 Speaker 1: where to go end quote. The DMV's claim 334 00:19:59,600 --> 00:20:03,200 Speaker 1: is that that's inaccurate, and it further represents a deceptive 335 00:20:03,240 --> 00:20:07,879 Speaker 1: practice under California law. Now, the DMV does acknowledge something 336 00:20:07,880 --> 00:20:09,919 Speaker 1: that I have also pointed out in the past. If 337 00:20:09,960 --> 00:20:14,880 Speaker 1: you bother to read Tesla's pages about these features, as 338 00:20:14,880 --> 00:20:18,160 Speaker 1: you read, you will find disclaimers that these features are 339 00:20:18,240 --> 00:20:21,679 Speaker 1: not fully autonomous systems, and that a driver's supervision is 340 00:20:21,680 --> 00:20:25,760 Speaker 1: always necessary. That language is in there.
However, the 341 00:20:25,840 --> 00:20:29,960 Speaker 1: DMV says the fact that Tesla includes those disclaimers 342 00:20:30,000 --> 00:20:33,639 Speaker 1: doesn't erase these other statements that seem to be to 343 00:20:33,720 --> 00:20:36,439 Speaker 1: the contrary. So instead, what the DMV is saying is 344 00:20:36,480 --> 00:20:40,760 Speaker 1: that Tesla is just contradicting itself, which creates confusion and 345 00:20:40,840 --> 00:20:46,919 Speaker 1: ultimately is a deceptive practice and potentially dangerous. So the DMV 346 00:20:47,040 --> 00:20:49,199 Speaker 1: is saying Tesla is trying to have its cake and 347 00:20:49,200 --> 00:20:52,240 Speaker 1: eat it too, right? It's trying to use wording that 348 00:20:52,359 --> 00:20:56,719 Speaker 1: in one way seems like really appealing marketing to customers 349 00:20:57,400 --> 00:21:00,760 Speaker 1: and then also include a disclaimer that says no, 350 00:21:00,960 --> 00:21:03,119 Speaker 1: but for reals, it's not what we just said it 351 00:21:03,280 --> 00:21:06,639 Speaker 1: was, in an effort to avoid accountability in the event 352 00:21:06,720 --> 00:21:11,639 Speaker 1: of something going wrong. We'll have to see how this develops. Okay, 353 00:21:11,640 --> 00:21:14,840 Speaker 1: we've got a couple more stories in just a moment, 354 00:21:14,880 --> 00:21:25,880 Speaker 1: but first let's take another quick break. Okay, we're gonna 355 00:21:25,880 --> 00:21:28,600 Speaker 1: close out with just a couple of quick stories. One 356 00:21:29,040 --> 00:21:33,720 Speaker 1: is to stick with vehicles again.
Automotive News reports 357 00:21:33,760 --> 00:21:37,560 Speaker 1: that more than a hundred eight thousand vehicles will be dropped 358 00:21:37,560 --> 00:21:41,680 Speaker 1: from production schedules around the world due to the ongoing 359 00:21:41,760 --> 00:21:45,600 Speaker 1: semiconductor chip shortage, and more than one hundred thousand of 360 00:21:45,640 --> 00:21:49,080 Speaker 1: those will be cut from manufacturing facilities here in North America. 361 00:21:49,640 --> 00:21:54,480 Speaker 1: We just don't have enough microchips to build modern cars, 362 00:21:54,560 --> 00:21:56,920 Speaker 1: or at least as many as we had planned. Your 363 00:21:56,960 --> 00:22:00,280 Speaker 1: typical modern vehicle can have a few thousand microchips 364 00:22:00,320 --> 00:22:03,240 Speaker 1: in it. I think the average is somewhere around three thousand, 365 00:22:03,600 --> 00:22:07,200 Speaker 1: and they control everything from your entertainment system to fuel 366 00:22:07,240 --> 00:22:11,680 Speaker 1: injection and beyond. While the computerization of vehicles has provided 367 00:22:11,680 --> 00:22:15,840 Speaker 1: a lot of new functionality and safety features and really 368 00:22:15,880 --> 00:22:20,640 Speaker 1: good stuff, the semiconductor shortage has highlighted just how dependent 369 00:22:20,720 --> 00:22:25,920 Speaker 1: we have become on microchip technology. Auto Forecast Solutions says 370 00:22:26,040 --> 00:22:29,560 Speaker 1: that already in North America, facilities have had to cut 371 00:22:29,600 --> 00:22:33,080 Speaker 1: more than one million vehicles from production this year, so 372 00:22:33,160 --> 00:22:36,639 Speaker 1: that one hundred thousand plus is for the rest of 373 00:22:36,640 --> 00:22:38,879 Speaker 1: the year. So, I mean, you know, what's another hundred 374 00:22:38,920 --> 00:22:41,399 Speaker 1: thousand if you've already had to cut a million?
But yeah, 375 00:22:41,600 --> 00:22:45,159 Speaker 1: this continues to be an enormous challenge for the automotive 376 00:22:45,160 --> 00:22:49,160 Speaker 1: industry and for customers. As for when we will actually 377 00:22:49,200 --> 00:22:54,160 Speaker 1: emerge from this semiconductor shortage crisis, that is hard to say. 378 00:22:54,600 --> 00:22:56,640 Speaker 1: There have been some analysts who thought that we would 379 00:22:56,640 --> 00:22:58,800 Speaker 1: be out of it already, that we would be well 380 00:22:58,880 --> 00:23:03,840 Speaker 1: past it. Others say that it will happen sometime next year. Uh, 381 00:23:03,880 --> 00:23:06,080 Speaker 1: and there are quite a few that say they don't 382 00:23:06,080 --> 00:23:10,800 Speaker 1: expect it to improve until or later. In the meantime, 383 00:23:11,080 --> 00:23:14,840 Speaker 1: expect to see more shortages at dealerships and high prices 384 00:23:14,880 --> 00:23:18,119 Speaker 1: for the limited supply of vehicles that are already out there. Um, 385 00:23:18,400 --> 00:23:20,159 Speaker 1: I don't think we're gonna be able to see a 386 00:23:20,200 --> 00:23:24,919 Speaker 1: sustainable market for used vehicles right now. You can occasionally 387 00:23:25,000 --> 00:23:28,960 Speaker 1: find used vehicles marked at a price that's higher than 388 00:23:29,000 --> 00:23:31,719 Speaker 1: they were when you drove them off the lot. That 389 00:23:31,840 --> 00:23:35,400 Speaker 1: never happens.
It's crazy because, you know, I think everyone 390 00:23:35,440 --> 00:23:37,360 Speaker 1: knows that when you get into a brand new car 391 00:23:37,480 --> 00:23:40,840 Speaker 1: and you drive it off the lot, the value depreciates 392 00:23:40,880 --> 00:23:45,359 Speaker 1: by an enormous, enormous percent. Like, it's ridiculous how many 393 00:23:45,560 --> 00:23:48,439 Speaker 1: thousands of dollars are marked off the value of the 394 00:23:48,480 --> 00:23:52,080 Speaker 1: car just because you've owned it for, you know, a second. 395 00:23:52,680 --> 00:23:55,840 Speaker 1: Now we're seeing that change quite a bit because of 396 00:23:55,880 --> 00:23:59,399 Speaker 1: the semiconductor shortage, where you can have a used vehicle 397 00:24:00,000 --> 00:24:02,439 Speaker 1: worth more than what it was when you purchased it. 398 00:24:02,800 --> 00:24:05,280 Speaker 1: Not in every case, but it has happened, and it's 399 00:24:05,359 --> 00:24:08,840 Speaker 1: kind of crazy. Finally, we've talked quite a bit this 400 00:24:08,920 --> 00:24:12,280 Speaker 1: year about Netflix and the company's attempts to find ways 401 00:24:12,320 --> 00:24:14,840 Speaker 1: to grow in the wake of subscriber losses, or at 402 00:24:14,920 --> 00:24:19,840 Speaker 1: least to stem subscriber losses. We've seen subscribers leave the 403 00:24:20,400 --> 00:24:22,600 Speaker 1: uh, the service for the first time, like more people 404 00:24:22,680 --> 00:24:25,760 Speaker 1: leaving than new people coming in, and it has really 405 00:24:25,920 --> 00:24:30,240 Speaker 1: rocked the company. Well, one area that Netflix has experimented 406 00:24:30,320 --> 00:24:33,399 Speaker 1: with is with games. And I'm sure a lot of 407 00:24:33,400 --> 00:24:35,679 Speaker 1: you out there are aware of this, but 408 00:24:35,760 --> 00:24:38,159 Speaker 1: some of you might not be.
If you are a 409 00:24:38,200 --> 00:24:41,720 Speaker 1: Netflix subscriber and you have an iOS or Android device 410 00:24:41,800 --> 00:24:44,480 Speaker 1: that has the Netflix app on it, you have access 411 00:24:44,520 --> 00:24:47,960 Speaker 1: to twenty four games right now. There's another twenty six 412 00:24:48,040 --> 00:24:52,360 Speaker 1: planned to be added to the service, and you can 413 00:24:52,359 --> 00:24:54,600 Speaker 1: play them. If you're a subscriber to Netflix, you can 414 00:24:54,680 --> 00:24:57,440 Speaker 1: use your phone and access these games, and they're 415 00:24:57,480 --> 00:25:02,000 Speaker 1: supposed to at least be premium games. But according to Apptopia, 416 00:25:02,280 --> 00:25:05,920 Speaker 1: less than one percent of all Netflix subscribers have actually 417 00:25:05,960 --> 00:25:10,240 Speaker 1: done that. So if the company was hoping that games 418 00:25:10,240 --> 00:25:13,960 Speaker 1: would convince users to stick with Netflix, it looks like 419 00:25:14,000 --> 00:25:16,400 Speaker 1: there's some bad news out there, at least as far 420 00:25:16,440 --> 00:25:20,479 Speaker 1: as the games' ability to retain customers is concerned. But hey, 421 00:25:20,520 --> 00:25:24,200 Speaker 1: if you weren't aware that Netflix has games, now you are. 422 00:25:24,359 --> 00:25:26,520 Speaker 1: Maybe you can open up your Netflix app on your 423 00:25:26,520 --> 00:25:29,800 Speaker 1: iOS or Android device and check it out and see 424 00:25:29,840 --> 00:25:31,640 Speaker 1: if maybe there's a game there that you'd like to play. 425 00:25:31,720 --> 00:25:36,480 Speaker 1: Who knows. Uh, I haven't done that. I'm a Netflix subscriber, 426 00:25:36,560 --> 00:25:40,720 Speaker 1: so maybe after this episode I'll, out of curiosity, check 427 00:25:40,760 --> 00:25:42,440 Speaker 1: it out.
Maybe there'll be a game that will finally 428 00:25:42,480 --> 00:25:44,560 Speaker 1: pull me away from Marvel Puzzle Quest, because that's the 429 00:25:44,600 --> 00:25:48,480 Speaker 1: only game I play on my phone. Um, got to 430 00:25:48,560 --> 00:25:53,040 Speaker 1: catch them all. All right, that's it for this episode, 431 00:25:53,320 --> 00:25:57,200 Speaker 1: for the tech news for Tuesday, August nine, twenty twenty-two. Hope you are 432 00:25:57,280 --> 00:25:59,480 Speaker 1: all well. If you would like to get in touch 433 00:25:59,560 --> 00:26:02,120 Speaker 1: with me about a topic you'd like me to cover 434 00:26:02,119 --> 00:26:03,919 Speaker 1: in a future episode of tech Stuff, you can do 435 00:26:03,960 --> 00:26:05,600 Speaker 1: that in one of a couple of ways. One is 436 00:26:05,640 --> 00:26:08,720 Speaker 1: to download the I Heart Radio app. It's free to download. 437 00:26:09,160 --> 00:26:11,920 Speaker 1: You can navigate over to tech Stuff. There's a little microphone 438 00:26:12,080 --> 00:26:14,000 Speaker 1: icon; you click on that and you can leave a voice 439 00:26:14,040 --> 00:26:16,760 Speaker 1: message up to thirty seconds in length. The other way 440 00:26:17,000 --> 00:26:20,080 Speaker 1: is to send me a message on Twitter. The handle 441 00:26:20,160 --> 00:26:23,679 Speaker 1: for the show is tech Stuff H S W, and 442 00:26:23,720 --> 00:26:32,760 Speaker 1: I'll talk to you again really soon. Tech Stuff is 443 00:26:32,760 --> 00:26:35,960 Speaker 1: an I Heart Radio production. For more podcasts from I 444 00:26:36,040 --> 00:26:39,640 Speaker 1: Heart Radio, visit the I Heart Radio app, Apple Podcasts, 445 00:26:39,760 --> 00:26:41,760 Speaker 1: or wherever you listen to your favorite shows.