1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,880 --> 00:00:14,880 Speaker 1: Hey there, and welcome to TechStuff. I'm your host 3 00:00:15,040 --> 00:00:18,680 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio. And 4 00:00:18,720 --> 00:00:22,240 Speaker 1: how the tech are you? It's time for the tech 5 00:00:22,320 --> 00:00:27,960 Speaker 1: news for Tuesday, January two thousand twenty three. As I 6 00:00:28,000 --> 00:00:32,080 Speaker 1: mentioned last week, CES recently happened. That's the 7 00:00:32,240 --> 00:00:36,519 Speaker 1: big consumer tech trade show that takes place in Las Vegas, Nevada. 8 00:00:37,120 --> 00:00:39,760 Speaker 1: And while I did not go this year, I did 9 00:00:39,840 --> 00:00:43,000 Speaker 1: read up on some of the stuff shown off and 10 00:00:43,080 --> 00:00:46,200 Speaker 1: thought I'd give just a quick glimpse at a few 11 00:00:46,240 --> 00:00:50,360 Speaker 1: things that caught my eye. And first up is a 12 00:00:50,600 --> 00:00:53,960 Speaker 1: ninety-seven-inch television. Of course it caught my 13 00:00:54,040 --> 00:00:56,840 Speaker 1: eye, because it blocks everything else that you could be 14 00:00:56,920 --> 00:01:01,200 Speaker 1: looking at. It is enormous. The TV comes from 15 00:01:01,280 --> 00:01:05,040 Speaker 1: LG and it's an OLED television, a ninety-seven-inch 16 00:01:06,000 --> 00:01:10,120 Speaker 1: OLED TV. It also comes in a wireless version 17 00:01:10,200 --> 00:01:13,000 Speaker 1: where you can connect the television to your various components 18 00:01:13,040 --> 00:01:16,360 Speaker 1: without, you know, having wires go everywhere.
Now, I have 19 00:01:16,440 --> 00:01:19,720 Speaker 1: no idea how much that wireless version will cost if 20 00:01:19,760 --> 00:01:24,120 Speaker 1: it does actually become a true consumer product as opposed 21 00:01:24,160 --> 00:01:27,840 Speaker 1: to, you know, kind of a concept or, you know, 22 00:01:27,840 --> 00:01:32,280 Speaker 1: a prototype or something. But the wired version of the TV 23 00:01:33,000 --> 00:01:37,560 Speaker 1: is said to come in at around twenty-five thousand dollars. 24 00:01:37,600 --> 00:01:41,399 Speaker 1: It's outside of my price range for sure. Still, I 25 00:01:41,440 --> 00:01:44,600 Speaker 1: can actually remember going to CES back when 26 00:01:44,600 --> 00:01:48,200 Speaker 1: the largest OLED screens you'd see were like eight inches, 27 00:01:48,480 --> 00:01:50,600 Speaker 1: when they were just kind of showing off what 28 00:01:50,720 --> 00:01:54,080 Speaker 1: OLED could do. Like there was no such thing as 29 00:01:54,120 --> 00:01:55,760 Speaker 1: an OLED TV at that point. They had 30 00:01:55,800 --> 00:02:00,280 Speaker 1: little displays, sometimes like bendable displays, but they were tiny. 31 00:02:00,680 --> 00:02:03,960 Speaker 1: So this is really an enormous leap from where the 32 00:02:04,000 --> 00:02:07,600 Speaker 1: technology was when I first started going to CES. 33 00:02:07,600 --> 00:02:11,359 Speaker 1: Samsung, meanwhile, showed off a seventy-seven-inch 34 00:02:11,960 --> 00:02:15,600 Speaker 1: QD-OLED television, so there's no shortage 35 00:02:15,639 --> 00:02:19,520 Speaker 1: of large TVs boasting great color representation and resolution. So again, 36 00:02:20,360 --> 00:02:24,280 Speaker 1: just really phenomenal when you've been going long 37 00:02:24,360 --> 00:02:26,600 Speaker 1: enough so that you saw where the technology was when 38 00:02:26,600 --> 00:02:29,919 Speaker 1: you started, versus where it is now.
Like I said before, 39 00:02:30,800 --> 00:02:33,720 Speaker 1: you would never even see a television with an OLED screen, 40 00:02:34,000 --> 00:02:36,799 Speaker 1: and now you're looking at, you know, displays that 41 00:02:36,840 --> 00:02:39,920 Speaker 1: are creeping up to the hundred-inch size, which is 42 00:02:39,960 --> 00:02:45,520 Speaker 1: just unfathomable to me. Yeah, pretty exciting stuff. AR 43 00:02:45,520 --> 00:02:49,280 Speaker 1: and VR also had some entries at CES twenty twenty three. 44 00:02:49,480 --> 00:02:53,400 Speaker 1: Lumus, L-U-M-U-S, showed off their latest 45 00:02:53,400 --> 00:02:56,680 Speaker 1: design for AR glasses, and they look kind of 46 00:02:57,000 --> 00:03:00,519 Speaker 1: like black-frame eyeglasses, actually a lot like them. 47 00:03:00,960 --> 00:03:03,400 Speaker 1: Wired reports that they felt like they were a little 48 00:03:03,400 --> 00:03:06,720 Speaker 1: bit larger than your typical glasses are, but they otherwise 49 00:03:06,760 --> 00:03:10,160 Speaker 1: look pretty darn normal. Now, Lumus is in the business 50 00:03:10,240 --> 00:03:15,280 Speaker 1: of supplying smart display technology to other manufacturers to incorporate 51 00:03:15,280 --> 00:03:18,639 Speaker 1: into their products, so it's not like we're gonna see 52 00:03:18,639 --> 00:03:22,760 Speaker 1: a Lumus-branded set of AR glasses anytime soon. Instead, 53 00:03:23,160 --> 00:03:25,920 Speaker 1: you might end up with a Lumus display built into 54 00:03:26,000 --> 00:03:29,040 Speaker 1: some other product, whether it's a smart window or 55 00:03:29,080 --> 00:03:34,600 Speaker 1: a windshield or something along those lines. Meanwhile, HTC introduced 56 00:03:34,600 --> 00:03:39,560 Speaker 1: the Vive XR Elite VR headset. It's got a 57 00:03:39,600 --> 00:03:43,840 Speaker 1: sleeker form factor, looks way less chunky 58 00:03:43,880 --> 00:03:49,320 Speaker 1: than earlier VR headsets.
It's also reportedly less than half 59 00:03:49,480 --> 00:03:54,200 Speaker 1: the weight of the Meta Quest Pro headset, so it's 60 00:03:54,240 --> 00:03:57,440 Speaker 1: pretty lightweight. And keep in mind the Meta Quest Pro 61 00:03:58,200 --> 00:04:02,800 Speaker 1: is Meta slash Facebook's attempt to create a headset that 62 00:04:02,880 --> 00:04:07,920 Speaker 1: professionals would use in business settings, and if 63 00:04:07,920 --> 00:04:09,840 Speaker 1: you have a headset like that, then you want it 64 00:04:09,840 --> 00:04:12,760 Speaker 1: to be lightweight, because if you're wearing it a lot, 65 00:04:13,000 --> 00:04:15,880 Speaker 1: like if you're working a normal workday, then you're 66 00:04:15,880 --> 00:04:18,240 Speaker 1: gonna be using your headset a lot. You definitely don't 67 00:04:18,240 --> 00:04:19,800 Speaker 1: want it to be too heavy. So it sounds like 68 00:04:19,839 --> 00:04:23,359 Speaker 1: this one is pretty, pretty sleek in that regard, and 69 00:04:23,400 --> 00:04:26,240 Speaker 1: it sounded like it's a fairly impressive piece of technology, 70 00:04:26,680 --> 00:04:29,800 Speaker 1: and it better be, because it sure ain't cheap. It'll 71 00:04:29,839 --> 00:04:32,880 Speaker 1: be coming to market in late February for the princely 72 00:04:33,040 --> 00:04:38,680 Speaker 1: sum of one thousand dollars. So that's real expensive. You know, 73 00:04:38,760 --> 00:04:43,200 Speaker 1: it's like the price of an entry-level or 74 00:04:43,240 --> 00:04:46,119 Speaker 1: maybe you could say like a mid-level gaming PC.
75 00:04:46,600 --> 00:04:48,480 Speaker 1: It's nowhere close to what you would pay if you're 76 00:04:48,480 --> 00:04:50,520 Speaker 1: wanting to go top of the line, like you're 77 00:04:50,520 --> 00:04:53,479 Speaker 1: getting into the multiple thousands of dollars then. But a 78 00:04:53,520 --> 00:04:57,440 Speaker 1: thousand bucks? That's a lot to spend on what most 79 00:04:57,440 --> 00:05:00,479 Speaker 1: people look at as a peripheral as opposed to, like, 80 00:05:00,520 --> 00:05:04,560 Speaker 1: a computer. So yeah, I don't know if this is 81 00:05:04,600 --> 00:05:09,200 Speaker 1: going to help usher in a wider acceptance of VR. 82 00:05:09,760 --> 00:05:13,360 Speaker 1: It might be really impressive and work really well, but 83 00:05:13,440 --> 00:05:15,359 Speaker 1: at that price point, I think it's still going to 84 00:05:15,400 --> 00:05:18,599 Speaker 1: prevent a lot of people from jumping into the field, 85 00:05:19,279 --> 00:05:22,640 Speaker 1: but I could be wrong. I've been skeptical about VR 86 00:05:22,760 --> 00:05:25,800 Speaker 1: becoming a big thing. Like, I've always thought that it's 87 00:05:25,800 --> 00:05:29,080 Speaker 1: more of a niche technology, that it has its place, 88 00:05:29,560 --> 00:05:32,279 Speaker 1: and that you can create really incredible experiences for VR. 89 00:05:32,560 --> 00:05:35,440 Speaker 1: But things like the cost, as well as 90 00:05:36,279 --> 00:05:39,520 Speaker 1: the fact that some people just get motion sick while 91 00:05:39,520 --> 00:05:43,279 Speaker 1: they're using the technology, mean that you're probably not gonna 92 00:05:43,320 --> 00:05:48,280 Speaker 1: see it become the mainstream technology that, say, replaces computers 93 00:05:48,360 --> 00:05:52,080 Speaker 1: or becomes the next revolution in computing after the mobile revolution. 94 00:05:53,000 --> 00:05:55,840 Speaker 1: Every year, it's CES.
There are always gadgets 95 00:05:55,839 --> 00:05:59,320 Speaker 1: that get attention because they're different from everything else. They're 96 00:05:59,360 --> 00:06:04,159 Speaker 1: not necessarily better or even useful, but they're different, and 97 00:06:04,200 --> 00:06:07,159 Speaker 1: that gets a lot of attention, because people, frankly, 98 00:06:07,360 --> 00:06:13,800 Speaker 1: reporters, get tired of covering televisions and cars and audio systems. 99 00:06:13,880 --> 00:06:17,960 Speaker 1: Like, there's certain categories at CES that are always represented, 100 00:06:18,200 --> 00:06:21,120 Speaker 1: and even if you see something that's really good, it's 101 00:06:21,160 --> 00:06:23,800 Speaker 1: like a jump ahead of what the previous models have been, 102 00:06:24,440 --> 00:06:26,159 Speaker 1: the fact that you've been covering that for years just 103 00:06:26,200 --> 00:06:27,800 Speaker 1: means that you're kind of tired of it, so when 104 00:06:27,800 --> 00:06:31,640 Speaker 1: something different comes along, people notice. I'm actually reminded of 105 00:06:31,680 --> 00:06:35,200 Speaker 1: the haptic fork from several years ago. This was a 106 00:06:35,240 --> 00:06:38,440 Speaker 1: fork that had a motor built into the handle, and 107 00:06:38,480 --> 00:06:42,080 Speaker 1: the motor would cause the fork to vibrate and 108 00:06:42,160 --> 00:06:44,919 Speaker 1: make food fall off the fork if it detected you 109 00:06:44,960 --> 00:06:47,480 Speaker 1: were trying to eat too quickly. It was a way 110 00:06:47,520 --> 00:06:50,320 Speaker 1: to try and force you to slow down your eating. Well, 111 00:06:50,320 --> 00:06:52,880 Speaker 1: this year, one thing that I saw a lot of 112 00:06:52,960 --> 00:06:56,400 Speaker 1: coverage on at CES was the Withings 113 00:06:56,760 --> 00:07:00,280 Speaker 1: U-Scan. Now, this is a device.
It's a little 114 00:07:00,320 --> 00:07:04,840 Speaker 1: cartridge thing that fits inside your toilet and it analyzes 115 00:07:05,120 --> 00:07:07,720 Speaker 1: your pee. So the idea is that this thing can 116 00:07:07,720 --> 00:07:11,200 Speaker 1: analyze urine and then report back via an app on 117 00:07:11,320 --> 00:07:15,080 Speaker 1: things like whether or not you're properly hydrated, or maybe 118 00:07:15,120 --> 00:07:19,360 Speaker 1: you're lacking some specific nutrients, or maybe you're using it 119 00:07:19,400 --> 00:07:22,000 Speaker 1: to track your hormone cycle because you and your partner 120 00:07:22,080 --> 00:07:24,400 Speaker 1: want to have a baby and this can help track 121 00:07:24,480 --> 00:07:28,160 Speaker 1: cycles for that kind of thing. There's no 122 00:07:28,200 --> 00:07:30,720 Speaker 1: one-cartridge-fits-all solution here. It's not 123 00:07:30,760 --> 00:07:34,000 Speaker 1: like it's making the claims of Theranos, where one thing can 124 00:07:34,040 --> 00:07:37,360 Speaker 1: cover all eventualities. So instead you have to choose the 125 00:07:37,400 --> 00:07:41,720 Speaker 1: cartridge that applies to whatever it is you want to measure, 126 00:07:42,320 --> 00:07:45,360 Speaker 1: and then on top of that, you have to subscribe 127 00:07:45,560 --> 00:07:48,600 Speaker 1: to the service. So there's a subscription fee on top 128 00:07:48,600 --> 00:07:52,560 Speaker 1: of the cartridge fee to use the service. The cartridges 129 00:07:52,560 --> 00:07:55,080 Speaker 1: are going to cost around six hundred dollars a pop. 130 00:07:55,680 --> 00:07:58,000 Speaker 1: I'm not sure what the monthly subscription fee is going 131 00:07:58,040 --> 00:07:59,800 Speaker 1: to be.
I will say that while I saw a 132 00:07:59,800 --> 00:08:02,360 Speaker 1: lot of outlets report on this technology, it actually surprised 133 00:08:02,400 --> 00:08:05,520 Speaker 1: me that not nearly as many pointed out the concerns 134 00:08:05,520 --> 00:08:10,000 Speaker 1: that Cindy Cohn of the Electronic Frontier Foundation raised, namely 135 00:08:10,040 --> 00:08:14,200 Speaker 1: that anytime you're talking about medical information and you're talking 136 00:08:14,200 --> 00:08:17,440 Speaker 1: about an app ecosystem, you need to be concerned about 137 00:08:17,520 --> 00:08:21,480 Speaker 1: data security and privacy. So Cohn asks a really good question: 138 00:08:22,080 --> 00:08:26,040 Speaker 1: is there anything being done with that data beyond the analysis? 139 00:08:26,560 --> 00:08:30,680 Speaker 1: Could the Withings company leverage that data for other purposes? 140 00:08:30,680 --> 00:08:34,280 Speaker 1: Could it possibly sell that information to other companies? And 141 00:08:34,320 --> 00:08:37,360 Speaker 1: in the wake of a controversial Supreme Court decision here 142 00:08:37,400 --> 00:08:40,160 Speaker 1: in the US last year, could it mean that state 143 00:08:40,240 --> 00:08:44,199 Speaker 1: governments that restrict women's access to things like abortion use 144 00:08:44,400 --> 00:08:48,920 Speaker 1: data collected by Withings to kind of monitor women? 145 00:08:50,000 --> 00:08:53,240 Speaker 1: It becomes very invasive, very Handmaid's Tale, when you start getting 146 00:08:53,240 --> 00:08:56,040 Speaker 1: into this. Now, for their part, Withings later responded 147 00:08:56,080 --> 00:08:58,319 Speaker 1: to this criticism, saying that the product and service will 148 00:08:58,320 --> 00:09:02,640 Speaker 1: be subject to company data privacy policies, so, uh, 149 00:09:02,800 --> 00:09:06,480 Speaker 1: big relief there.
And last week I mentioned there would 150 00:09:06,520 --> 00:09:10,240 Speaker 1: likely be a few flying cars on display at CES. 151 00:09:11,200 --> 00:09:13,800 Speaker 1: Typically these take the form of what looks like just 152 00:09:13,880 --> 00:09:18,079 Speaker 1: a very large drone with four or more rotors, so 153 00:09:18,240 --> 00:09:22,520 Speaker 1: like your typical quadcopter design. Quadcopter only refers 154 00:09:22,559 --> 00:09:26,680 Speaker 1: to ones that have four rotors, but you know, big enough 155 00:09:26,720 --> 00:09:29,200 Speaker 1: so that you could fit one or two people into 156 00:09:29,280 --> 00:09:31,480 Speaker 1: like a cockpit that's in the center of these things. 157 00:09:31,920 --> 00:09:34,720 Speaker 1: That's what most flying cars look like. But one thing 158 00:09:35,120 --> 00:09:39,240 Speaker 1: I didn't anticipate was a quote unquote flying boat that's 159 00:09:39,280 --> 00:09:42,440 Speaker 1: not really flying, but that's kind of what CNET 160 00:09:42,480 --> 00:09:46,160 Speaker 1: called it. It is the Candela C-8. It's an 161 00:09:46,160 --> 00:09:49,520 Speaker 1: electric speedboat, so it is an electric vehicle, and it 162 00:09:49,600 --> 00:09:53,760 Speaker 1: uses hydrofoils. So these are like wing-like structures that 163 00:09:54,280 --> 00:09:57,000 Speaker 1: submerge under the water while the rest of the boat 164 00:09:57,080 --> 00:09:59,640 Speaker 1: lifts up off the surface of the sea 165 00:09:59,720 --> 00:10:02,319 Speaker 1: as you start to move up to speed, and the 166 00:10:02,400 --> 00:10:06,720 Speaker 1: hydrofoils act like wings, but instead of the fluid being air, 167 00:10:06,880 --> 00:10:09,839 Speaker 1: the fluid is actually the water of the sea, and 168 00:10:10,160 --> 00:10:12,680 Speaker 1: this reduces the amount of surface area that actually makes 169 00:10:12,679 --> 00:10:14,640 Speaker 1: contact with the water.
It means that the boat can 170 00:10:14,679 --> 00:10:18,120 Speaker 1: go pretty darn fast, like around thirty-four miles per hour. 171 00:10:18,240 --> 00:10:21,360 Speaker 1: That's really a good clip for a boat. But the 172 00:10:21,440 --> 00:10:24,000 Speaker 1: C-8 also has some other tricks up its sleeve, 173 00:10:24,000 --> 00:10:28,200 Speaker 1: because hydrofoils, they're cool, but they're not new. Uh, some 174 00:10:28,280 --> 00:10:30,400 Speaker 1: of the other things that the C-8 has include 175 00:10:30,440 --> 00:10:34,319 Speaker 1: a self-piloting feature, sort of like Tesla's Autopilot, 176 00:10:34,720 --> 00:10:36,880 Speaker 1: that's meant to keep the boat on a set course. 177 00:10:37,000 --> 00:10:39,559 Speaker 1: So let's say that you are, you know, going from 178 00:10:39,679 --> 00:10:42,440 Speaker 1: one island to another. It's gonna take about three hours. 179 00:10:43,200 --> 00:10:46,360 Speaker 1: You set the course and the boat keeps you on 180 00:10:46,400 --> 00:10:49,280 Speaker 1: that, and it detects when you start drifting due to 181 00:10:49,360 --> 00:10:51,880 Speaker 1: other things like wind or current, that kind of stuff, 182 00:10:52,280 --> 00:10:55,560 Speaker 1: and can keep you on the right path. It also 183 00:10:55,600 --> 00:10:57,360 Speaker 1: has tons of sensors so that the boat can keep 184 00:10:57,400 --> 00:10:59,959 Speaker 1: itself stable and balanced, which is important when you're raised 185 00:11:00,040 --> 00:11:03,240 Speaker 1: up on hydrofoils. And it will only set you back 186 00:11:03,640 --> 00:11:08,520 Speaker 1: three hundred ninety thousand dollars.
Now, of course, there were 187 00:11:08,520 --> 00:11:10,640 Speaker 1: a lot of other items on display at 188 00:11:10,760 --> 00:11:14,120 Speaker 1: CES twenty twenty three. But since I didn't go and can only 189 00:11:14,240 --> 00:11:17,800 Speaker 1: form opinions based on coverage, I'm gonna just limit it to 190 00:11:17,840 --> 00:11:20,600 Speaker 1: those things that I thought were kind of neat. Honestly, there's 191 00:11:20,640 --> 00:11:22,280 Speaker 1: so much more for me to read up on. 192 00:11:22,760 --> 00:11:24,920 Speaker 1: I'm still unaware of a lot of stuff that was 193 00:11:24,960 --> 00:11:27,880 Speaker 1: at CES. So I'm going to segue into 194 00:11:27,960 --> 00:11:41,040 Speaker 1: some other news. But first, let's take a quick break. Okay, 195 00:11:41,080 --> 00:11:44,920 Speaker 1: we're back. Over in China, some recent Tesla customers are 196 00:11:45,000 --> 00:11:51,520 Speaker 1: really, really unhappy. Why? Well, recently, Tesla marked down the 197 00:11:51,600 --> 00:11:55,000 Speaker 1: prices of various vehicles it was selling in China by 198 00:11:55,040 --> 00:11:59,120 Speaker 1: several thousand dollars. So the amount of markdown really depends 199 00:11:59,160 --> 00:12:02,520 Speaker 1: upon the specific model of car, but all of them 200 00:12:02,600 --> 00:12:06,520 Speaker 1: got big price cuts. However, that means that the folks 201 00:12:06,559 --> 00:12:09,840 Speaker 1: who purchased a Tesla in China before the company made 202 00:12:09,840 --> 00:12:14,439 Speaker 1: this markdown had paid an earlier, higher price for their cars. 203 00:12:14,920 --> 00:12:18,160 Speaker 1: Now they're upset that if they had just waited a 204 00:12:18,200 --> 00:12:21,960 Speaker 1: little bit and then purchased their vehicle after the cuts, 205 00:12:21,960 --> 00:12:25,760 Speaker 1: they could have done so for a decent discount. And yeah, 206 00:12:25,800 --> 00:12:28,720 Speaker 1: you know, this kind of buyer's remorse is pretty common.
207 00:12:28,800 --> 00:12:31,520 Speaker 1: I know that I have personally kicked myself for buying 208 00:12:31,559 --> 00:12:35,360 Speaker 1: something that later went on sale, like not long after 209 00:12:35,400 --> 00:12:38,200 Speaker 1: I bought it, and I think, gosh, if only I 210 00:12:38,240 --> 00:12:40,920 Speaker 1: had waited, I could have saved, you know, however much money. 211 00:12:40,920 --> 00:12:44,400 Speaker 1: But hindsight, right? You had no way of knowing necessarily 212 00:12:44,600 --> 00:12:47,640 Speaker 1: at the time when you make a purchase that a 213 00:12:47,679 --> 00:12:50,120 Speaker 1: week or a month later, it's going to be significantly 214 00:12:50,160 --> 00:12:53,960 Speaker 1: cheaper. Unless you have like insider information, you just don't know. 215 00:12:54,679 --> 00:12:58,200 Speaker 1: But for some customers in Chengdu, China, things went beyond 216 00:12:58,240 --> 00:13:03,760 Speaker 1: buyer's remorse, because customers reportedly stormed the Tesla dealership. They 217 00:13:03,800 --> 00:13:07,320 Speaker 1: got inside, they vandalized the dealership. They damaged one of 218 00:13:07,320 --> 00:13:10,120 Speaker 1: the electric vehicles on display and a TV that was 219 00:13:10,160 --> 00:13:14,440 Speaker 1: on display. They stole some stuff, allegedly. They also 220 00:13:14,520 --> 00:13:17,600 Speaker 1: left behind a list of demands, which includes stuff like 221 00:13:17,760 --> 00:13:21,240 Speaker 1: a free lifetime subscription to the Full Self-Driving feature 222 00:13:21,320 --> 00:13:26,400 Speaker 1: that Tesla offers, plus like ten million Tesla points. I'm 223 00:13:26,559 --> 00:13:29,800 Speaker 1: unfamiliar with Tesla points, so I need to look into 224 00:13:29,840 --> 00:13:32,120 Speaker 1: that to see. I'm sure it's a loyalty thing, 225 00:13:32,160 --> 00:13:35,600 Speaker 1: but like, what are Tesla points for?
They also wanted 226 00:13:35,640 --> 00:13:40,000 Speaker 1: extended warranties, that kind of thing. Tesla, the dealership, said, hey, 227 00:13:40,320 --> 00:13:42,600 Speaker 1: we're sorry. There's not like a program in place to 228 00:13:42,640 --> 00:13:45,880 Speaker 1: give you a refund or to give you a rebate 229 00:13:45,960 --> 00:13:48,679 Speaker 1: based upon the change in price. That's just not how 230 00:13:48,720 --> 00:13:51,360 Speaker 1: this works. And the company, I don't think, is very 231 00:13:51,480 --> 00:13:54,640 Speaker 1: likely to meet any of these demands. But the Chinese 232 00:13:54,640 --> 00:13:57,480 Speaker 1: media is reporting the matter as relating to quote unquote customer 233 00:13:57,640 --> 00:14:01,440 Speaker 1: rights, as opposed to reporting on this as, like, acts 234 00:14:01,520 --> 00:14:05,160 Speaker 1: of vandalism. Uh, and that could indicate that Tesla is 235 00:14:05,160 --> 00:14:07,400 Speaker 1: gonna have a tough time handling PR in China in 236 00:14:07,400 --> 00:14:10,560 Speaker 1: the short term. Like, when I hear this story, as 237 00:14:10,640 --> 00:14:14,040 Speaker 1: much as I don't like Tesla, I think, well, it's 238 00:14:14,120 --> 00:14:17,320 Speaker 1: unfortunate when you buy something and then the next week 239 00:14:17,360 --> 00:14:20,120 Speaker 1: it goes down in price. That stinks. It's never 240 00:14:20,200 --> 00:14:23,560 Speaker 1: fun for that to happen to you. But it's not 241 00:14:23,680 --> 00:14:27,640 Speaker 1: like the company owes you anything. It 242 00:14:27,720 --> 00:14:30,840 Speaker 1: was just bad luck, bad timing. Um. So I really 243 00:14:30,880 --> 00:14:34,800 Speaker 1: don't feel that Tesla is in the wrong here, and 244 00:14:34,840 --> 00:14:38,640 Speaker 1: it's weird that the media is referencing this as 245 00:14:38,960 --> 00:14:42,280 Speaker 1: customer rights.
Although China is a very different kind of 246 00:14:42,560 --> 00:14:45,520 Speaker 1: country than the United States, so maybe that's a 247 00:14:45,520 --> 00:14:48,560 Speaker 1: big part of it. But yeah, that's not 248 00:14:48,680 --> 00:14:53,520 Speaker 1: a great indication of a solid, you know, reputation for 249 00:14:53,640 --> 00:14:59,880 Speaker 1: Tesla within China. Next up, social media companies including Facebook, YouTube, TikTok, 250 00:15:00,080 --> 00:15:04,160 Speaker 1: and Snapchat, among others, are facing a new lawsuit, this 251 00:15:04,240 --> 00:15:08,080 Speaker 1: time from the Seattle Public School System. Seattle is in 252 00:15:08,200 --> 00:15:12,400 Speaker 1: Washington State here in the United States. The lawsuit charges 253 00:15:12,560 --> 00:15:16,680 Speaker 1: that these social platforms have violated a public nuisance law, 254 00:15:17,240 --> 00:15:21,040 Speaker 1: that they are designed for the purposes of quote, hooking 255 00:15:21,320 --> 00:15:25,440 Speaker 1: tens of millions of students across the country into positive 256 00:15:25,440 --> 00:15:29,680 Speaker 1: feedback loops of excessive use and abuse of defendants' social 257 00:15:29,720 --> 00:15:33,920 Speaker 1: media platforms, end quote. And you know, that's kind of 258 00:15:34,080 --> 00:15:37,680 Speaker 1: hard to deny, because every social network out there has 259 00:15:37,720 --> 00:15:42,240 Speaker 1: worked to find ways to keep users engaged on their platforms. 260 00:15:42,960 --> 00:15:46,600 Speaker 1: That's what's profitable, and it's way less profitable for someone 261 00:15:46,640 --> 00:15:48,880 Speaker 1: to pop into your service, take a look, and then 262 00:15:48,920 --> 00:15:51,800 Speaker 1: pop out.
You want those eyeballs to stay glued to 263 00:15:51,840 --> 00:15:54,320 Speaker 1: your service for as long as you can possibly keep them, 264 00:15:54,640 --> 00:15:58,920 Speaker 1: so you purposefully design your service to encourage extended engagement 265 00:15:59,400 --> 00:16:04,400 Speaker 1: by creating these kinds of feedback loops that psychologically reward 266 00:16:04,440 --> 00:16:08,520 Speaker 1: the user for sticking with the service. From the network side, 267 00:16:09,120 --> 00:16:11,520 Speaker 1: you can argue this is all in an effort to 268 00:16:11,520 --> 00:16:14,080 Speaker 1: provide the best experience to the user. Like, that's how 269 00:16:14,120 --> 00:16:15,800 Speaker 1: you can frame it, like we're just trying to give 270 00:16:15,800 --> 00:16:19,720 Speaker 1: the user the best possible experience. However, the flip side 271 00:16:19,720 --> 00:16:22,600 Speaker 1: of looking at this is to say it's really an 272 00:16:22,640 --> 00:16:27,080 Speaker 1: effort to make the user increasingly dependent upon the service, 273 00:16:27,680 --> 00:16:31,080 Speaker 1: so that they become kind of addicted to it. Further, the 274 00:16:31,160 --> 00:16:35,120 Speaker 1: complaint alleges that the networks have perpetuated harm on young users, 275 00:16:35,400 --> 00:16:37,880 Speaker 1: citing research and reports about how the use of social 276 00:16:37,920 --> 00:16:44,960 Speaker 1: media can be associated with various mental health issues like depression, anxiety, suicide, 277 00:16:45,520 --> 00:16:48,480 Speaker 1: eating disorders, that kind of thing.
Now, whether the court 278 00:16:48,520 --> 00:16:51,360 Speaker 1: system will also find that these companies are responsible for 279 00:16:51,480 --> 00:16:56,520 Speaker 1: perpetuating harm and hooking people into the system remains to 280 00:16:56,520 --> 00:16:58,320 Speaker 1: be seen, because we're just at the very beginning of 281 00:16:58,320 --> 00:17:01,080 Speaker 1: this process, and I think it'd be very hard to argue 282 00:17:01,080 --> 00:17:04,920 Speaker 1: against it. I suspect that what we'll see is a 283 00:17:05,520 --> 00:17:10,480 Speaker 1: settlement at some point, because, you know, we've even seen 284 00:17:10,520 --> 00:17:17,280 Speaker 1: internal documents from within Facebook slash Meta that essentially support 285 00:17:17,960 --> 00:17:20,680 Speaker 1: at least some of these arguments. And you know, when 286 00:17:20,680 --> 00:17:24,040 Speaker 1: your company's internal documents appear to support the allegations made 287 00:17:24,080 --> 00:17:28,720 Speaker 1: against you, that becomes difficult to defend against. So we'll 288 00:17:28,720 --> 00:17:30,840 Speaker 1: have to see. I mean, you never know. It all 289 00:17:30,880 --> 00:17:34,400 Speaker 1: depends upon the court and how arguments are framed. So 290 00:17:34,880 --> 00:17:36,600 Speaker 1: it's not like it's a done deal one way or 291 00:17:36,640 --> 00:17:40,480 Speaker 1: the other yet. Next we're gonna talk about AI. In fact, 292 00:17:40,520 --> 00:17:43,159 Speaker 1: we're gonna talk a lot about AI. I think stories 293 00:17:43,160 --> 00:17:48,359 Speaker 1: about AI will be a huge thing throughout twenty twenty three.
ChatGPT 294 00:17:48,400 --> 00:17:52,360 Speaker 1: kind of got things fired up last November, and 295 00:17:52,480 --> 00:17:56,400 Speaker 1: our first story about AI relates to the AI chatbot 296 00:17:56,480 --> 00:18:00,840 Speaker 1: that has people creating machine-generated children's books as 297 00:18:00,880 --> 00:18:04,159 Speaker 1: teachers worry about students using it to cheat. You know, 298 00:18:04,359 --> 00:18:08,880 Speaker 1: it goes everything from interesting distraction to the downfall 299 00:18:08,920 --> 00:18:13,440 Speaker 1: of society. And that, of course, is ChatGPT. Reuters 300 00:18:13,480 --> 00:18:16,600 Speaker 1: reports that Microsoft is in talks with OpenAI, that's 301 00:18:16,640 --> 00:18:20,280 Speaker 1: the group that owns and operates ChatGPT, and that 302 00:18:20,400 --> 00:18:24,000 Speaker 1: Microsoft plans to invest up to ten billion, with a 303 00:18:24,160 --> 00:18:28,800 Speaker 1: B, dollars in OpenAI. This deal would reportedly 304 00:18:28,840 --> 00:18:33,600 Speaker 1: see Microsoft ultimately take a forty-nine percent stake in OpenAI, 305 00:18:33,880 --> 00:18:37,359 Speaker 1: and other investors would take up the other forty-nine percent, and the 306 00:18:37,400 --> 00:18:40,240 Speaker 1: final two percent, because if you add forty-nine and forty-nine, you do 307 00:18:40,320 --> 00:18:42,639 Speaker 1: not get a hundred. I tried multiple times. I just 308 00:18:42,680 --> 00:18:46,520 Speaker 1: couldn't get there. Now, the final two percent would belong 309 00:18:46,600 --> 00:18:50,760 Speaker 1: to OpenAI's nonprofit parent company. And one thing I 310 00:18:50,800 --> 00:18:54,960 Speaker 1: did not realize but absolutely should have known, I mean 311 00:18:54,960 --> 00:18:57,040 Speaker 1: maybe I knew it at one point and I just forgot, 312 00:18:57,560 --> 00:19:00,840 Speaker 1: but it's that OpenAI was originally founded by Sam 313 00:19:00,880 --> 00:19:04,679 Speaker 1: Altman and Elon Musk.
And once you hear that Elon 314 00:19:04,800 --> 00:19:08,200 Speaker 1: Musk is involved, all that disruptive stuff that involves, 315 00:19:08,240 --> 00:19:12,720 Speaker 1: like, arts and education makes perfect sense, because Elon Musk 316 00:19:12,840 --> 00:19:17,720 Speaker 1: likes to stir, uh, poo-poo, we'll say. I won't 317 00:19:17,800 --> 00:19:21,440 Speaker 1: use the rude word for it. But yeah, Elon Musk 318 00:19:21,560 --> 00:19:23,920 Speaker 1: is, uh, I'm sure he thinks of it as being 319 00:19:23,960 --> 00:19:27,040 Speaker 1: a disruptor, which I guess is, you know, putting a 320 00:19:27,080 --> 00:19:31,639 Speaker 1: positive spin on it. But he wrecks shop, is the 321 00:19:31,640 --> 00:19:36,040 Speaker 1: way I put it. And yeah, and that includes creating 322 00:19:36,080 --> 00:19:39,000 Speaker 1: things or funding things. He doesn't, Musk is not an engineer, 323 00:19:39,040 --> 00:19:42,679 Speaker 1: but he funds things that end up causing lots of 324 00:19:42,680 --> 00:19:48,240 Speaker 1: headaches for established institutions and established ways of doing things. 325 00:19:48,320 --> 00:19:53,800 Speaker 1: And frequently, not always, but frequently, the solution proposed by 326 00:19:53,920 --> 00:19:57,479 Speaker 1: Musk is not superior to whatever was happening before it. 327 00:19:58,080 --> 00:20:01,520 Speaker 1: So yeah, not a big surprise in the sense of like, oh, 328 00:20:01,560 --> 00:20:05,320 Speaker 1: well, that all fits. But yeah, I somehow missed that 329 00:20:05,400 --> 00:20:10,040 Speaker 1: Elon Musk co-founded OpenAI. Now, that's not 330 00:20:10,119 --> 00:20:13,800 Speaker 1: the only Microsoft AI story that I have today. Another 331 00:20:13,880 --> 00:20:17,520 Speaker 1: deals with a program called 332 00:20:17,800 --> 00:20:21,119 Speaker 1: VALL-E, spelled V-A-L-L, dash, E.
Now I can only assume that was 333 00:20:21,200 --> 00:20:25,200 Speaker 1: meant as a reference to the Pixar film WALL-E. But 334 00:20:25,440 --> 00:20:31,240 Speaker 1: VALL-E is an AI system that can synthesize voices, 335 00:20:31,520 --> 00:20:36,400 Speaker 1: like any voice. So all it takes is an audio 336 00:20:36,480 --> 00:20:40,280 Speaker 1: sample of about three seconds of any voice. And not 337 00:20:40,320 --> 00:20:45,159 Speaker 1: only that, VALL-E can synthesize speech that mimics the emotional 338 00:20:45,200 --> 00:20:48,920 Speaker 1: tone and the acoustics that were present when the speaker 339 00:20:49,000 --> 00:20:52,439 Speaker 1: created the sample. So if you had three seconds of 340 00:20:52,480 --> 00:20:57,080 Speaker 1: someone totally losing their temper in, say, a tiled bathroom, 341 00:20:57,440 --> 00:21:01,560 Speaker 1: the synthesized audio would likely sound livid and in a 342 00:21:01,600 --> 00:21:03,720 Speaker 1: place that has a lot of hard surfaces and echo 343 00:21:03,800 --> 00:21:08,160 Speaker 1: to it. Or if someone were chipper and happy and 344 00:21:08,200 --> 00:21:12,600 Speaker 1: they were in a space well insulated for sound, then 345 00:21:12,640 --> 00:21:15,439 Speaker 1: it would come through the same way in the synthesized audio. 346 00:21:15,520 --> 00:21:19,640 Speaker 1: So in theory, you could collect audio samples of someone 347 00:21:19,720 --> 00:21:23,840 Speaker 1: in different emotional states, and then, using text to speech 348 00:21:23,880 --> 00:21:26,560 Speaker 1: and switching samples to preserve the right tone, you could 349 00:21:26,600 --> 00:21:31,080 Speaker 1: use this tool to produce, like, a moving monologue that 350 00:21:31,160 --> 00:21:34,600 Speaker 1: the actual person has never said in their entire life. 351 00:21:35,600 --> 00:21:40,080 Speaker 1: This is both fascinating and kind of scary.
It's 352 00:21:40,400 --> 00:21:44,760 Speaker 1: apparently not always convincing. I have not actually heard a 353 00:21:44,800 --> 00:21:46,760 Speaker 1: lot of samples from this, so I haven't had a 354 00:21:46,760 --> 00:21:49,760 Speaker 1: lot of personal experience with it. But from what I've read, 355 00:21:50,240 --> 00:21:53,280 Speaker 1: at least in some cases you can tell, oh, this 356 00:21:53,359 --> 00:21:56,880 Speaker 1: was machine generated, but in others there are some synthesized 357 00:21:56,920 --> 00:21:59,160 Speaker 1: passages that actually sound like they were read by 358 00:21:59,320 --> 00:22:02,080 Speaker 1: the person who supplied the original audio sample, and 359 00:22:02,160 --> 00:22:06,640 Speaker 1: you could not tell the difference. Now, Microsoft already recognizes 360 00:22:06,680 --> 00:22:10,480 Speaker 1: how this kind of tool could cause serious problems. As 361 00:22:10,520 --> 00:22:15,359 Speaker 1: Ars Technica reports, the researchers have written, quote, Since VALL-E 362 00:22:15,600 --> 00:22:20,000 Speaker 1: could synthesize speech that maintains speaker identity, it may carry 363 00:22:20,040 --> 00:22:23,879 Speaker 1: potential risks in misuse of the model, such as spoofing 364 00:22:23,960 --> 00:22:29,719 Speaker 1: voice identification or impersonating a specific speaker. To mitigate such risks, 365 00:22:30,280 --> 00:22:33,720 Speaker 1: it is possible to build a detection model to discriminate 366 00:22:33,760 --> 00:22:37,520 Speaker 1: whether an audio clip was synthesized by VALL-E. We will 367 00:22:37,560 --> 00:22:41,840 Speaker 1: also put Microsoft AI principles into practice when further developing 368 00:22:41,880 --> 00:22:45,720 Speaker 1: the models. End quote. Also, Microsoft is not sharing the 369 00:22:45,840 --> 00:22:49,639 Speaker 1: code, which helps reduce the threat of someone using 370 00:22:49,680 --> 00:22:52,119 Speaker 1: it to make it sound like,
let's say, a famous 371 00:22:52,160 --> 00:22:54,719 Speaker 1: person said things they never actually said. And when you 372 00:22:54,760 --> 00:22:57,520 Speaker 1: consider the extent to which trolls on the Internet will 373 00:22:57,600 --> 00:23:00,919 Speaker 1: go to hurt someone just for the lulz, this is 374 00:23:00,960 --> 00:23:03,040 Speaker 1: a very good thing, because I can only imagine them 375 00:23:03,080 --> 00:23:05,120 Speaker 1: using a tool like this to make someone they don't 376 00:23:05,160 --> 00:23:09,800 Speaker 1: like appear to say the absolute worst stuff you can imagine. 377 00:23:09,840 --> 00:23:14,199 Speaker 1: I'm talking, like, truly vile things. That would be the 378 00:23:14,240 --> 00:23:17,520 Speaker 1: first thing I would expect if this tool were to 379 00:23:17,600 --> 00:23:22,040 Speaker 1: be publicly accessible. And by preserving emotional tone, it could 380 00:23:22,080 --> 00:23:25,280 Speaker 1: even sound like the person was happy while they were 381 00:23:25,320 --> 00:23:27,760 Speaker 1: saying the worst things you can imagine, which is a 382 00:23:27,800 --> 00:23:31,280 Speaker 1: big old yuck. All right, I've got a couple more 383 00:23:31,320 --> 00:23:33,439 Speaker 1: stories I want to finish out with, but before we 384 00:23:33,480 --> 00:23:45,720 Speaker 1: get to those, let's take another quick break. Hey, we're back, 385 00:23:45,720 --> 00:23:47,879 Speaker 1: and we've got a couple more AI stories before we 386 00:23:47,960 --> 00:23:51,520 Speaker 1: move on to some other stuff. And these involve the 387 00:23:51,640 --> 00:23:54,920 Speaker 1: organization DoNotPay. So this group does a lot 388 00:23:54,920 --> 00:23:58,240 Speaker 1: of consumer advocacy kind of work. They got their start 389 00:23:58,280 --> 00:24:00,600 Speaker 1: as a service that helped people identify and then 390 00:24:00,720 --> 00:24:04,360 Speaker 1: cancel subscriptions to stuff that they rarely or never use.
391 00:24:04,640 --> 00:24:07,800 Speaker 1: So you might create an account and then think, oh, hey, 392 00:24:07,880 --> 00:24:10,280 Speaker 1: I haven't watched anything on this streaming platform in months, 393 00:24:10,320 --> 00:24:12,639 Speaker 1: why am I still paying for it? And then you 394 00:24:12,680 --> 00:24:14,919 Speaker 1: want to bail, and DoNotPay would help you 395 00:24:14,960 --> 00:24:17,240 Speaker 1: do that kind of thing. Well, something else the company 396 00:24:17,240 --> 00:24:20,000 Speaker 1: has been experimenting with is using AI to help people 397 00:24:20,040 --> 00:24:24,320 Speaker 1: when they have to appear in court, and apparently, coming 398 00:24:24,400 --> 00:24:27,359 Speaker 1: up next month, the organization is working with a defendant 399 00:24:27,800 --> 00:24:31,040 Speaker 1: as they have to go to traffic court to defend themselves. 400 00:24:31,520 --> 00:24:34,119 Speaker 1: The defendant is going to wear an earpiece, and 401 00:24:34,160 --> 00:24:38,000 Speaker 1: the AI will generate responses for the defendant to repeat 402 00:24:38,080 --> 00:24:41,159 Speaker 1: while they're in court. Now, there are a few other details, 403 00:24:41,800 --> 00:24:44,639 Speaker 1: but they're pretty scarce, because DoNotPay wishes to 404 00:24:44,680 --> 00:24:48,200 Speaker 1: preserve the defendant's privacy, which makes total sense.
We only 405 00:24:48,240 --> 00:24:50,280 Speaker 1: know it's not going to take place in a traffic 406 00:24:50,320 --> 00:24:54,080 Speaker 1: court in California. And DoNotPay has previously used 407 00:24:54,119 --> 00:24:58,359 Speaker 1: AI to help people fight unfair parking tickets, and found 408 00:24:58,600 --> 00:25:01,359 Speaker 1: a sixty four percent success rate across a quarter of a 409 00:25:01,359 --> 00:25:05,040 Speaker 1: million cases with that. So that usually involved AI 410 00:25:05,800 --> 00:25:09,760 Speaker 1: generating a letter in an effort to fight a parking ticket, 411 00:25:10,080 --> 00:25:12,600 Speaker 1: which is not a bad success rate, sixty four percent. 412 00:25:12,720 --> 00:25:14,600 Speaker 1: But this would be the first time that AI would 413 00:25:14,600 --> 00:25:17,919 Speaker 1: actually be used in an in person court case to 414 00:25:18,480 --> 00:25:23,439 Speaker 1: boost someone's defense strategies. And the law hasn't exactly 415 00:25:23,480 --> 00:25:24,960 Speaker 1: been ahead of the game when it comes to 416 00:25:25,119 --> 00:25:28,520 Speaker 1: AI. It probably will not surprise you that there isn't any 417 00:25:28,680 --> 00:25:32,200 Speaker 1: law against accepting AI powered assistance while appearing at trial, 418 00:25:32,960 --> 00:25:35,399 Speaker 1: at least not yet, because no one ever thought that 419 00:25:35,400 --> 00:25:36,680 Speaker 1: that was going to be a thing, or at least 420 00:25:36,680 --> 00:25:41,080 Speaker 1: not a thing right now. So there's nothing preventing someone 421 00:25:41,119 --> 00:25:44,280 Speaker 1: from doing this, because no one's actually passed a law 422 00:25:44,280 --> 00:25:47,800 Speaker 1: to prevent it. But then, traffic court is small potatoes 423 00:25:47,840 --> 00:25:50,919 Speaker 1: because it's pretty low stakes.
It's usually over a 424 00:25:50,960 --> 00:25:54,720 Speaker 1: matter that involves some money, but it's not that much money, 425 00:25:54,760 --> 00:25:56,359 Speaker 1: at least not in the grand scheme of things. It 426 00:25:56,440 --> 00:25:58,320 Speaker 1: might be a lot for the person who has to 427 00:25:58,320 --> 00:26:01,800 Speaker 1: pay a fine, but in the grand scheme it's never 428 00:26:02,200 --> 00:26:05,959 Speaker 1: that huge, and frequently cases can end up being tossed 429 00:26:05,960 --> 00:26:09,399 Speaker 1: out just because the officer who wrote the ticket fails 430 00:26:09,440 --> 00:26:11,879 Speaker 1: to show up to court. If the officer doesn't show up, 431 00:26:11,920 --> 00:26:15,600 Speaker 1: then the person charged just gets to go. But 432 00:26:15,680 --> 00:26:19,359 Speaker 1: on the flip side, DoNotPay is perhaps cheekily 433 00:26:19,960 --> 00:26:23,159 Speaker 1: seeking to argue a case in front of the Supreme 434 00:26:23,280 --> 00:26:26,600 Speaker 1: Court of the United States. So the CEO of 435 00:26:26,640 --> 00:26:30,560 Speaker 1: DoNotPay tweeted, because of course this was in a tweet, 436 00:26:30,960 --> 00:26:34,399 Speaker 1: that the company would pay a million dollars to any 437 00:26:34,520 --> 00:26:36,800 Speaker 1: lawyer scheduled to argue a case in front of the 438 00:26:36,800 --> 00:26:41,200 Speaker 1: Supreme Court, and this lawyer, in return for accepting the 439 00:26:41,280 --> 00:26:44,440 Speaker 1: million dollars, would agree to wear some wireless earbuds, 440 00:26:44,480 --> 00:26:47,720 Speaker 1: like AirPods, and then follow the directions of the 441 00:26:47,800 --> 00:26:51,879 Speaker 1: AI lawyer bot.
The CEO says he's serious about this offer, 442 00:26:51,880 --> 00:26:53,719 Speaker 1: but I don't think anyone's going to take him up 443 00:26:53,760 --> 00:26:56,680 Speaker 1: on it, because while a million dollars is a lot 444 00:26:56,720 --> 00:26:59,280 Speaker 1: of money, I don't think any lawyer wants to be 445 00:26:59,680 --> 00:27:02,520 Speaker 1: the one who has their name associated with a potential 446 00:27:02,560 --> 00:27:06,440 Speaker 1: publicity disaster. If things don't go well, then your name 447 00:27:06,480 --> 00:27:09,840 Speaker 1: could forever be associated with this weird AI experiment. And 448 00:27:09,880 --> 00:27:12,040 Speaker 1: I think most lawyers who are arguing in front of 449 00:27:12,040 --> 00:27:15,719 Speaker 1: the Supreme Court wish to preserve their reputation to some degree. 450 00:27:16,040 --> 00:27:18,679 Speaker 1: Plus, there are rules about the kinds of stuff you 451 00:27:18,680 --> 00:27:21,760 Speaker 1: can actually bring into the Supreme Court, and that includes 452 00:27:21,760 --> 00:27:24,760 Speaker 1: a ban on electronic devices while the court is in session. 453 00:27:25,440 --> 00:27:29,280 Speaker 1: So it's questionable whether this would even be allowed 454 00:27:29,320 --> 00:27:32,239 Speaker 1: in the first place, just on a basic level. So 455 00:27:32,280 --> 00:27:35,080 Speaker 1: while the offer is sure to get some attention, I 456 00:27:35,160 --> 00:27:37,639 Speaker 1: don't think there's any real chance of someone pouncing on it. 457 00:27:37,680 --> 00:27:40,280 Speaker 1: But it does illustrate how AI is going to continue to 458 00:27:40,280 --> 00:27:44,040 Speaker 1: be a really big topic this year.
All right, switching 459 00:27:44,040 --> 00:27:46,720 Speaker 1: away from AI. Across the pond in the UK, new 460 00:27:46,800 --> 00:27:51,000 Speaker 1: legislation now requires builders to include gigabit Internet connections in 461 00:27:51,040 --> 00:27:55,520 Speaker 1: any new home construction, which is a really cool thing. 462 00:27:55,680 --> 00:27:58,439 Speaker 1: Any new home built in the UK has to have 463 00:27:58,520 --> 00:28:02,760 Speaker 1: a gigabit Internet connection built into it. This of 464 00:28:02,800 --> 00:28:05,200 Speaker 1: course doesn't mean that everyone will actually have access to 465 00:28:05,240 --> 00:28:07,879 Speaker 1: gigabit Internet, because you still have to have the service 466 00:28:07,880 --> 00:28:10,199 Speaker 1: available in the area, you still have to pay for 467 00:28:10,240 --> 00:28:13,399 Speaker 1: that service, but it will be possible. At least it 468 00:28:13,440 --> 00:28:17,320 Speaker 1: won't be because your home lacks the connection. The connection 469 00:28:17,320 --> 00:28:20,280 Speaker 1: will be built into the home. In addition, the legislation 470 00:28:20,320 --> 00:28:22,240 Speaker 1: is going to make it easier for people in existing 471 00:28:22,280 --> 00:28:26,600 Speaker 1: structures to get gigabit connections installed. Now, easy does not 472 00:28:26,720 --> 00:28:32,560 Speaker 1: mean cheap, because the legislation is seeking 473 00:28:33,000 --> 00:28:35,639 Speaker 1: a spending cap on how much it would cost to 474 00:28:36,840 --> 00:28:40,600 Speaker 1: install connections into existing buildings. But that cap is high. 475 00:28:40,640 --> 00:28:44,800 Speaker 1: It's like two thousand pounds. That's a lot of money, 476 00:28:44,840 --> 00:28:48,560 Speaker 1: but the UK government estimates that most installations would 477 00:28:48,560 --> 00:28:51,200 Speaker 1: fall under such a cap anyway.
So the cap is 478 00:28:51,240 --> 00:28:55,000 Speaker 1: really there to protect people who are living in out 479 00:28:55,000 --> 00:28:58,280 Speaker 1: of the way places, like rural areas where there's very 480 00:28:58,280 --> 00:29:02,240 Speaker 1: little infrastructure. It's meant to prevent them from being 481 00:29:02,520 --> 00:29:07,800 Speaker 1: victimized by having price tags that are just 482 00:29:07,880 --> 00:29:10,600 Speaker 1: way too high to ever pay to have that connectivity 483 00:29:10,640 --> 00:29:13,720 Speaker 1: built in. So yeah, it's really expensive, but it's 484 00:29:13,720 --> 00:29:17,360 Speaker 1: also meant to be a check on providers so that 485 00:29:17,400 --> 00:29:20,000 Speaker 1: they can't bleed someone dry just because they happen to 486 00:29:20,080 --> 00:29:23,920 Speaker 1: live out in the middle of nowhere. The legislation doesn't 487 00:29:23,960 --> 00:29:29,600 Speaker 1: guarantee gigabit speed connectivity, just the actual connections, and 488 00:29:29,640 --> 00:29:32,120 Speaker 1: there is a difference, right, between the service and the 489 00:29:32,200 --> 00:29:35,840 Speaker 1: actual hardware. So if there is no gigabit service in 490 00:29:35,880 --> 00:29:38,960 Speaker 1: the region, then whatever is fastest will end up taking 491 00:29:39,040 --> 00:29:43,080 Speaker 1: its place. The legislation also holds landlords accountable too.
If 492 00:29:43,080 --> 00:29:46,520 Speaker 1: a renter wants to upgrade to gigabit connectivity within their apartment, 493 00:29:46,920 --> 00:29:50,400 Speaker 1: they would contact a broadband provider. But because they don't 494 00:29:50,440 --> 00:29:53,880 Speaker 1: own their structure, like if they're living in an 495 00:29:53,920 --> 00:29:58,080 Speaker 1: apartment or a rented house or whatever, then the 496 00:29:58,160 --> 00:30:00,680 Speaker 1: provider has to reach out to the landlord first to 497 00:30:00,680 --> 00:30:04,880 Speaker 1: get permission, and nearly half of such requests went unanswered 498 00:30:05,000 --> 00:30:08,120 Speaker 1: last year. But now, if a landlord does not respond 499 00:30:08,160 --> 00:30:10,800 Speaker 1: within thirty five days of receiving such a request from 500 00:30:10,800 --> 00:30:13,800 Speaker 1: a broadband provider, the provider can take the matter to 501 00:30:13,880 --> 00:30:19,560 Speaker 1: court to get access rights to the rented structure. By 502 00:30:19,600 --> 00:30:22,160 Speaker 1: the way, reading up on what the broadband situation is 503 00:30:22,200 --> 00:30:25,640 Speaker 1: like in the UK just reminded me how badly we 504 00:30:25,760 --> 00:30:28,080 Speaker 1: have it here in the United States, where there is 505 00:30:28,120 --> 00:30:32,600 Speaker 1: a distinct lack of competition among providers in most markets. 506 00:30:32,640 --> 00:30:36,480 Speaker 1: Like, in the report I was reading, the person said there 507 00:30:36,560 --> 00:30:39,320 Speaker 1: were like a hundred different providers they could choose from. 508 00:30:39,360 --> 00:30:41,720 Speaker 1: That's not the case in the United States.
Like, for instance, 509 00:30:41,840 --> 00:30:46,000 Speaker 1: where I live, there is one provider who offers service 510 00:30:46,160 --> 00:30:49,680 Speaker 1: that is more than two hundred megabits per second. So 511 00:30:49,720 --> 00:30:52,120 Speaker 1: I'm not even in gigabit per second territory, I'm 512 00:30:52,240 --> 00:30:56,000 Speaker 1: in the megabits per second, and the next fastest is at 513 00:30:56,040 --> 00:31:00,000 Speaker 1: thirty megabits per second. And those are the two major 514 00:31:00,040 --> 00:31:02,440 Speaker 1: providers in this area, and all the other providers that 515 00:31:02,480 --> 00:31:05,120 Speaker 1: are available in this area, their services are built on 516 00:31:05,200 --> 00:31:09,280 Speaker 1: top of the infrastructure provided by the first 517 00:31:09,560 --> 00:31:13,640 Speaker 1: two choices. So there's no competition in the United States, 518 00:31:14,040 --> 00:31:17,880 Speaker 1: and it's ridiculous that we continue to pretend like there is. 519 00:31:17,920 --> 00:31:19,880 Speaker 1: That's me getting on my high horse about that and 520 00:31:19,920 --> 00:31:24,480 Speaker 1: being irritated at how poorly this has been handled by 521 00:31:25,320 --> 00:31:29,160 Speaker 1: regulatory agencies here in the United States. And I think 522 00:31:29,160 --> 00:31:31,800 Speaker 1: it's ridiculous that I live in the city of Atlanta 523 00:31:31,840 --> 00:31:35,280 Speaker 1: and I can't get access to gigabit Internet. 524 00:31:35,960 --> 00:31:38,200 Speaker 1: And you should be upset too, because that affects the show. 525 00:31:39,000 --> 00:31:42,720 Speaker 1: You know, this show. So it's affecting you too. 526 00:31:42,800 --> 00:31:46,920 Speaker 1: It's not just me. All right, I'm done.
Last year, one 527 00:31:46,960 --> 00:31:48,760 Speaker 1: of the news stories that I covered was about how 528 00:31:48,800 --> 00:31:52,120 Speaker 1: BMW was irritating people by locking certain car features behind 529 00:31:52,160 --> 00:31:56,400 Speaker 1: subscription services, like heated car seats. So the upsetting thing 530 00:31:56,480 --> 00:31:59,640 Speaker 1: is that these features typically are already present in the 531 00:31:59,760 --> 00:32:03,560 Speaker 1: vehicle, right, the vehicle can already do the thing that 532 00:32:03,680 --> 00:32:07,480 Speaker 1: the subscription covers. It's just that the feature is switched 533 00:32:07,600 --> 00:32:11,840 Speaker 1: off unless the owner pays a subscription to activate it. 534 00:32:11,880 --> 00:32:14,360 Speaker 1: And a lot of people have the opinion that if 535 00:32:14,360 --> 00:32:18,320 Speaker 1: a feature is in the vehicle, it should just be 536 00:32:18,400 --> 00:32:22,480 Speaker 1: accessible upon purchase. I think that's a reasonable opinion. You 537 00:32:22,520 --> 00:32:25,480 Speaker 1: could offer vehicles that have the feature as an option, 538 00:32:25,920 --> 00:32:27,720 Speaker 1: and you can have other versions of the vehicle that 539 00:32:27,840 --> 00:32:30,800 Speaker 1: do not have that option, and that can be reflected 540 00:32:30,800 --> 00:32:33,560 Speaker 1: in the sticker price of the individual cars. But to 541 00:32:33,640 --> 00:32:37,240 Speaker 1: have something built into a car that works but is 542 00:32:37,320 --> 00:32:40,719 Speaker 1: turned off by default, well, that's frustrating for a lot 543 00:32:40,760 --> 00:32:44,960 Speaker 1: of folks. And BMW is doubling down on that. Actually, 544 00:32:44,960 --> 00:32:47,320 Speaker 1: it's more than doubling down on that. Here in the 545 00:32:47,400 --> 00:32:52,360 Speaker 1: United States, BMW has five features that are locked behind 546 00:32:52,400 --> 00:32:56,280 Speaker 1: subscription services.
This is new in the US. They include 547 00:32:56,480 --> 00:33:02,080 Speaker 1: Parking Assistant Professional, Traffic Camera, Driving Assistance Plus, Drive Recorder, 548 00:33:02,600 --> 00:33:07,080 Speaker 1: and Remote Engine Start. So, for example, if you wanted 549 00:33:07,160 --> 00:33:10,040 Speaker 1: a one year subscription so that you could start your 550 00:33:10,080 --> 00:33:13,120 Speaker 1: engine remotely, that would set you back a hundred five 551 00:33:13,200 --> 00:33:16,600 Speaker 1: dollars for the year. You could pay two hundred fifty 552 00:33:16,600 --> 00:33:19,760 Speaker 1: dollars to pre purchase three years of service, or if 553 00:33:19,760 --> 00:33:22,960 Speaker 1: you wanted a lifetime subscription to Remote Engine Start for 554 00:33:23,000 --> 00:33:26,000 Speaker 1: that specific BMW, as long as you owned it, that 555 00:33:26,000 --> 00:33:29,680 Speaker 1: would cost you three hundred thirty dollars. Otherwise it's a ten 556 00:33:29,800 --> 00:33:34,040 Speaker 1: bucks a month service. Personally, I look forward to the 557 00:33:34,080 --> 00:33:38,160 Speaker 1: dystopian future where stuff like windshield wipers and turn signals 558 00:33:38,160 --> 00:33:41,560 Speaker 1: are also subscription based, except of course, here in Atlanta, 559 00:33:41,680 --> 00:33:43,600 Speaker 1: most folks would be well ahead of the curve on 560 00:33:43,720 --> 00:33:46,440 Speaker 1: saving money because they don't know what turn signals are 561 00:33:46,480 --> 00:33:50,120 Speaker 1: for anyway. Finally, we have a right to repair update. 562 00:33:50,160 --> 00:33:52,680 Speaker 1: I'll actually be doing an episode kind of as an 563 00:33:52,760 --> 00:33:55,040 Speaker 1: update on right to repair and its current status here 564 00:33:55,040 --> 00:33:57,880 Speaker 1: in the United States pretty soon. But one of the 565 00:33:57,880 --> 00:34:02,960 Speaker 1: big holdouts on right to repair has recently made some concessions.
566 00:34:03,480 --> 00:34:07,080 Speaker 1: That holdout is John Deere. It's a company that's best 567 00:34:07,120 --> 00:34:09,680 Speaker 1: known for making farming equipment. They also make things like 568 00:34:09,800 --> 00:34:13,840 Speaker 1: lawnmowers and that kind of stuff. Traditionally, John Deere locked down 569 00:34:13,840 --> 00:34:16,480 Speaker 1: its equipment so that farmers had no choice but to 570 00:34:16,520 --> 00:34:21,000 Speaker 1: take their stuff to authorized John Deere service facilities, and 571 00:34:21,040 --> 00:34:24,360 Speaker 1: that could sometimes be more expensive than an independent repair shop. 572 00:34:24,880 --> 00:34:27,640 Speaker 1: Like, if you've ever had a car and you've looked 573 00:34:27,680 --> 00:34:31,880 Speaker 1: at prices of taking that car to a dealership's service 574 00:34:32,480 --> 00:34:36,360 Speaker 1: center versus going to an independent repair shop, you might see, oh, 575 00:34:36,480 --> 00:34:38,000 Speaker 1: I could save a lot more money if I go 576 00:34:38,080 --> 00:34:41,720 Speaker 1: to the independent place. Well, that same thing is true 577 00:34:41,760 --> 00:34:44,640 Speaker 1: with other types of equipment as well, and John Deere 578 00:34:45,080 --> 00:34:47,600 Speaker 1: tried very hard to make sure that people could not 579 00:34:48,360 --> 00:34:51,279 Speaker 1: have an alternative. They used a few different approaches to 580 00:34:51,320 --> 00:34:55,360 Speaker 1: discourage or outright prevent someone from going anywhere besides a 581 00:34:55,480 --> 00:35:00,760 Speaker 1: John Deere licensed service facility. And this is because locking 582 00:35:00,840 --> 00:35:04,359 Speaker 1: someone into an ecosystem of service and repairs creates an 583 00:35:04,400 --> 00:35:08,000 Speaker 1: ongoing revenue stream.
So you're not just selling someone a tractor, 584 00:35:08,920 --> 00:35:12,600 Speaker 1: you're also essentially selling them all their maintenance and repairs 585 00:35:12,760 --> 00:35:16,800 Speaker 1: moving forward. It might be in the form of securing 586 00:35:16,880 --> 00:35:21,680 Speaker 1: licensing agreements with various service centers around the region. But yeah, 587 00:35:21,680 --> 00:35:23,759 Speaker 1: this is a way to make more money rather than 588 00:35:23,800 --> 00:35:28,920 Speaker 1: just rely on hardware sales. But on Sunday, the company 589 00:35:28,960 --> 00:35:33,080 Speaker 1: signed a memorandum of understanding with the American Farm Bureau 590 00:35:33,200 --> 00:35:36,680 Speaker 1: Federation, and now farmers and independent repair shops will be 591 00:35:36,719 --> 00:35:41,200 Speaker 1: allowed to access stuff like repair manuals, tools, and parts 592 00:35:41,280 --> 00:35:45,280 Speaker 1: needed to perform specific kinds of repairs on specific kinds 593 00:35:45,280 --> 00:35:48,680 Speaker 1: of equipment. But in return, the Bureau agreed that farmers 594 00:35:48,680 --> 00:35:52,040 Speaker 1: and independent repair shops would maintain trade secrets, they would 595 00:35:52,080 --> 00:35:55,240 Speaker 1: not divulge them, and they also promised not to override 596 00:35:55,239 --> 00:35:57,840 Speaker 1: things like safety features and that kind of stuff. In 597 00:35:57,880 --> 00:36:00,560 Speaker 1: other words, you can't soup up your tractor so that 598 00:36:00,640 --> 00:36:02,759 Speaker 1: it can travel the length of the field in like 599 00:36:02,840 --> 00:36:05,120 Speaker 1: one tenth the normal time or anything like that, because 600 00:36:05,160 --> 00:36:08,040 Speaker 1: that would be dangerous. But this is a pretty big 601 00:36:08,080 --> 00:36:12,000 Speaker 1: win for the right to repair.
Okay, that's the news 602 00:36:12,080 --> 00:36:15,440 Speaker 1: I have for you for Tuesday, January twenty twenty three. We'll 603 00:36:15,480 --> 00:36:18,440 Speaker 1: be back later this week with some more news, assuming 604 00:36:18,480 --> 00:36:21,799 Speaker 1: that stuff happens, and I mean, it keeps 605 00:36:21,840 --> 00:36:24,600 Speaker 1: doing that, so I guess that's gonna be a thing. 606 00:36:25,400 --> 00:36:27,920 Speaker 1: And if you have suggestions for topics I should cover 607 00:36:27,920 --> 00:36:30,440 Speaker 1: in future episodes of TechStuff, please reach out to me. 608 00:36:30,760 --> 00:36:32,239 Speaker 1: There are a couple of different ways you can do it. 609 00:36:32,320 --> 00:36:34,480 Speaker 1: One is you can download the iHeartRadio app. 610 00:36:34,520 --> 00:36:37,400 Speaker 1: It's free to use. It's free to download. You just 611 00:36:37,440 --> 00:36:39,879 Speaker 1: type TechStuff into the little search feature, and that will 612 00:36:39,920 --> 00:36:42,920 Speaker 1: take you to the TechStuff podcast page, and if you 613 00:36:43,000 --> 00:36:44,960 Speaker 1: click on that, you'll see there's a little microphone 614 00:36:45,120 --> 00:36:46,759 Speaker 1: icon in there. If you click on that, you can 615 00:36:46,840 --> 00:36:49,120 Speaker 1: leave me a voice message up to thirty seconds in length, 616 00:36:49,520 --> 00:36:52,440 Speaker 1: or, if you prefer, you can send me your thoughts 617 00:36:52,480 --> 00:36:55,560 Speaker 1: in Twitter form. The handle for the show is Tech 618 00:36:55,680 --> 00:37:01,879 Speaker 1: Stuff HSW, and I'll talk to you again really soon. 619 00:37:06,040 --> 00:37:09,040 Speaker 1: TechStuff is an iHeartRadio production. For more 620 00:37:09,120 --> 00:37:12,520 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 621 00:37:12,640 --> 00:37:15,800 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.