Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, December 8, 2022. Yesterday, Apple announced it will offer full end-to-end encryption for user data stored in iCloud, so that includes stuff like chat histories and photos and that kind of thing. It's something that Apple users and privacy organizations have wanted for a long time, particularly in the wake of things like, you know, hackers getting access to people's photos in iCloud. Having that stuff encrypted would make a huge difference. The encryption means that if someone else were to gain access to the data, but they don't have access to the key used to decrypt the data, all they will end up with is gibberish. And that, of course, means we need to talk about the US Federal Bureau of Investigation, a.k.a. the FBI, because boy howdy, they do not like Apple's move here. The FBI would much prefer that this data remain unencrypted. That way, should the FBI need to access information in an investigation, they could do so without the whole sticky situation of this data not actually meaning anything to them because it would be all jumbled up. So the FBI argues that the encrypted data represents a danger to Americans, that without access to that data, the FBI is going to have a harder time detecting, investigating, or preventing crimes that could otherwise harm Americans. And there's some truth to that, but this also ignores the fact that people want their information to be safe from prying eyes. So let me give you a really simple, honest example. Let's say that you are unhappy with your job, and so you start sending out feelers to your network of contacts to see if perhaps there are any other opportunities elsewhere that you would like to pursue.
Speaker 1: Well, you would want those communications to be encrypted so that if your current employer were to somehow come across those transmissions, you wouldn't immediately be called out on it and then potentially face retribution. Anyway, the FBI has long urged companies that use encryption to incorporate a system whereby the FBI could decrypt information stored in those systems. This is the so-called back door strategy, giving the FBI a back door into the ecosystem. And I know I have said this a billion times, but I'm gonna say it again: intentionally putting in a back door is never a good idea. Full stop. Never. Now why is that? Well, think of it this way. Hackers are constantly probing systems to find vulnerabilities that they can exploit. They're looking for ways to gain purchase in a system and to intrude into that system, and that includes finding ways to decrypt data. So if you build in that capability, presumably with the intent to only allow the FBI to use it, it represents an enormous target for hackers, because when you know there's a back door, there's no reason to just keep slamming away at the front door, right? So putting in backdoor access is equivalent to including a security vulnerability on purpose. It's dumb. Plus, we know that people in positions of power can abuse that power. Now, not everyone does. I'm sure there's no shortage of decent people in organizations who would never do such a thing, but we have seen plenty of cases where people in positions of power have abused things like access to data, see also the NSA. And you can never be certain that everyone in the FBI who would have access to a tool like backdoor decryption in iCloud wouldn't abuse that access. So, in short, I think Apple is doing the right thing, because the company so far has resisted all calls to weaken its systems.
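To make that "gibberish without the key" idea concrete, here's a minimal sketch in Python using the third-party cryptography package's Fernet recipe. It's an illustration of symmetric encryption in general, not a description of how Apple's iCloud protection actually works under the hood.

```python
from cryptography.fernet import Fernet

# Generate a secret key; only holders of this key can decrypt.
key = Fernet.generate_key()
f = Fernet(key)

ciphertext = f.encrypt(b"Feeler to a contact about a new job")
print(ciphertext)             # random-looking bytes: "gibberish" to anyone without the key
print(f.decrypt(ciphertext))  # the original message, recoverable only with the key
```

Anyone who copies the ciphertext but not the key gets the first printout and nothing more, which is the whole point of encrypting data at rest.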
Speaker 1: Now, over at the Pentagon, a hefty nine billion dollar contract, that's billion with a B, has gone out to several tech companies to build out a cloud computing network for the Pentagon's use. And this comes at the end of a really long and messy situation in which various companies had secured parts of a contract, but then accusations flared up that there was some hanky panky going on in the contract phase, meaning that, you know, folks suspected there was collusion between Pentagon officials and certain companies, as opposed to the Pentagon selecting the best legitimate proposal. That appears to all be settled now, with Microsoft, Amazon, Google, and Oracle all getting a slice of that nine billion dollar pie. The purpose of this new cloud infrastructure will be to hold information of varying degrees of secrecy, everything from unclassified information up to top secret stuff, and also to allow the rapid dissemination of information to any parties that need it within the Department of Defense, such as troops stationed in distant countries. As such, the system will need to have robust security, to put it lightly, to prevent, you know, spies from being able to see critical communications and intelligence. This is a very tall order considering the challenges of making anything digitally bulletproof. So it's interesting that this is finally moving forward and that it's going to be a confederation of different big tech companies involved in it.

Speaker 1: Lawmakers and regulators in the EU continue to be a thorn in the side of big tech. The Court of Justice of the European Union has ruled that Google is legally obligated to remove search results if users can prove that the search results lead to inaccurate data. Now, I imagine that's going to be an enormous problem moving forward, because anyone who objects to any search result could presumably challenge it.
Speaker 1: As for proving results to be inaccurate, I think that's also going to be a bit odd, because typically the burden of proof is on whatever party is claiming accuracy, like, you know, the whole you-can't-prove-a-negative kind of thing. But in this ruling, the court is saying that if someone requests the removal of a search result, that someone must also prove that, quote, "such information is manifestly inaccurate," end quote. So this does sound like the burden is on the claimant. Now, in a way, I guess that makes sense, because really Google is just indexing and displaying search results, right? In most cases, Google is not the party that's responsible for the generation of the information. It's just serving up links. Though things do get a little tricky when you start talking about things like ads that appear in search results. The court is trying to find a way to maneuver the rough waters between the right to be forgotten, which refers to a person's right to not have details of their life published up on the Internet, and the freedom of speech. Now, I can appreciate the desire to give people the power to challenge misinformation. We have seen how destructive misinformation can be on lots of different levels. But I imagine this particular approach is going to necessitate some sort of arbitration body to determine which claims are legitimate and which are mere attempts to remove, you know, troublesome links. Like, for example, if I were an executive of a company and I made decisions that led to massive economic loss, I would probably want to try and scrub Google of search results that detail my mistakes and reflect poorly on me. Now, maybe those links are all to stories that are true, but I don't really want them to show up. I mean, I definitely don't want them to show up if they're true, so I would probably go through the process of trying to challenge that.
Speaker 1: I imagine we're going to see a lot of instances of that kind of thing as we move forward, and not all cases are going to be as straightforward as my rather primitive example just was. Like, with that one you could argue, all right, well, this is clearly someone who's just trying to whitewash their background; that's not a legitimate example of misinformation. But in other cases it might be harder to make a determination. Now, you might even argue this is the wrong way to go about doing things, because really the fault lies with whatever site has published the misinformation, and that the EU court should instead focus on finding ways to address that issue rather than go after a company that just provides links to stuff. It's not like Google wrote all these things; they just index them. But, you know, it's a really tough challenge to go after the source, particularly for sites that aren't located in the EU. Your jurisdiction only goes so far, and how do you enforce a rule for a server that isn't located in your jurisdiction? So I don't have the perfect solution here. I just think that this particular approach of punishing Google for publishing links to information doesn't really get at the problem, and it's really just going to create a massive administrative mess, and it's going to cost lots and lots of money to go through all of this. So I don't know how long this will be in action, or if it will be, you know, tweaked or repealed. It's hard to say. I just can't see it being successful moving forward. But again, that could just be my own myopic limitations. Okay, we've got lots more stories to get through. Before we go any further, let's take a quick break.

Speaker 1: Okay, we're back. The Wall Street Journal reports that Google is merging its Waze, that's W-A-Z-E, and Google Maps divisions into a single division as a cost-saving measure.
Speaker 1: The plan is still to maintain separate apps and not combine Waze and Maps together. So while the divisions will be consolidated, the apps will not be, at least not for now. Google hasn't indicated that there are going to be any layoffs. In fact, the company says there are no plans for layoffs following this merger. However, the current CEO of Waze, Neha Parikh, will have to leave once these two departments have finished that unification process and there's been a sufficient transition period. For those of y'all who are not familiar with Waze, it's a traffic app, as in real-world street traffic, like the kind that cars and bikes and stuff are on, and it helps you navigate to a destination by suggesting the fastest route and also giving you real-time updates from user-generated inputs. So if Waze users up ahead of you note that there is a speed trap, they could use the app to say, hey, the po-po are on the lookout for lead foots, so ease off the throttle, buster, though probably not in so many words, and boom, you get a little indicator along your route that the fuzz is out there looking for speed demons or something. Since Google purchased Waze back in 2013, the company has incorporated some of Waze's features into its own Google Maps app. But you might wonder why Google kept these two divisions separate in the first place. Well, that was likely to avoid imperial entanglements, as Obi-Wan Kenobi would say, which is really just my cheeky way of saying that the Federal Trade Commission, the FTC here in the United States, was concerned that the acquisition would lead to decreased competition in the space, and so Google, perhaps in an effort to ward off any regulatory obstacles, decided to operate Waze as a separate entity. According to Noam Bardin, who is a former CEO of Waze, having Google as an overlord ended up being a bit of a detriment because it really kind of inhibited innovation and growth.
Speaker 1: He posted on LinkedIn last year that the company probably would have been more successful had it remained independent. Well, hindsight is 20/20, and it's not like we can look at an incredibly long list of examples of Google acquisitions that didn't go well. Nope, every single thing Google has ever purchased has gone on to grow and prosper, cough cough.

Speaker 1: One story I haven't really covered on TechStuff yet is about AI generating really compelling text. Now, I'm specifically referencing ChatGPT. That's a chatbot powered by OpenAI, the organization that's also responsible for developing DALL-E, an AI app that can generate images based on text inputs. ChatGPT has been getting a lot of attention because it's free to use and the AI can create some really cool responses to text inquiries. Now, some are arguing that this could be the future of search, in that ChatGPT synthesizes information from numerous sources and then generates a response in real time, though the app is quick to point out it is not actively connected to the Internet, so it can't address issues that are currently unfolding. Instead, it draws upon a huge database of indexed information. Kind of reminds me of Watson, when Watson from IBM went up on Jeopardy. The responses tend to be presented in a way that makes them sound like they're pretty darn legit, though folks have also noticed that sometimes the answers that ChatGPT gives are just plain wrong. That's a problem, since the presentation of the information appears to be confident and reliable. Like, if you expect to get the real answer when you're asking a question, you might trust that what you get is legit, and in all cases that may not actually work out right. There might be some outliers, maybe a lot of outliers, where if you trust that information, you'd be going down the wrong path.
Speaker 1: So it could lead to people thinking that information is, you know, inherently trustworthy, or it could lead to people, you know, thinking that all information is inherently untrustworthy. It could go either way. It's also complicated because ChatGPT doesn't disclose what sources it used in order to generate the responses that it gives, so you can't go back and check its work. In other words, you can't say, all right, well, where did you hear this from? You're just getting an answer. So using ChatGPT to get an answer to a question makes me think of the old idea of the semantic web. That concept involves using the web and getting contextual results based upon your preferences, your needs, your current situation. So, for example, with the semantic web, you might log on to find out how to prepare a specific kind of meal, let's say the Thanksgiving turkey, since in the United States we had Thanksgiving a couple of weeks back, and instead of getting a bunch of links to different recipes the way you would with traditional search, the semantic web would give you a procedurally generated series of instructions. And further, in the ideal implementation, those instructions would be tailored to your own level of expertise in the kitchen, as well as your own taste preferences. But, you know, ChatGPT looks like it's a bit of a step in that direction, except obviously it's not able to adapt to each and every user like that. Anyway, there have been a lot of concerns about ChatGPT, ranging from future students are just gonna use AI to outsource homework and thus never learn anything, all the way to this will destroy Google's ad business because people will get the answers they want without actually having to scroll through a list of links from search results and then click through to them. And I do think there are reasons to be concerned. I'm not quite ready to say that the sky is falling just yet, but I'd say cracks are starting to appear.
Speaker 1: And I do think also, just in general, that educators really need to concentrate on teaching kids how to think, because that's way more important than just memorizing dates and stuff. But that's a rant for another time.

Speaker 1: Karl Racine, the Attorney General for Washington, D.C., has filed a lawsuit against Amazon. Now, the heart of the matter is how Amazon, between 2016 and 2019, was dipping into tips that customers were leaving for Amazon delivery drivers. Now, Amazon has already had to answer for this once before, because the FTC investigated the matter and said, hey, yeah, you are stealing from your employees. So the company ended up paying nearly sixty-two million dollars to drivers in restitution. But Racine's point is that customers didn't receive any consideration for the harm that they experienced because of Amazon's actions, and I can kind of get behind that. I mean, if I leave a tip for a person, I really want that tip to go to the person who was responsible for making the experience whatever it was. When I find out that a place of business forces employees to surrender their tips so that the company can take a cut, I get pretty darn upset about that. I mean, the company already got its cut from my business. The tip is on top of that. Now, there is a case you can make that in places like the restaurant business here in the United States, sometimes you have to pool tips and divvy them up to cover staff who traditionally don't get tips at all, but still get paid terrible wages. But again, that's going to send me into a rant about how the restaurant business in the United States is an absolute racket for those of you who are actually in the service industry. It is a mess. Not across the board; there are places that pay their staff a living wage, but that's not what the law requires. The law allows them to be criminally underpaid and to become completely dependent upon tips.
Speaker 1: It is messed up. Anyway, Racine says that Amazon has yet to answer for the harm done to consumers in the D.C. area for this practice, and he wishes to seek civil penalties against the company. Perhaps Amazon would say, hey, whoa, we already paid the drivers, plus we don't do that anymore, we stopped doing that in 2019, so where's the problem? Like, we don't do what you're angry about, and we already paid back the drivers. Now, Racine is saying the lawsuit is necessary to send a message to companies that deceiving customers as well as stealing from employees will not be tolerated and will be met with penalties. So what Racine is saying is no, no, no, no, no. You paid back the drivers, but you still deceived the public, and that is what I'm going after you for. Interesting approach.

Speaker 1: Todd Rokita, the Attorney General for the US state of Indiana, has gone and sued TikTok, claiming the company has violated state consumer protection laws. Now, there are a couple of concerns here. In fact, there's a pair of lawsuits. One is really about how the Attorney General says TikTok has failed to ensure the safety and privacy of its younger users, exposing them to inappropriate content and even suggesting such content through the use of its algorithms, and also putting user privacy at risk, which is really bad when those users tend to be underage. The second lawsuit concerns something we've heard several times over the last few years: that TikTok fails to disclose to what degree the Chinese government has access to user data on TikTok. Now, reps at TikTok have said that the app has extensive content controls in place, and further that the company is confident it can address, quote, "all reasonable US national security concerns," end quote. TikTok's parent company is ByteDance, a Chinese company, and TikTok has repeatedly been scrutinized over concerns that this association gives Chinese agents access to US citizens' private information.
Speaker 1: That's a concern that TikTok has repeatedly dismissed or denied. As to what I think, I mean, I remain uneasy about TikTok, but then I also feel uneasy about all social networks at this point. It's just, you know, our information is being harvested like crazy across multiple platforms. So at some level you just start to ask, am I just not okay with it in this case but okay with it in this other case, or am I not okay with it at all? And it turns out, as I get older, I get more paranoid and grouchy. So I guess what I'm saying is the jury is still out on this. I don't have any evidence that the Chinese government has access to user data on TikTok, but I'm also not entirely satisfied with TikTok's answers for that. So yeah, it's tricky. Okay, we've got a few more stories to get through before we wrap up this episode. Before we do that, let's take another quick break.

Speaker 1: We're back, and now we have a quick trio of Twitter-related stories, nothing major. First up, as Twitter grew in popularity, we saw the app developer community rise up to integrate Twitter functionality into different apps. You know, stuff like TweetDeck, which allows users to create a single view for multiple Twitter accounts, among other things. In fact, TweetDeck was popular enough for Twitter to acquire it back in 2011. Well, now we're seeing that same app developer community turn their attention to Mastodon, which some people have migrated to as an alternative to Twitter. Mastodon has an inherently different structure from Twitter. It is decentralized, because Mastodon is a platform that allows people to set up servers, so joining Mastodon means you have to choose a home server.
Speaker 1: You can still communicate with people who have chosen a different home server, but your abilities on your home server versus what you can do across all servers are very different. And app developers have seen an opportunity, because as millions of people have joined Mastodon, they have discovered that Mastodon lacks a lot of the features and functions that you could find on Twitter. To that end, app developers are working to create apps that either replicate certain Twitter functions or find new ways to add features to the basic Mastodon experience. Now, that might end up irritating some Mastodon users, people who have been on the platform for a while, because this could mean that we'll see even more new people flood servers and potentially change the communities that have been in place for ages. But considering people are still wondering if Twitter is going to stand the test of time, it might be a necessary evil.

Speaker 1: In other Twitter news, the company is reportedly going to charge iOS users a little bit more for Twitter Blue when the service comes back online. You might remember that Twitter Blue was put on ice for the time being after users, and trust me, nobody could have predicted this, abused the new verification check by creating impersonation accounts and then flooding Twitter with jokes, harassment, and other mischief. Who could have seen that coming? I mean, it was just out of the blue, right? Twitter Blue? This appears, by the way, to be a response to Apple's policy of taking a cut of each in-app transaction. So in order to recapture the money that would otherwise go to Apple, Musk and team are upping Twitter Blue's subscription fee from seven bucks a month, which is how much it will cost you to subscribe via the web, to eleven bucks a month if you subscribe via the iOS app.
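As a rough back-of-the-envelope check on that pricing, here's a tiny sketch that assumes the commonly cited 30 percent App Store commission on in-app purchases; the episode doesn't state the exact rate, so treat that figure as an assumption.

```python
web_price = 7.00    # monthly Twitter Blue price via the web, as mentioned above
ios_price = 11.00   # monthly price via the iOS app
apple_cut = 0.30    # assumed standard App Store commission; not stated in the episode

net_from_ios = ios_price * (1 - apple_cut)
print(f"Twitter would keep roughly ${net_from_ios:.2f} per iOS subscriber")  # about $7.70, close to the web price
```

In other words, under that assumption the higher iOS price roughly recovers the web price once the commission comes out.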
Speaker 1: I'm not sure how many users will be shelling out for subscriptions at all, but I'm guessing Elon Musk is hoping it's going to be a lot, because reportedly Twitter has been having a doozy of a time keeping its ad revenue going, and something needs to be there to help offset costs. I haven't heard whether or not Twitter is going to do this for Google and for Android users, because Google also takes a cut of in-app transactions. When these stories come up, they almost always focus on Apple rather than Android, and I always find that confusing because both companies do this kind of thing.

Speaker 1: Now, rounding out our Twitter stories is just a quick financial bit of news. Reuters reports that the banks that helped finance Musk's acquisition of Twitter are looking to margin loans to help take the edge off some of the massive debt that the company has, which is around thirteen billion dollars. Now, part of that debt is three billion dollars of unsecured debt, and it has an interest rate of eleven point seven percent. So not only is Twitter in debt, but the interest payments are enormous, and so the banks are looking for ways to kind of migrate some of that debt into other forms that, from a non-technical perspective, are just not quite so scary. Now, originally the plan was to sell debt to investors who would buy the debt in the hopes of being repaid. Right, you buy the debt with the hope of the company that has the debt paying you back with interest. But then, you know, everybody, banks and investors, was looking at the chaos that was going on at Twitter, and essentially everyone said, yeah, maybe not, because who the heck is going to buy up debt when it looks like the company could collapse at any second?
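Just to put that interest rate in perspective, here's a rough, illustrative calculation using the figures as reported in the episode; the actual loan terms may differ.

```python
unsecured_debt = 3_000_000_000  # the roughly $3 billion unsecured slice mentioned above
annual_rate = 0.117             # "eleven point seven" percent as stated; treat as approximate

annual_interest = unsecured_debt * annual_rate
print(f"Interest on that slice alone: about ${annual_interest / 1e6:.0f} million per year")  # roughly $351 million
```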
Speaker 1: Now, all that could change when we get into 2023, because if the team at Twitter can keep things afloat, then it could end up building up more confidence in the company, and maybe then we'll see investors buy up debt. But for the time being, that's on hold.

Speaker 1: Something I didn't talk about last month was how the Board of Supervisors for San Francisco had approved a really scary proposal that would have given police the right to use robots equipped with explosives offensively in situations where deadly force was warranted, such as incidents involving armed suspects. Now, that decision promptly sparked an understandable critical response, as people worried that the police would potentially abuse that technology or that any mistakes made would lead to absolute tragedy. The initial vote passed eight to three in favor, but this particular proposal needs two votes. In fact, all of these proposals need two votes to move forward before going to the Mayor's office. And lo and behold, after that initial wave of criticism hit for this policy, that same group of supervisors then voted eight to three to ban the use of lethal force by police robots, which sends the original proposal back for review, where it could be altered or maybe just outright scrapped. But it sounds to me like the board had a real change of heart once folks pointed out that bombs on wheels controlled by police could lead to truly catastrophic consequences.

Speaker 1: Speaking of robots, Sony says it has the technology to manufacture humanoid robots quickly. All Sony needs is a reason, kind of. The CTO of Sony Group Corporation essentially said that once the company identifies a good use case for humanoid robots, it could go into manufacturing those robots pretty darned quickly, and moreover, there are several other companies that are in a similar position. It's just that we don't really have a compelling use case yet, and I can kind of understand that.
Speaker 1: First of all, though, making a robust humanoid robot comes with a ton of engineering challenges. It turns out that stuff most people take for granted can be really hard problems for roboticists to solve. But assuming that you have solved most or even just some of those engineering problems, you still have the question of what does this robot do? Usually, engineers design robots for a very narrow spectrum of tasks, which reduces the variables that the robot has to contend with and makes the design and engineering processes easier. Now, note I did not say easy, only easier. If all your robot has to do is weld four points on a vehicle chassis as it goes down the production line, well, you don't have to worry about the robot being able to open doors, or climb stairs, or even look remotely humanoid. None of that matters. And honestly, I think the best use for robots is to tackle jobs that people are not good at, or jobs that fall under one of the three classic Ds in robotics, which is dirty, dangerous, or dull. Otherwise, if a human can already do the job perfectly well, there's no real reason to make a robot do the job, because we've already solved the problem; a human can do the thing. But if the job is dirty, dangerous, or dull, then making a robot to do that thing makes sense, because it spares a human from having to deal with it, and that human can go on to do something that isn't dirty, dangerous, or dull. So those are really the only cases I can think of, apart from maybe some applications in social robotics, which is its own weird kind of realm. It's weird because you don't just have to take into consideration what the robots can do; you also have to take into consideration how people react to the robots. And it's the people side that really can be hard for engineers to tackle, because it's not necessarily logical. But that's a matter for a deeper dive in some future podcast.
Speaker 1: Finally, thanks to Gizmodo, I am now aware of the Dyson Zone. The Dyson Zone is the name for a pair of noise-canceling headphones from Dyson, you know, the vacuum cleaner company. These headphones have multiple microphones designed to detect noise from the outside world, and then the headphone speakers counteract that noise by generating sound waves that are in effect opposite to the ones from the outside of the headphones. That's just how active noise-canceling headphones work, because opposite waves cancel each other out. Anyway, a couple of these noise-canceling microphones are unique to Dyson, because they are designed to pick up the noise generated by an air filtration system built into the headphones themselves. Yep, these headphones have an active air filter component. So there's this visor-like peripheral, or maybe I should say it's like a face mask, and it extends the active filtration system to fit over the nose and mouth, although it does not make a perfect seal, so Dyson has already said this is not a device that will protect you from stuff like COVID. It connects to the headphones via magnets, so you can wear the headphones without it if you want to. From what Gizmodo says, it sounds like the headphones are pretty heavy even without the visor, but when you put it on there, you know, they've got a considerable heft to them. Now, when the visor is attached, you can have filtered air blasted at your breathing holes. So if you like listening to your Thumbosaurus tunes, do-to-do, but you happen to sit next to a stinky coworker, well, I've got your solution right here. Sure, you'll look like one of those old stock photography images of a cyborg or something, and sure, you'll need to shell out nearly a grand to buy these headphones, but you'll get fresh air for a little bit, until the battery wears out. That happens to be about four hours if you have it on the lowest fan setting, or an hour and a half if it's on the highest.
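To circle back to that "opposite waves cancel each other out" idea for a second, here's a toy numerical sketch of destructive interference. Real active noise cancellation has to track the outside sound in real time, so it's never this perfect, but the underlying principle is the same.

```python
import numpy as np

t = np.linspace(0, 1, 1000)            # one second of samples
noise = np.sin(2 * np.pi * 440 * t)    # pretend outside noise: a 440 Hz tone
anti_noise = -noise                    # the "opposite" wave the speakers would emit

residual = noise + anti_noise
print(np.max(np.abs(residual)))        # 0.0 -- complete cancellation in this idealized case
```

In practice, tiny timing and amplitude errors leave some residual sound, which is part of why this approach works best on steady, low-frequency noise like engine hum.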
Speaker 1: And yeah, it is gonna cost you nine hundred and forty-nine dollars, so it's probably gonna weed out a lot of potential buyers. You can preorder a pair at a Dyson store starting in March. If you've got a grand burning a hole in your pocket and you don't mind looking, I'm gonna say, odd as you listen to your music and you breathe your filtered air, yeah, you should check it out. If you haven't seen pictures of the Dyson Zone, look it up, because I looked at it and I thought, this can't be a real thing, right? And surely it can't cost a grand, right? And I'm never gonna see someone wearing one of these, right? And I don't know. I mean, in Atlanta, I'm not likely to see someone wearing this unless they're really being like a tech poser or something. But, you know, who knows, maybe it will become the must-have Christmas gift next year for people who really have a whole lot of money to spend on Christmas gifts. That's kind of out of my price range.

Speaker 1: Okay, that's it for this episode of tech news. Hope you enjoyed it. If you have any suggestions for topics for me to cover on TechStuff, including suggestions for big tech news items that came out over the year 2022, because I'm doing a wrap-up episode pretty soon, let me know. You can get in touch a couple of different ways. One way is you can download the iHeartRadio app. It's free to download and use. Navigate on over to TechStuff using the little handy dandy search bar, and there you will see a little microphone icon. If you click on that, you can leave a voice message up to thirty seconds in length and let me know what you would like to hear. If you would prefer not to do that, which I totally get, then you can leave me a message on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.
Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.