Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Tuesday, August tenth, twenty twenty-one. As I record this here in the United States, the Senate has just passed a one point two trillion dollar infrastructure bill. Now, for a bill to pass in the United States, it has to pass by vote both the Senate and the House of Representatives in Congress before then being signed into law by the President. Now, the House is not set to vote on this until they come back from recess. That always makes me imagine a bunch of Congress people out on the playground. Anyway, my point is that, you know, with the Senate passing the bill, the House still has its part to play, so changes can still happen to the language in this bill. So nothing is set in stone until it actually passes and then the President signs it into law. Now, the bill that the Senate passed includes five hundred fifty billion dollars of new spending, which is what puts it at the one point two trillion dollar mark total, and that includes stuff that you'd expect, like investment in roads and bridges, that kind of infrastructure, or building out more rail systems for both freight and passenger trains, or building up more transit systems for various regional areas, that kind of thing. But there are some bits that we should mention beyond that. For example, there's a seven point five billion dollar budget for building out charging infrastructure for electric vehicles. Now, you're probably aware there is a very hard push to transition away from internal combustion engine vehicles that depend upon fossil fuels. In fact, there are states that are really pushing to make sure that, you know, no new internal combustion engine vehicles are sold in those states after a given year.
Speaker 1: But in order for this to be practical, we actually need the infrastructure in order to juice up the electric vehicles that will replace all the internal combustion ones over time. Also, just as a reminder, these bills don't outlaw internal combustion engines. It's not like, you know, in twenty thirty it'll be against the law to own a gas powered car. It's just that all new vehicles will need to be electric. And of course we also have to address how we generate that electricity. If the electricity comes from a power plant that's burning fossil fuels, if it's a coal powered power plant, then we're kind of shifting the problem to a different point along the chain. But anyway, that's one part of the bill.

Speaker 1: There's also a sixty five billion dollar budget to expand broadband internet access, something that is sorely needed in many parts of the United States. There are lots of communities here in the US that have little to no access to broadband, and considering how much of our interactions with society now depend upon having a reliable connection to the Internet, particularly during times of crisis like the pandemic, that's a huge problem. It creates an enormous divide that puts these communities at a severe disadvantage. But there are also a few items in that infrastructure bill that could surprise you. For example, one element, one amendment in the infrastructure bill, places a requirement on car manufacturers in the future. According to this measure, car companies will have to incorporate a passive system capable of detecting blood alcohol levels in drivers. And the idea here is that the vehicle itself would be able to tell if a driver were under the influence, if their blood alcohol level exceeded the legal limit, and if that were the case, the car would not start, so the car would not allow the driver to actually operate the vehicle. Now, a lot of people have raised objections to this for various reasons, and I can get behind several of the reasons for those objections.
Speaker 1: Now, I am absolutely against drunk driving, but I also happen to know that tech isn't infallible. There are a lot of ways that this requirement could have unintended consequences. For example, car companies will have to figure out a system that can detect a driver's blood alcohol level and ignore everyone else in the car. So let's say you are a designated driver and you're going to give a lift to your friends, and they've had a few too many. But you are the designated driver, you are sober, and you're doing this so that your friends stay safe and they can still go out and, you know, drink or whatever. The system in this car would need to be able to tell that you are sober, even if your friends in the car are all inebriated. So a passive system that's analyzing, say, the alcohol content detected in the air would somehow need to differentiate between you and your passengers. False positives could cause frustrating and potentially dangerous consequences, and false negatives would be even worse, of course, if the system failed to pick up on the fact that a driver was intoxicated. I think this is a case where we see politicians leaning on a technological solution to a social problem, but the tech isn't quite up to the task, and that means the solution really isn't a solution at all. And because the bill specifies a passive system, something like a breathalyzer wouldn't qualify, because that's an active system. You must actively blow into a breathalyzer to use it. So a passive system would need to monitor alcohol levels without any, like, direct input or action from the drivers. It just seems like it's asking an awful lot of technology, and I'm not sure tech can actually deliver. The bill builds in a little time for car companies to actually do this, which is good.
Speaker 1: I mean, car companies have their production plans set out years in advance, so with the way the language of the bill, or the way the language of the amendment, is formed, it would mean there'd be a range of several years, at the earliest, before car makers would have to start incorporating this kind of technology. That later date is there just in case the tech being mandated is found to not meet the criteria of being reasonable, practical, and appropriate. None of that... I mean, I guess practical covers it, but you know, it doesn't sound like effective is among the three. Maybe practical is supposed to cover that.

Speaker 1: Another element in the infrastructure bill relates to cryptocurrency. Specifically, the bill includes requirements to report cryptocurrency transactions for the purposes of taxing them. After all, I mean, you've got to pay for this one point two trillion dollar bill somehow. There are a few different ways of doing that. You could raise existing taxes, which tends to be a pretty politically unpopular move. People aren't crazy about that idea. You could shift money from some parts of the budget to a different part of the budget, but this also tends to be politically unpopular, because it usually means some senator or other sees money that would go to businesses that are in their constituency diminish. So you'll see senators fight for certain things, not because they think that the thing they're fighting for is right, but rather because the thing they're fighting for represents money that would be going to their state, because the industries that represent that particular element are located in their state. It does get very political. Another thing you could do is try and find loopholes that are in the tax code and close those loopholes, and that would allow the government to collect taxes on stuff that previously it couldn't, but perhaps should have been able to.
133 00:08:28,000 --> 00:08:30,800 Speaker 1: And that's kind of where the crypto part falls, this 134 00:08:30,880 --> 00:08:36,880 Speaker 1: idea that there are these transactions that should fall into uh, 135 00:08:36,920 --> 00:08:40,800 Speaker 1: you know, taxable income, but often don't unless they exceed 136 00:08:40,840 --> 00:08:45,720 Speaker 1: ten thousand dollars. However, the language in the bill, in 137 00:08:45,880 --> 00:08:49,839 Speaker 1: this crypto amendment is ambiguous, and that worries a lot 138 00:08:49,880 --> 00:08:53,240 Speaker 1: of crypto investors. Now, some of that worry might be 139 00:08:53,240 --> 00:08:55,720 Speaker 1: coming from folks who are trying to use cryptocurrency as 140 00:08:55,720 --> 00:08:59,120 Speaker 1: a means of evading taxes, and uh, you know, you 141 00:08:59,160 --> 00:09:02,320 Speaker 1: can't really take their arguments and good faith right there 142 00:09:02,320 --> 00:09:09,280 Speaker 1: trying to circumvent an actual, you know, law, that's not fantastic. 143 00:09:09,320 --> 00:09:12,520 Speaker 1: But others are concerned that because of that language, which 144 00:09:12,559 --> 00:09:15,839 Speaker 1: talks a lot about brokers, and defines a broker as 145 00:09:16,040 --> 00:09:20,640 Speaker 1: quote any person who, for consideration, is responsible for regularly 146 00:09:20,720 --> 00:09:24,760 Speaker 1: providing any service effectuating transfers of digital assets on behalf 147 00:09:24,800 --> 00:09:28,320 Speaker 1: of another person. End quote is pretty vague. It sounds 148 00:09:28,320 --> 00:09:32,360 Speaker 1: like that could cover pretty much everyone in the crypto ecosystem, 149 00:09:32,400 --> 00:09:35,880 Speaker 1: at least for proof of work cryptocurrencies like bitcoin. So 150 00:09:35,960 --> 00:09:39,559 Speaker 1: let me explain. Let's say you've got a fraction of 151 00:09:39,600 --> 00:09:43,319 Speaker 1: a bitcoin, right, You've got like point zero zero whatever 152 00:09:43,880 --> 00:09:46,640 Speaker 1: one bitcoins, and you want to use this fraction of 153 00:09:46,640 --> 00:09:49,200 Speaker 1: the bitcoin to purchase a brand new television from a 154 00:09:49,240 --> 00:09:53,000 Speaker 1: merchant who happens to accept bitcoin is payment. So you 155 00:09:53,040 --> 00:09:58,280 Speaker 1: make your transaction. You pay the required amount of bitcoin 156 00:09:58,400 --> 00:10:01,760 Speaker 1: to the merchant, and you hand over your digital currency 157 00:10:01,840 --> 00:10:06,480 Speaker 1: or essentially sending a digital certificate that transfers from you 158 00:10:06,559 --> 00:10:09,439 Speaker 1: to the merchant. But this transaction has to be verified. 159 00:10:09,800 --> 00:10:12,800 Speaker 1: So this way, the merchant knows for sure that the 160 00:10:12,840 --> 00:10:17,200 Speaker 1: digital currency they're receiving is legitimate, that it's a real transaction, 161 00:10:17,520 --> 00:10:21,560 Speaker 1: and that actual value is being exchanged. So the transaction 162 00:10:21,800 --> 00:10:27,240 Speaker 1: process is part of bitcoin mining, transactions clumped together in blocks. 
Speaker 1: Every ten minutes or so, a block is ready to be verified, and then, once verified, that block goes to the end of the blockchain and becomes the most recent block in the blockchain. That's the chain of transaction blocks that stretches all the way back to the beginning of bitcoin. Bitcoin miners compete to essentially solve a very hard math problem that will then serve as the verification for that block of transactions. So in this scenario, would that mean every single bitcoin mining operation can count as a broker? Would it only be whichever bitcoin miner actually got the math problem correct first? I mean, that would kind of make sense. The reward for solving the math problem is some bitcoin, and that could definitely be taxed. But then you also have the issue that, you know, bitcoin's a global cryptocurrency. The United States does not have jurisdiction over the entire world or anything like that. So it gets messy, is what I'm saying. And the fear is that the vague language in the bill will make it difficult, if not impossible, to execute the law properly without lots of unintended consequences. And this is the kind of stuff you see when, after a survey, you find out that many senators had never even heard of cryptocurrency before. Ignorance of technology often becomes an issue. Now, just to be clear, I'm not in favor of tax evasion, just like I'm not in favor of drunk driving, and I do think there needs to be some sort of system in place to close off loopholes that allow, you know, wealthy people to get even more wealthy at the expense of pretty much everyone else. And you really do have to be wealthy to be a cryptocurrency miner, an effective one anyway, because the hardware you need in order to be competitive is very expensive, and the power bill to supply all that equipment with electricity is also expensive.
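To make that "very hard math problem" a bit more concrete, here is a minimal proof-of-work sketch in Python. It is only a toy under simplified assumptions: the transaction strings and previous hash are made-up placeholders, and the difficulty here (a hash starting with a few zero hex digits) is vastly easier than real bitcoin's target, which uses double SHA-256 against a network-set threshold.

import hashlib

def mine_block(transactions, previous_hash, difficulty=4):
    # Toy proof of work: find a nonce so the block's SHA-256 hash
    # starts with `difficulty` zero hex digits.
    block_data = previous_hash + "|" + "|".join(transactions)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # this hash serves as the block's verification
        nonce += 1

# Hypothetical pending transactions, linked to the previous block's hash.
txs = ["you->merchant:0.001 BTC", "alice->bob:0.5 BTC"]
nonce, block_hash = mine_block(txs, previous_hash="0000abcd")
print(f"Solved with nonce {nonce}: {block_hash}")

Whoever finds a qualifying nonce first wins the right to add the block and collect the reward, which is why mining is a race: more hash attempts per second, from faster and more expensive hardware, means better odds.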
Speaker 1: So I don't necessarily disagree with the motivation for these amendments in this infrastructure bill. Like, I think they're coming from, generally, a good place, but I do worry about the actual language itself and how that will manifest in the real world. And it's always kind of chaotic when politics tries to catch up with tech. We've got a couple more stories to cover, but before we get to that, let's take a quick break.

Speaker 1: We're back, and if you've been listening to TechStuff for a while, you've heard me talk about the issue of bias when it comes to artificial intelligence. Bias influences AI to favor one set of outcomes over others. Now, it's not always a bad thing, mind you. It really depends upon the application. But when it comes to stuff that relates to, say, someone's appearance, or their ethnicity, or something along those lines, bias usually turns out to be a pretty bad thing to have in your system. Well, Twitter recently held a competition among developers and hackers, and at the heart of it was Twitter's photo cropping algorithm. Now, if you've ever browsed Twitter, you've likely seen that photos in the Twitter feed are cropped, and clicking on a photo opens up the full image. So the cropped photo acts kind of like a preview, and it gets all images to fit within Twitter's, you know, established parameters for what photos should look like in the Twitter feed. But these photos, obviously, they have to be auto cropped, because there's just no way that Twitter could employ humans to do it manually. That means Twitter needed to design an algorithm that would make decisions on where and how to crop photos so that they fit within Twitter's interface. And that meant there was the possibility that this algorithm would have certain biases that might favor, say, white faces over black faces. In fact, users in twenty twenty suggested that this was happening.
Speaker 1: So Twitter's competition was all about inviting people to examine the photo cropping algorithm and look for signs of bias in a bug bounty, and it would reward people for the discovery of flaws in the system. This, by the way, is a pretty responsible approach, in my opinion. I mean, of course, it would be best if the algorithm didn't have bias to start off with, but failing that, opening things up to a wider audience of critics and hackers creates the opportunity to isolate and address bugs quickly. At DEF CON, which is an information security conference held each year, lots of hackers go to it, the award for best performance in the bug bounty went to a person who found that Twitter's algorithm preferred faces that are, quote, "slim, young, of light or warm skin color and smooth skin texture, and with stereotypically feminine facial traits," end quote. And I love how this hacker actually put this to the test. The hacker created a GAN, a generative adversarial network, to create computer generated images of human faces. So a GAN does this, well, a GAN does anything, by essentially having two AI systems, two machine learning systems. One of those is in charge of generating something, so in this particular instance we're talking about computer generated images of faces. The other one attempts to figure out which stuff that's being fed to it is real versus computer generated. So in this case, which images were created by a computer and which images are just photos of people? And over time, both systems get better at their jobs. So if the detector is figuring stuff out much more frequently, then you feed that information over to the generator, which then adjusts how it goes about generating stuff and tries again. And then, as the detector's success rate falls, it tries to adjust how it, you know, analyzes images, and gets better at it that way.
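As an illustration of that adversarial loop, here is a minimal GAN sketch in Python. It assumes PyTorch, which the episode doesn't specify, and uses simple one-dimensional numbers standing in for face images; the layer sizes, learning rates, and target distribution are arbitrary choices for the sketch, not anything from the actual Twitter experiment.

import torch
import torch.nn as nn

# Toy GAN: the generator learns to mimic "real" samples drawn from
# a normal distribution; plain numbers stand in here for photos of faces.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.25 + 4.0   # "real photos"
    fake = G(torch.randn(64, 8))             # "computer-generated images"
    # Train the detector (discriminator) to label real as 1 and fake as 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()
    # Train the generator to make the detector call its output real.
    opt_g.zero_grad()
    loss_g = bce(D(G(torch.randn(64, 8))), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()

# After training, generated samples should cluster near the real mean of 4.0.
print(G(torch.randn(1000, 8)).mean().item())

Each side's loss signal pushes the other to improve, which is exactly the back-and-forth just described.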
Speaker 1: So, over thousands of trials of pitting these two systems against each other, you start to create more convincing computer generated images of faces. And in fact, you can create them so convincingly that if you were to look at a series of pictures, you probably wouldn't be able to tell which ones were real photos of actual people and which ones were just computer generated. Some other hackers also found other elements of bias. One found that the algorithm had a negative bias toward people with white or gray hair, which means, before too long, you all won't be seeing any photos of me on Twitter. And another one found that the algorithm would favor English text over Arabic text whenever those would appear in an image, like it would crop out Arabic text, but it would keep in English text. So that was an issue, and Twitter really took the right step here, in my opinion. We've seen other companies like Amazon really double down on denying that there's any kind of bias problem in systems like facial recognition technology, but Twitter instead accepted the criticism and then invited people to really drill down and find the issues, which ultimately, you know, could lead to Twitter creating better algorithms with less bias. So, in my opinion, great job to Twitter, and an even better job to the hackers who found those bugs.

Speaker 1: And in really gross news, NBC reported that a company called Teleperformance, which supports tons of other companies, including Apple and Amazon and Uber, has instituted a particularly draconian practice for employees in Colombia. Teleperformance's employees essentially do contract work for these other companies, often handling sensitive data in the process. And because of COVID, Teleperformance has shifted many of its offices around the world from going into the office to a work from home policy. But in Colombia, employees were required to sign new contracts that included clauses allowing Teleperformance to install cameras inside the homes of those employees.
Speaker 1: Now, why would it do this? Well, to make sure that the employees were doing their jobs, right, and also to make sure that no unauthorized individuals had access to the company's computers. But think about this for a second. I'm sure most of you would object to your boss telling you that they want to set up a camera in your home so that the boss can monitor you as you work from home. Add to that the fact that for many people, living space is limited, so their office might also be their bedroom, and you have serious privacy violations going on right there. Generally speaking, I am opposed to all the monitoring software and hardware that is in use to quote unquote guarantee productivity, because I don't think it serves much purpose beyond being oppressive and invasive. I think it's far more demoralizing to employees than it is helpful to employers. And in this case, the issue goes above and beyond. I don't think there's any real defense for it. According to Interesting Engineering, a website where I read about this, Apple and Amazon have said that they did not request this particular kind of scrutiny for their work. But Uber was a different story. Uber did opt for this level, with the company defending the decision, saying that contractors are working with sensitive customer information, and the risk of that info getting out to someone else prompted Uber to request the higher level of employee surveillance. To which I say, this system is broken. I have one other big story to cover, but before I jump on that, let's take another break.

Speaker 1: All right. Finally, in an ongoing story that I really wish would just conclude already, we're seeing misinformation campaigns about COVID surge here in the United States, while at the same time we're seeing case numbers on the rise. And why is this a tech story?
Speaker 1: Well, it's because the misinformation campaigns are heavily relying upon social networking platforms like Facebook and Twitter in order to spread the misinformation and to convince people not to follow the guidelines that scientists and doctors have come up with, which is making a bad problem monumentally worse. We have variants like Delta and Delta plus and Lambda, and these are virulent enough and resistant enough to vaccinations to occasionally cause breakthrough COVID cases. Now, that's not always the case, and you are better protected if you're vaccinated, but the point being that these variants can break through vaccinations in certain cases. And yeah, on the whole, these cases tend to be far less severe than those for people who are unvaccinated and then get COVID. The vaccinated cases tend to be much more mild with their symptoms, but still not good, and it's still creating more opportunities for the virus to mutate further and more variants to arise as a result of it. And vaccination rates in some parts of the country, like my own state of Georgia, are still far behind where they should be. The New York Times reports that misinformation campaigns about COVID and COVID vaccines increased significantly over the last couple of months. According to Zignal Labs, false information about vaccine effectiveness increased by four hundred thirty seven percent over the summer. Now, some of these campaigns originate from Russian troll farms, which are not just spreading misinformation, they're pushing out disinformation, outright lies. Then you've got people who become banner carriers. They pick up this falsehood, and then they spread that falsehood further, and they promote it and they give it more credence, and the consequences are beyond awful. Obviously, the worst consequence is that more people are getting very sick, and some of these people are dying.
Speaker 1: For the unvaccinated, it's awful because they're getting information telling them not to get vaccinated, which drastically increases the chance that if they get COVID, it will be a serious case and that they could, in fact, die. A lot of them, even the ones who will survive, are ending up filling hospitals, particularly here in the Southeast. That means that in regions like my state, we're again at a point where hospitals are at capacity or over capacity. That means anyone experiencing any kind of medical emergency is in much greater danger because of a lack of medical capacity to treat them. So even if you're totally safe, let's say that, like, I'm staying home, I'm isolating, I'm not going out, I'm not, you know, exposing myself to the possibility of getting COVID, but I take a spill and I fall down my stairs and I break my legs. There might not be any way for me to get to a hospital in good time. I could be suffering for quite some time before anyone can treat me, because all of the local facilities are over capacity. That's a real issue. And that's a minor one, right? Breaking your legs is bad, but clearly there are much more serious medical emergencies that could arise where people would not be able to get the treatment they need. It also means that all the people who did their best to stay safe, you know, the folks who went and got vaccinated, the people who are wearing masks and who are socially distancing, they're going to have to continue to do that or potentially become part of the problem. And there is so much quarantine fatigue out there that a lot of people who up to this point have been super careful are just not willing to do that anymore, because they feel like it's a punishment, right? Like they did everything right, but they're being punished for it by being told, hey, you have to keep doing this.
Speaker 1: Meanwhile, there are these other people who refuse to accept any kind of accountability, who are going out and doing all the fun things without wearing a mask or anything. And I understand that that feels monumentally unfair, and it is hard for me to condemn people who have that reaction. I mean, in full disclosure, I have even gone out to be with friends in a few isolated cases, and we did take precautions, but it still was more risky than just staying at home. Like, I could have just stayed at home, and that would have been safer than going out, even with all the precautions we were taking. But, you know, the situation is not going to magically get better just because we want it to. The virus doesn't care if we're being denied the chance to go out and live our lives. The virus doesn't care about anything other than spreading. So meanwhile, we have these bad actors, and we have people with wishful thinking, who are spreading inaccurate stories about the disease and vaccinations, and that continues to take hold, and people continue to get sick, and our medical system continues to be overtaxed, and people continue to feel demoralized. Politicians continue to struggle with how to handle this situation. Things will not magically get better just because we want them to.

Speaker 1: On a related note, here in Georgia, our politician Marjorie Taylor Greene once again found herself suspended on Twitter. Previously, she received a twelve hour suspension for spreading misinformation on the platform. This time she's received a suspension that will last a full week for repeatedly violating the platform's misinformation policies. In this particular case, she was posting misinformation about COVID and vaccinations on the platform. She continues to be a source of misinformation about those issues. So when our own leaders are taking part in the misinformation campaigns, it can get a bit hard to see the light at the end of the tunnel.
Speaker 1: But to my listeners, to all of you out there who have gone to get vaccinated, who wear masks, who have sacrificed so much in an effort to try and be part of the solution and not part of the problem: thank you. Your sacrifices are meaningful. Even if the people you're sacrificing for don't realize it or acknowledge it, it is meaningful, and I, for one, deeply appreciate the fact that you have gone to those lengths. To those of you who have not been vaccinated, please get vaccinated, please, because as we are seeing, those who are not vaccinated are really suffering the most, and the ripple effect means that we're going to see more variants arise as the virus has more opportunities to mutate, and that's going to place hardship on everyone all over again. And we've already lost so many people. We need to stop that. And to everyone else who falls somewhere in the middle: be safe, make good choices, use critical thinking and compassion in equal measure, and we'll get through this. It's going to take longer than it should have, because there are far too many actors in this who either deny reality or outright want the worst outcome for us and are pushing those narratives. But we can get through it.

Speaker 1: That's it for this news episode. I know that was very soapboxy at the end of it. I know some of you get really fed up with that. I'm not going to apologize for it, because I care too much about you guys. I care about my listeners, not whether or not you download my show. If you choose to never download another episode, fine, I understand. I still wish you well. I care because I don't want people suffering. I don't want you to get sick. I don't want anyone you love to get sick. I want us to be able to put this very dark chapter in the past. With that, let's wrap up. If you have suggestions for topics I should cover in TechStuff, please reach out to me over on Twitter.
Speaker 1: The handle we use is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.