Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? It's time for the tech news for Tuesday, May ninth, twenty twenty three, and we begin, as is our custom, with some AI-related stories, but it's a little bit different this time around. So first up, the Wall Street Journal has an article titled "ChatGPT Fever Has Investors Pouring Billions Into AI Startups, No Business Plan Required." If you've been around for a while, that headline is going to sound really familiar to you, because you've seen the same darned thing happen before. I mean, it happened in the nineties with web startup companies, as investors flooded fledgling companies with more money than they knew what to do with. A lot of those companies didn't have any kind of business plan that would allow them to actually work in the long run, and a lot of those companies collapsed a year or two later and all that money went away. The dot-com bubble was brutal. It happened again in two thousand and eight with the real estate crisis: banks were issuing loans to folks who really couldn't afford to pay those loans back, and then banks were selling that debt to other institutions that were buying debt in order to try and make even more money. It all collapsed on itself to disastrous effect. It happened with cryptocurrency and blockchain companies, and that really flared up with NFTs. We saw it happening with the metaverse as well. So pretty much any time an interesting technological innovation emerges, folks will get hit by FOMO super hard. FOMO is the fear of missing out, in case you didn't know, and they don't want to be the person on the block who can't afford to move into a mansion because they didn't go and pour their life savings into some startup.
We all want to be the investor who got into Apple just before Steve Jobs came back and took the company to the stratosphere, or one of the people who put money into Google when it was just a startup working out of a garage. But this behavior ends up being self-destructive, and destructive in general. Even assuming that the people behind an AI-centric company are on the up and up, there's no guarantee that their idea is going to be practical or monetizable in the long run. Then there are all the opportunists who see a chance to make a killing by presenting a cool-sounding idea without having any intention of seeing it through. They might promise the moon and stars because they see that there are a lot of suckers out there with a ton of money to invest, and they're excited about the prospect of an AI company. It is a perfect storm for parting a fool from their money, in other words. Now, that's not to say that every AI company out there is a bad idea or a scam. I'm sure there are a lot of great ideas too that are built on the foundation of a solid business plan. But we're also rushing toward a time when regulators around the world are starting to square off against AI. Now, I am no financial advisor, and goodness knows I have a terrible track record at predicting whether something is going to be a hit or not. I would just suggest that before anyone invests serious money into any startup, they use some critical thinking and careful consideration before paying out a mountain of cash.

Today, IBM launched an AI platform called watsonx. Now, you might remember that Watson is IBM's AI platform that famously competed on the game show Jeopardy! way back in twenty eleven.
For IBM, that exhibition was really a whole lot of advertising for Watson. The company was hoping that other companies would come up with cool business ideas that would require AI, and since AI was really, really hard to do well, those companies would hire IBM and pay to use Watson as the foundation for their idea rather than have to build everything from the ground up. But this was more than a decade ago, and it was mad expensive to pay for that kind of computing power. Now that we're starting to see insane amounts of investment pour into AI, well, IBM sees that the iron is hot, so it's time to strike. So again, watsonx is meant as an enterprise product. It's meant for companies that want to leverage AI to do something, maybe even something that will end up eliminating jobs for people like, you know, programmers, for example, and then they will lean on IBM to provide the horsepower while they take advantage of the capabilities to do whatever it is they plan to do. That's the basic concept behind watsonx.

NPR's Shannon Bond has an article titled "People are trying to claim real videos are deepfakes. The courts are not amused," and the headline gives you a strong hint at the contents of that article. The article's great, by the way; you should definitely read the whole thing. I think we can all recognize the potential for deep fakes to cause a person real harm. So imagine that a video pops up online that makes it look like you were part of some illegal activity that you had nothing to do with, or that you were doing something you personally object to. You have been robbed of your agency. Someone has effectively forced you, or at least an image of you, to do something that you wouldn't do. That's a violation, and it could do you serious harm. Imagine that your employer sees a video of you appearing to deface property or to take part in some violent act.
It doesn't take long before you start to come up with all sorts of scenarios that are ugly and harmful. But then there's the other side of the deep fake coin. Imagine someone who was legitimately caught on camera actually doing something illegal or unethical or immoral, and imagine that that person tries to weasel out of accountability by claiming that the video is in fact a deep fake: that's not me, that's computer generated. When deep fakes get so good that we can't tell they're deep fakes, the sword cuts both ways. Bobby Chesney and Danielle Citron coined the phrase "the liar's dividend" to describe this situation, one in which a person can deny responsibility for actions that were caught on video by claiming it was faked footage. The NPR article cites a case in which lawyers for a certain Elon Musk claimed that a recorded statement made by Musk was actually a deep fake. The judge was not having it. In fact, she said that to accept the claim that the audio was not real would effectively set legal precedent that would let Musk say whatever he wants to, no matter what the consequences might be, and then fall back on claims that it was all just generated by AI in order to avoid accountability. I imagine such a precedent would be gleefully adopted by countless public figures. It would really help them fight against this so-called cancel culture. The ploy to foist blame on deep fakes so far has not been successful in court, as the headline of the article suggests, and there are companies that develop tools that look for tiny indications that a video was actually faked. So it's not like deep fake technology is perfect and can go undetected. But that's how things are right now. There's no telling if or when we'll get to a point when the fakes are indistinguishable from the real thing, and we may see court systems adopt practices that require anyone using a deep fake defense to provide proof of their claim.
However, the article also points out that the opposite effect could end up happening with that approach. So instead of settling whether something is fake, the increase in cost and time on the part of the prosecution to provide evidence that, yeah, this is a real video, it's not fake, could mean that the prosecution never submits the video in the first place. The defense can say, no, that's fake, you have to prove it's true, and the cost of proving it's true is super expensive. The prosecution might say, you know, we can't submit that as evidence because we don't have the money to pay to prove that it's real. And so it could be that the requirement to provide proof ironically prevents anyone from submitting video evidence in the first place. And really, it just means that we're heading toward a future where people can't trust reality, which is terrifying.

Sam Bankman-Fried, aka SBF, aka the founder of the cryptocurrency exchange FTX, you know, the one that went belly up late last year, says it's all been fun and stuff, but maybe the courts could just, like, toss out the charges against him. Not all of the charges, mind you, but, you know, ten out of the thirteen. So if you don't remember what happened, here's the super fast version. SBF co-founded two cryptocurrency companies. One was Alameda Research, an investment fund, or it was an investment fund, that would take investor dollars and then put them into various crypto companies. The other one was FTX, an exchange where you could swap one type of currency for some other type of currency. Someone leaked an internal document that showed that Alameda had secretly been funneling money belonging to FTX customers in order to pay off Alameda investors, and they were essentially counting on the FTX community keeping enough of their cash inside FTX to cover the transfer, the idea being, okay, when we make the money back, we can cover this.
But when news got out that this was happening, there was essentially a run on the bank. Everyone started to try and pull their money out of FTX, and it all came tumbling down. The feds stepped in and seized assets in an attempt to get as much money back for customers as possible, and folks like SBF were on the hook for committing lots and lots of crimes. Now SBF is saying that prosecutors unfairly targeted him and that FTX wasn't the only cryptocurrency company to collapse last year; it's not like it was an outlier. So essentially SBF's lawyers are saying that because other companies failed, FTX's failure wasn't special and thus did not warrant the speed and ferocity of prosecution that it got. Except that some of SBF's compatriots have already pleaded guilty to charges of fraud. And also, it sounds like kind of tu quoque reasoning, the idea being, hey, because this person did a crime, that means my crime's not as bad. No, it just means that two people have committed a crime. Anyway, the document showing the transfer of funds came out before FTX collapsed. I think that's a big problem. I mean, I'm not a legal expert; maybe it's not a big problem. I would just assume that when you have a document that indicates that something hinky is going on and then the company goes under, that does warrant a closer look. So there may be aspects to SBF's claims that I am completely missing, or maybe even some of his lawyers' complaints against prosecutors and government officials are accurate. But for the moment, I remain skeptical that this ploy is going to work.

Sticking with crypto: yesterday, the cryptocurrency exchange called Bittrex Incorporated filed for bankruptcy protection. Just under a month ago, the US Securities and Exchange Commission, or SEC, alleged that Bittrex was acting as an unregistered securities exchange. Following that accusation, the company shut down operations in the United States.
So Bittrex Incorporated is the US arm of this crypto exchange. This is not unusual; there are a lot of crypto companies that maintain a separate entity specifically for the United States. FTX's former chief rival, Binance, another cryptocurrency exchange, does this too. And while Bittrex Incorporated appears to be going under, Bittrex Global, the part of the company that sees business everywhere besides the United States, has no such future in store for it. It's still, according to the founders, going strong. The US branch says it still has its customers' funds in its possession, and it's requesting time during bankruptcy proceedings to allow customers to retrieve their money before it becomes a big part of this messy bankruptcy proceeding. Investigations into Bittrex indicate that perhaps something hinky was going on, and perhaps there was an effort to cover up said hinkiness, though the company has denied any such hincosity. It did agree to pay twenty-nine million dollars in fines to the Treasury Department relating to stuff like money laundering and sidestepping international sanctions against other countries. So, you know, that's totally different. Okay, we're gonna take a quick break, and when we come back, I'm gonna talk about ads a little bit.

We're back. Okay, and I know, in fact, if there's just one thing I know, it's that people do not like ads. They tell me that all the time regarding my show. Also, just a quick moment of me being real with all of y'all. I work for a really big media company, right? Folks way higher up on the ladder than I am decide how many ads get served on shows like mine. I don't get to make that decision. That decision is made, and I have to abide by it. Now, those ads pay the bills and they keep shows like this one going, so they do serve a purpose. Just know, I'm not the person sitting in the big old chair petting the kitty cat while laughing maniacally. It's one of my bosses.
I won't say which one, or ones. Anyway, The Verge reports that Google is inserting more ads into Gmail. Previously, Google inserted ads into the top sections of the Promotions and Social inbox tabs, you know, the ones that you hardly ever check. In fact, I never checked those unless I was expecting something and couldn't find it in my inbox, so then I looked to see if maybe it got shuffled into one of those tabs. Now Google is testing ads in other places, such as in the Updates filter. This filter creates a view of your email that shows messages that, you know, give you updates about stuff. So let's say you ordered a product online and you just want to see where in the delivery process it is, what was the last update. You might use the Updates filter to weed out all the other stuff in order to find that specific update more efficiently. Well, if you are in the test group, then when you use that filter, it would also show you ads incorporated within the results of the filter, presumably ads for stuff that's not necessarily related to your actual updates. And as I'm sure you can imagine, Google users who have seen this as part of the test are not super happy about the change. Some users shared screenshots that showed the ads appeared in between valid email messages, so you'd have, like, an email, an email, an email, an ad that looks like an email but is labeled "Ad" if you looked off to the side, and then more emails. So the ads are filtered into the list as opposed to all gathered at the top, which is, you know, kind of how Google had been doing things: all the ads were at the top, you would scroll past the ads, and you would get to the actual content. Now they're being interspersed, and that also is something people are objecting to. In fact, it's something that podcasters object to, too.
A lot of podcasters like myself want to have a clear delineation between the episode and the ad breaks, and that's why, before every ad break, I say we're going to take a quick break. I want to make sure it's clear we're transitioning into ads and it's not like some bait and switch. So Google's doing something that the rest of us figured out is not good and ends up undermining trust, but they're doing it anyway, at least in this test. Whether this gets rolled out to the general public, we'll have to wait and see. But I will say, if Google sees that it's making them more money, I think it would be unrealistic to expect them to just back off of it. So we'll see.

Now, at the same time, Microsoft is effectively turning to Google to say hold my beer. TechRadar reports that Microsoft appears to be pushing more ads into the Windows eleven experience. As it stands, when you open up the Start menu on Windows eleven, you can see a display of ads for other Microsoft services, typically stuff like cloud storage or access to other Microsoft suites of programs. So it's not just an operating system; it's also serving as an advertising platform for other Microsoft products. TechRadar's Darren Allan says we might see something similar roll out into the Settings app within Windows, so if you ever have to go into Windows to change settings for something, you might end up seeing ads for Microsoft services in there as well. A Twitter user with the handle @thebookisclosed, who also goes by Albacore, shared some images that were said to be screenshots of the Settings homepage in a test build of Windows eleven, and sure enough, there are more ads for Microsoft products. Specifically, the image that Albacore shared had an ad for Microsoft three sixty five embedded in the Settings home view.
It appears that the price you pay for using Microsoft Windows computers is a persistent barrage of notices urging you to, you know, buy more Microsoft products and services. Now, I should add, this has not been rolled out to Windows users in general. In fact, as of when I was recording this, it hadn't even been confirmed to be real. But assuming that there are beta testers out there who are encountering these kinds of things, we can only hope that they push back against this practice. If nothing else, it's really not doing Microsoft any favors, because various regulators around the world are scrutinizing the company over matters of anti-competitiveness. And if you're integrating ads for your own products into your own operating system, that feels to me like it could be treading dangerously close to territory that regulators would latch onto and say: you are using an unfair advantage here, because you have a dominant position in the operating system market, and for you to be using that to advertise your own services, but not anyone else's, is potentially anti-competitive. I'm not saying that regulators are necessarily going to clamp down on that, but it just surprises me that Microsoft would even pursue this, based upon the pushback the company has received recently around the world. You know, when everyone's giving you side eye, maybe you should cool it for a little bit.

Here in the United States, the Department of Defense is seeking a way to fund new tech projects without first gaining congressional approval. Now, that notice gave me a knee-jerk reaction of whoa, that's not good. But then I read it and I started to get a deeper appreciation beyond the surface-level reaction. So the way things normally work is that the Pentagon identifies a technological need for matters of defense. They figure out: we need this program in order to stay current and to give the nation the best security possible.
That need then has to be presented to lawmakers for approval before it can move forward and get funding. And then there's the actual round of securing funding for the new project, that is, incorporating it into the budget so the department can actually pay to put the tech project into action. This whole process of approval and budgeting can take a really long time. The concern is that as time passes, while you're just waiting to get started, adversaries are rushing ahead with their own projects, because they're relying on less oversight, or they have a streamlined process, or they're being run by the military, so they just approve everything without having to go to anyone else. So the proposal is for a three hundred million dollar allowance within the DoD to use to fund new projects without first getting congressional approval for them. Now, these projects do have to hit several criteria in order to merit a share of that three hundred million dollars. They have to be new projects; any existing project requires the old budgeting approach. And that three hundred million comes out of the overall budget for the Department of Defense, so this is not like a three hundred million dollar check handed out by Congress. The proposal is saying Congress would allow the Department of Defense to carve out three hundred million dollars within its budget to serve this purpose, where, in return for not having to seek congressional approval, that three hundred million can be used to fund various projects. Any project leader would have to get a sign-off from the Secretary of Defense before the project would move forward, and the Secretary of Defense would have to determine that waiting for the normal budget cycle would result in being too far behind, that it would be a dangerous thing. So there are a lot of boxes that have to be checked before this proposal would even be a thing.
Now, as I said, my first reaction upon reading the headline was to be nervous, because I'm not a fan of military organizations having less governmental oversight. But on the flip side, it is undeniable that the pace of technology is way, way faster than the political system, which moves at a more glacial speed. And it's not like there's no oversight at all, nor is it like giving the DoD a blank check to do whatever they want with, and the next thing you know, you've got robot soldiers. So as the proposal stands, I'm cautiously in favor of it, but I'm also old enough to be ready to regret saying that further down the line. Okay, we've got a few more news stories to get to, but it's time for us to take another quick break, and we'll be right back.

So we're back. Ars Technica has a great piece titled "White House challenges hackers to break top AI models at DEF CON 31." So DEF CON is a hacker and cybersecurity conference. It's the type of event where you don't want to bring a personal laptop or smartphone with you; burner phones only, please. It's a place where folks have created all sorts of devices meant to siphon information from all types of sources, so we're talking things like RFID readers and NFC sensors and sniffers and other stuff as well. It is not unusual for the US government or for various companies to issue a challenge to attendees in an effort to find security vulnerabilities, and thus get the chance to patch those vulnerabilities early on, before someone figures out a way to exploit them in, like, a zero-day attack. And as we have seen over the last year, AI is a huge deal. It has the potential to do amazing things, or amazingly terrible things. Figuring out whether various AI platforms are secure is absolutely critical for making sure that we steer AI more toward the good stuff and away from the bad stuff, because weaponized AI is legit scary stuff, after all.
So the goal of this challenge is to have hackers uncover gaps in security and AI design that companies will then be able to address, to minimize the chances of things going really, really wrong in the future. That doesn't mean that, you know, things won't go really wrong; maybe they will. But at least this is a step toward trying to make sure that we're doing the best we can to avoid that outcome. I think it's a necessary step. The people who attend these conferences have the kind of mentality and skill set and knowledge to really put various systems to the test. They look at things in a different way, and by doing that, and using things in ways that weren't necessarily the intended use, you can sometimes find out: hey, this thing you didn't think was important turns out to be absolutely critical, and you need to address it. So I think this is a good step. But also, I would be so intimidated to appear at a DEF CON. I've never gone, and I don't think I ever will. I'm not smart enough; I'm not educated enough in that world. I mean, I could certainly learn a lot and report on a lot, but I couldn't contribute anything other than potentially my own personal data, because I poorly secured a device or two or something. That's kind of my nightmare. So I just stay far away and let the smart people handle it, and then I read up on it.

Now, on to our last news story of the day, which is going to include a lot of Jonathan snark. Technologist and entrepreneur Peter Thiel said in an episode of the podcast Honestly with Bari Weiss that he is so deeply disappointed that humanity hasn't figured out how to conquer death. I mean, what are we even doing, right? Like, we have not spent the time needed to eliminate death, or to figure out whether or not that's impossible.
He also revealed he's gonna get frozen after he dies, put into cryogenic storage so that one day he might be able to be brought back, possibly, if such a thing is possible. And y'all, I feel like this is a theme with certain uber-rich people. I get the feeling Elon Musk falls into this category too. The sensation I get, and this is just my own opinion, is that these are people who are terrified by the notion that one day they will die, just like all the poor people out there die; that all the wealth they've amassed throughout their lifetimes will still not stop the grim reaper from slinging that scythe down on them one day, and then they'll just die like one of the common people. No, thank you. No, these billionaires are looking for a way out, whether it's being turned into a Thiel-sicle or maybe having their brains somehow ported over into a machine, because the possibility of just ceasing to exist is plain unthinkable to them. It is so counter to their daily experience that they cannot accept it. I think that's the fear that fuels a lot of futurists as well, who for years have predicted the technological singularity. To me, that's more of a revelation that they fear that undiscovered country from whose bourn no traveller returns, rather than a genuine prediction of where tech itself is actually going. Anyway, I can't say that I'm really surprised by this. Honestly, it confirms a lot of biases I already had. It doesn't mean that I'm right about everything else, obviously; again, it's all my opinion, just based off of observation. Why not, you know, dedicate a portion of your huge wealth to keeping you going after you've snuffed it? I mean, it makes sense, right? You're a billionaire; why not just put some of that money aside to keep you on ice in case one day you're brought back? I mean, what else is that money going to do, feed the hungry? Why? They're just gonna get hungry again tomorrow, right? Oh gosh, eat the rich, y'all.
Okay, before I sign off, I thought I would do one more thing and mention a couple of longer-form pieces that I came across this week that I think y'all should check out. I'm gonna try and do this more regularly when I come across something that I'm like, this is really good. It doesn't really fit within news, because it goes more into journalism, but I feel like people need to be aware of it. So first up is a piece by David Pierce in The Verge. It is titled "Speed Trap," and it's about an initiative Google had that ended up having really negative consequences. It's a great read. It's a great read. I think you'd really like it. The second piece I want to mention is in The Atlantic, and it's by Ariel Sabar, titled "The Billion-Dollar Ponzi Scheme That Hooked Warren Buffett and the US Treasury." This piece is about a guy who ran a company that built solar-powered generators, like portable generators you could bring to different sites, with the idea of disrupting the portable generator market to create a carbon-neutral approach to portable electricity generation. It was meant for things like construction sites and festivals and stuff. It's a good article. It reminds me a lot of the Theranos story in many ways, because it seems like one of those things where someone comes up with an idea that would be great if you could get it to work, but the idea ends up making the person huge amounts of money before they realize that it doesn't actually work, or at least it doesn't work at the level it needs to in order to be a viable business. Both of those articles are well worth your time. And also, I have no connection to either publication or either author. I don't know the authors, and I'm just a reader of those publications, so there's nothing connecting the back end there. These are just pieces I thought were really good and that more people should check out. That's it for this episode,
the news for Tuesday, May ninth, twenty twenty three. I hope you are all well, and I'll talk to you again really soon.

Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.