Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, December 1, 2022. Where did the year go? Yet again, at the end of a year, I feel like the previous year both lasted an eternity and was over in the blink of an eye. I don't know how that is possible. I guess it's because Einstein found out time is relative. Man, that guy was the worst. Okay, let's get to it. I'm kidding about Einstein. Let's get to the tech news. So far this week, it's getting a little quiet, which is fine. Usually we do see tech news quiet down towards the end of the year, gearing up to CES, and then things get back into crazy town.

So, first up, we have a scary situation here in the United States. I think it's pretty well understood by just about everybody that cyber attacks and cyber warfare are a constant threat. Really, they're constantly happening, whether it's from state-sponsored hacking groups that are working on behalf of a government, or quote unquote independent groups that claim to merely be aligned with some nation's goals but not actually sponsored by that nation, or maybe it's a cybercriminal organization. We've seen tons of attacks on various high-profile targets all around the world. Well, the scary thing I read is that, according to cybersecurity researchers, eighty-seven percent of US defense contractors failed to meet basic cybersecurity regulations. Eighty-seven percent failed. That is a sobering thought. Infosecurity Magazine's James Coker explains that to get a passing grade, they must achieve a score of seventy on the Supplier Performance Risk System designed by the Pentagon. Only thirteen percent of defense contractors managed to do that. By the way, to be considered fully compliant, you would have to achieve a score of 110.
So yeah, this is a little like flunking your big math test, except in this case, your math test describes how prepared you are for cyber attacks and security intrusions. And considering we're talking about key contractors in the supply chain for the U.S. Department of Defense, and that we've had so much conversation over the last two years about the importance and delicacy of supply chains, this is a huge problem. Now, y'all, I have complained in the past about how the average person is pretty bad at practicing basic infosecurity measures, but this is unfathomable to me. You would think that these companies, given the business that they are in, would be particularly careful, as they would clearly be high-profile targets for hackers. But it appears the opposite is true. Now, in James Coker's article, which is called "Majority of US Defense Contractors Not Meeting Basic Cybersecurity Requirements," he explains that part of the problem is that traditionally the US government hasn't been super good at cracking down on these requirements. And, you know, you just have to look at kids to know that if a rule is not being enforced, well, you might just feel like there's no need to observe the rule in the first place. It's almost the same as no rule being there at all. Coker also points out that some of these contractors are smaller companies. They're not huge defense contractors, and not all of them have the assets or experience or knowledge base that you find in some of the larger organizations, and the learning curve to adopting proper security measures is a pretty steep one. It can be tough to do, and that may well be the case, but I maintain that when you consider the potential consequences of a security intrusion into the supply chain of national defense, having to buckle down in order to meet regulations is a tiny price to pay.
While we're on the subject of infosecurity, I have in the past recommended that people adopt password vault systems so that they can create and store strong passwords for all the online services they use, and I still think that's a really good idea. I use one myself. You know you don't want to reuse the same password at all. You don't. I mean, you might want to for the sake of convenience, but you don't for the sake of security. You also want each of your passwords to be difficult to guess, and that means making each password really hard to remember, because by difficult to guess, we're not just talking about for humans. We're talking about for computers too. Now, you can do something really clever, like you can pick, say, three unrelated words for each service, and you string those three words together and that makes your password. This is actually a really good way to make a strong password. But even then, as you add more passwords, like as you have more and more services that you're doing this for, it can get a little tricky to remember which string of words you used for which service. So password vaults help out in this case, right? Typically you use one master password in order to access the vault, and then everything else is stored in the vault so you don't have to remember it. Well, one such password vault is LastPass, and unfortunately, hackers were able to access a cloud storage service used by LastPass, and they were able to access quote unquote certain elements of LastPass users' information. Now, there hasn't really been any clarification on what that means exactly, like what information was accessed, and very little on if anything was even, you know, taken. But presumably any passwords accessed are heavily encrypted, which at the very least makes it unlikely that the hackers are able to do anything with the information they stole, at least not right away. Further, LastPass says it does not store master passwords at all. Instead, when you put in your master password, it goes through what's called a one-way salted hash. This is a hashing process that is not reversible, and it generates a jumble of characters that can then be used as a key.
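If you want to see roughly what both of those ideas look like in practice, here is a minimal sketch in Python. To be clear, this is not LastPass's actual implementation, just an illustration of the general techniques: stringing random words into a passphrase, and running a master password through a salted, one-way key derivation. The word list and parameter choices are placeholders of my own.

```python
import hashlib
import os
import secrets
from typing import Optional

# Hypothetical word list; a real passphrase generator would draw from a large dictionary.
WORDS = ["copper", "lantern", "orbit", "velvet", "glacier", "pepper", "mosaic", "harbor"]

def make_passphrase(num_words: int = 3) -> str:
    """String together unrelated random words to form a strong but memorable password."""
    return "-".join(secrets.choice(WORDS) for _ in range(num_words))

def derive_vault_key(master_password: str, salt: Optional[bytes] = None) -> tuple:
    """One-way, salted derivation: the output can act as a key, but the original
    master password cannot be recovered from it."""
    salt = salt or os.urandom(16)          # random salt, stored alongside the derived value
    key = hashlib.pbkdf2_hmac(
        "sha256",
        master_password.encode("utf-8"),
        salt,
        600_000,                           # many iterations to slow down brute-force guessing
    )
    return salt, key

if __name__ == "__main__":
    print("example passphrase:", make_passphrase())
    salt, key = derive_vault_key("correct-horse-battery-staple")
    print("derived key:", key.hex())
```

The point of the salt and the iteration count is simply that two people with the same master password end up with different stored values, and guessing passwords against the stored values becomes slow.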
Anyway, if you use LastPass, you might want to look into what, if anything, the service is recommending you do. It might be to change your master password, which, you know, you should be doing on the regular anyway. And I still think password vaults are a critical security tool for the average person. By the way, I actually use one myself. I used to use LastPass, but now I use a different one. I mean, I didn't have any issues with LastPass, I just kind of switched to a different service. But yeah, I still think that they're all, you know, important elements of personal data security.

A few weeks back, the Washington Post published an article linking a software company to a US military contractor that raised a lot of eyebrows. So this software company is called TrustCor Systems, that's T-R-U-S-T-C-O-R Systems, and it's in the business of issuing digital certificates, which is an important part of making sure that the sites you visit are in fact legitimate. Certificates are what tell browsers that a site is trustworthy, that there's been this designated authority, or hundreds of them actually, that ends up generating this certificate that says, yes, you can trust this website. So there are hundreds of these companies that issue these kinds of certificates, and TrustCor is one of them. But according to the Washington Post, TrustCor has the same slate of officers, agents, and partners as a company that's been known to make spyware and is in turn connected to a defense company called Packet Forensics.
Now, when you hear the name Packet Forensics, that suggests a company that's in the business of analyzing data transmissions, possibly to intercept communications and pass intelligence along to US government agencies. So this starts to paint a pretty ugly picture. You've got a company that's in charge of certifying trustworthiness that's tied to a company that is effectively spying on digital transmissions. The association has been enough for both Microsoft and Mozilla to stop trusting certificates from TrustCor, which kind of seems to have an ironic name now, doesn't it, and other browsers are likely to follow suit.
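If you have never actually looked at one of these certificates, here is a small sketch, using only Python's standard library, that connects to a site and prints who issued its certificate. The hostname is just an example, and this is only meant to illustrate how a client sees the chain of trust; it is not specific to the TrustCor story.

```python
import socket
import ssl

def show_certificate_issuer(hostname: str = "example.com", port: int = 443) -> None:
    """Open a TLS connection and print the subject and issuer of the server's certificate."""
    context = ssl.create_default_context()      # uses the system's trusted certificate authorities
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()             # parsed certificate as a dictionary
            subject = dict(item for field in cert["subject"] for item in field)
            issuer = dict(item for field in cert["issuer"] for item in field)
            print("site:   ", subject.get("commonName"))
            print("issuer: ", issuer.get("organizationName"), "/", issuer.get("commonName"))
            print("expires:", cert.get("notAfter"))

if __name__ == "__main__":
    show_certificate_issuer()
```

When a browser or operating system stops trusting a certificate authority, certificates chaining back to that authority simply stop validating, which is why the Microsoft and Mozilla decisions matter so much here.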
Anyway, everybody's a spy, kind of like how in the John Wick movies everybody is an assassin. I mean, seriously, the entire economy in the John Wick universe must be assassin-based. I guess that's a discussion for a different podcast.

Across the pond, in the UK, more than one hundred thousand small businesses have joined in a class action lawsuit against Google, and they are seeking more than thirteen point five billion British pounds in lost ad revenues. These are mostly publishers and related businesses, and they're saying that they cannot compete against Google when it comes to the online ad business, which is a business that Google definitively dominates. I mean, you just can't deny that claim. It is obvious that Google dominates online advertising. But these companies also say that Google, through this domination, can essentially dictate pricing and other terms of ad deals, and that this affects the overall ad industry, that these other companies have no choice but to follow Google's lead because Google is so powerful and has so much weight in the industry that all terms are defined by Google, and that Google is acting more or less as a monopoly, at least as far as setting these terms. So the lawsuit argues that these smaller companies have had to sell ad space for much less than they should, losing out on a substantial share of their ad revenue since January 1, 2014. Now, we'll have to keep an eye on this lawsuit to see where it goes, but in general, this falls in line with a larger trend we've been seeing in tech, as more regulators, politicians, and even smaller companies are pushing back against Big Tech's dominant position in various markets.

Okay, that's the first section of tech news. We're gonna take a quick break. When we come back, it's everybody's favorite guy to talk about in tech. Things are about to get Muskie, but first this break.

Okay, we're back, and it's time for the Elon Musk section of the show. But this will be relatively short. We've only got a couple of stories. So first up, and this is a tiny one, but over in China, Tesla has had to issue a recall, or quote unquote recall, I guess that's how we should say that, on more than four hundred thousand Tesla vehicles in order to address a problem with side marker lights on the cars, which it was determined could, under extreme circumstances, potentially contribute to catastrophic car accidents. But a recall ain't a recall, at least not like it was in the old days, because in this case, the recall is actually an over-the-air firmware update. So owners are not going to have to take their vehicles anywhere. They're not gonna have to go back to the dealership or something and give up their car for any length of time. It'll actually be issued over the air, and it still qualifies as a recall legally, even though for the drivers it just means that their lights will behave slightly differently from one day to the next.
Honestly, I do think it's super cool that cars have reached a level of sophistication in which at least some issues can be fixed just by sending out an update, and you don't have to take the car back. It is kind of odd that we have this sort of antiquated system where we have to designate that as a recall, because as much as I dog on Elon Musk and on Tesla, I don't think it's really fair to just say, hey, almost half a million Teslas were recalled in China, because I feel like that paints a very inaccurate picture of what's actually happening. So I personally think we need to kind of update our definitions of what a recall is and isn't, or have some other term for these kinds of fixes, where recall does not bring up this idea that people had to surrender their vehicles or that Tesla had to take them back or anything like that.

Anyway, next up is Neuralink, which is the Elon Musk-backed company that is developing brain-computer interfaces, or BCIs. Now, as brain-computer interface suggests, this is a type of technology that would allow a human to interact with a computer directly through thought, through brain activity. And Elon Musk has been known to get all futuristic with this vision and talk about how one day human intelligence is going to merge with AI. To me, that sounds like he's kind of falling into the philosophy of Ray Kurzweil, a famous futurist who has really pushed forward the idea of the singularity being on the horizon. I think he most recently said he thinks it will be here by 2045. I just can't shake the feeling that these really rich people are just terrified at the thought that one day they are going to cease to exist, so they're kind of feverishly predicting and hoping for a get-out-of-death-free card. Maybe I'm being totally unfair. I could just be so cynical and skeptical that that's how I feel about it.
But, you know, they could be onto something, right? I mean, I don't want to dismiss the concept, but we are definitely far, far away from being able to merge AI with human intelligence. Musk did say he hopes that the Neuralink interface will be implanted in a real human brain within six months or so, because so far the company has been testing this tech out on animals like monkeys and pigs. And to be clear, there are other BCI devices out there, including some that are, you know, attached to real human brains. People are really using these kinds of interfaces to interact with computers for very specific use cases. And Neuralink's design is particularly sophisticated. It's a really cool design, and it has benefits over other implementations of this technology, including a smaller surgical footprint, which is obviously important. You want to reduce or ideally eliminate the risk of things like infection or surgery complications, because you have to implant these things into brains, which means you've got to get to the brain, and that means going through either the skull, or I saw one suggestion that had going through the jugular to feed a chip up into the brain that way. Either way seems pretty extreme to me, right? And, you know, it also is going to allow for wireless transmission of data, because with a lot of the BCIs that exist right now, you have to be tethered to a machine. Now, in the case of BCIs, the way we're seeing them used in a lot of cases is for people who have limited or no mobility, right? So being tethered to a machine, while not ideal, also does not have a huge impact on quality of life, in the sense that these are people who otherwise aren't capable of moving anyway.
But what they are able to do now is to use thought to control electronics in some way, either by moving a cursor and typing things out, or some other form of communication where they're able to interact with their environments and with other people when previously they were not capable of doing that. And honestly, that is the use that I can get behind, and that's in fact what the Neuralink teams are working toward. Really, Elon Musk is talking about AI and human intelligence merging, but these teams are looking at a more pragmatic approach, and one that could have a transformative effect on the life of a person who otherwise would be facing incredible challenges that most of us can't even imagine going through. To me, that is the inspiring thing about this technology, way more energizing and inspiring than thinking that one day I'll be able to complete the New York Times crossword puzzle in pen in just five minutes. I don't see that as being the huge benefit.

And lastly, in our Muskie section, Thierry Breton, and I don't know if I'm pronouncing that name correctly, anyway, this is an EU official in charge of implementing the upcoming Digital Services Act, or DSA, in the EU once that act has been finished and approved. That is, it's still not complete. Anyway, Breton has indicated that Twitter is going to have a lot of work to do in order to comply with EU laws to operate in the EU, the implication being that if Twitter fails to do that, it could potentially be banned from the EU. However, I should add that the EU really focuses on very large online platforms, or VLOPs, and as of now, Twitter has not yet been designated a VLOP, so it's possible that Twitter won't be subject to the most restrictive rules in the EU. It still will have to follow some, but maybe not all, of the super tight restrictions.
I'm sure that Musk would much prefer not having to follow every single restriction that will be coming out of the DSA. Elon Musk actually had a meeting with Breton, and it seemed to go pretty well. Musk said that he thought the rules were all very reasonable. But this puzzles me a bit, simply because what Musk says and what he's been doing appear to be at odds with each other. Because, like, the DSA is going to require transparent and thoughtful sets of policies on things, for example, like banning and unbanning accounts. But Musk just recently announced that, you know, thousands of banned accounts would be allowed back on Twitter, and there's nothing transparent or thoughtful about that approach. It seems, at least from the outside, that Musk is ruling Twitter mostly by whim, which is antithetical to the requirements of the EU. But Musk also has said that he plans to hand control of Twitter over to some other CEO at some point in the future, so maybe by the time the DSA is actually in full effect, it will be a moot point, because Musk won't be the one calling the shots. I don't know. I just feel like the narratives here are at cross purposes with one another. The two things cannot be in alignment based upon what we have seen so far with Musk's version of Twitter. But yeah, confusing stuff. And that is it. That's it for the Elon Musk section this week. Thank goodness. We do have a couple more stories. Before we get to those, we're gonna take another quick break, and we'll be right back.

Okay, got a few more to wrap up. Andy Jassy, the CEO of Amazon, has indicated that the Prime Video streaming unit of Amazon could potentially spin off to become its own company.
This really confused me at first, because, like, the headlines were saying Prime was to become a standalone company, and Prime refers not just to the streaming video service but to the Amazon Prime program, which is one that gets you, like, free shipping and all that kind of stuff for a yearly subscription. But yeah, anyway, he said this during an interview at the DealBook Summit, which just concluded. A lot of stuff happened there, including some stuff about FTX and its implosion, but I don't want to get into all that. Maybe I'll talk about that in an end-of-the-year episode. So it would be interesting to see Prime Video kind of spin off and become its own standalone company. But I've got a lot of questions. For example, if the video streaming division becomes a standalone company, would that mean an Amazon Prime membership would no longer serve as access to the streaming content on the separate company, or would Prime members be able to use their membership both on Amazon and the standalone video service? If not, would that mean they'd have to subscribe to yet another streaming service? I don't know the answers. These are all just hypothetical questions. Anyway, Jassy did not outright say that this is definitely going to happen. He just said that over time the company has looked at opportunities to follow this kind of approach, so I wouldn't be surprised to see it happen. I am very curious about the implementation and whether or not that would affect how you access either the Amazon Prime features that have become very popular at Amazon or the streaming video, because if they split that out, then there are going to be people asking questions like, well, are you going to reduce the cost of Amazon Prime, then? Because if I'm not getting the streaming video, then you're taking stuff away from the subscription, so why would I pay as much when you're taking things away? These are all questions that I just don't have answers to.
Now, I'm sure most of y'all are familiar with the concept of focus groups in entertainment. These groups, which usually consist of, you know, just average people gathered together by market analysts, end up watching early cuts of shows or films, and then they give feedback to studio representatives, who might then take that feedback and send it to producers, who might then phone up directors and demand that the directors make the movie less sad or whatever. You know, like, hey, what if Old Yeller just gets better by the end? That kind of thing. Now, in some cases, focus groups can really help set a project on the right path. Like maybe it turns out that motives are muddled and people don't understand why characters are doing things, and it wasn't the intent of the director for that to happen, it's just how it came out in the edit. Well, if people are confused, it may not be a very satisfying experience. Maybe it's something the director can fix with an alternative edit, or maybe they have to go and do reshoots. Those can all be good things, but it also could lead to a director's or an editor's vision being totally compromised. We've heard stories of that too, where a director essentially loses all control of a film, and really, when you get down to it, in the finished film the editor's touch is at least as important as the director's, sometimes more important. Anyway, all that aside, Netflix is actually going to expand its focus group program. Right now it consists of around two thousand subscribers who are allowed to watch Netflix original content early and give feedback on it. The company plans to expand this to quote "tens of thousands of users around the world" end quote, and that's going to happen starting early next year. So if you're a Netflix subscriber, maybe you'll become a tastemaker. Then you can be the one to tell Tim Burton, hey, please stop messing with the Addams Family's interpersonal dynamics so much.
Ortega is doing a phenomenal job, but you're messing with one of the greatest families in American fiction. Stop it. I might be projecting.

Rolls-Royce, the aviation company, not the luxury car maker, recently demonstrated a jet engine using hydrogen as fuel. So the engine was a Rolls-Royce AE 2100-A, and it was modified to accept hydrogen as combustion fuel. And, you know, hydrogen can be used as combustible fuel, but it can also be used in stuff like fuel cells, and fuel cells use a totally different physical process from combustion. Grazia Vittadini, the CTO of Rolls-Royce, said, quote, "We are pushing the boundaries to discover the zero carbon possibilities of hydrogen, which could help reshape the future of flight," end quote. Now, it is true that burning hydrogen does not produce carbon dioxide, so that's great, but I will have more to say about combustion in a little second. Further, Rolls-Royce said that they got the hydrogen by relying on renewable energy. This is also critically important. So hydrogen is the most abundant element in our galaxy, but hydrogen also bonds with other elements very, very readily, and it forms compounds. We typically do not encounter hydrogen in its pure form. If we did, and we could just harness it, things would be way easier. Instead, we have to harvest hydrogen from some other source. Now, one way to do this is to add kind of a secondary process to something like natural gas mining operations, because that produces a lot of hydrogen in the process. However, if we do that, then we tie our source of hydrogen to ongoing fossil fuel operations. That really just extends our reliance on fossil fuels, right? Instead of saying, let's move away from relying on fossil fuels and depend more on sources like hydrogen, it says, oh, well, while we're depending on fossil fuels, let's also get hydrogen. That means that we become less likely to just move off of fossil fuels entirely.
So really, any solution, quote unquote, that just assumes that fossil fuels are still going to be part of the picture is not great, generally speaking, from an environmental perspective. But that's not the only way to get hydrogen. Another source of hydrogen is water. You know, good old H2O, that's two hydrogen atoms to every oxygen atom. If you run an electric current through water, you can break those molecular bonds, and you release oxygen and you release hydrogen. But in order to do that, you have to generate an electrical current, right? You have to use energy to break those molecular bonds. So Rolls-Royce was looking at renewable energy systems like wind turbines and tidal turbines to generate the electricity needed to harvest the hydrogen. That way they're not relying on, like, a coal-fired power plant, right? So that's good. That's a pretty good ecosystem for getting your hydrogen through means that are not carbon-emitting systems. But here's where we start to encounter a problem, because, yeah, burning hydrogen doesn't create carbon dioxide. However, burning hydrogen in our atmosphere, specifically at higher temperatures, does create other byproducts. Now, the main byproduct is water, and people say, oh, well, water, that's fine, right? It's just water. And that's true. But it also, at high temperatures, can create nitrogen oxides, you know, because there's a lot of nitrogen in our atmosphere, nitrogen and oxygen. So burning at these high temperatures can, as a byproduct, produce nitrogen oxides, which are also pollutants. They can cause respiratory problems, and they're a big contributor to stuff like smog.
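To put the chemistry I just talked through in one place, here is a rough sketch of the reactions involved. The NOx pathway shown is just the simplest thermal route, one of several.

```latex
% Electrolysis: electrical energy splits water into hydrogen and oxygen
2\,\mathrm{H_2O} \;\xrightarrow{\text{electricity}}\; 2\,\mathrm{H_2} + \mathrm{O_2}

% Hydrogen combustion: no carbon goes in, so no CO2 comes out; the main product is water
2\,\mathrm{H_2} + \mathrm{O_2} \;\rightarrow\; 2\,\mathrm{H_2O}

% Thermal NOx: at high flame temperatures, atmospheric nitrogen and oxygen react
\mathrm{N_2} + \mathrm{O_2} \;\rightarrow\; 2\,\mathrm{NO}, \qquad
2\,\mathrm{NO} + \mathrm{O_2} \;\rightarrow\; 2\,\mathrm{NO_2}
```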
So while you could convincingly argue that using hydrogen in jet engines is cleaner than typical jet fuel, I'd have to look at all the analysis to reach that conclusion, even if it seems sensible. It's still not totally free of pollutants. And I think it's a heck of an engineering achievement, don't get me wrong. I don't want to diminish that or dismiss it or anything like that. But I also don't want to ignore one pollutant just because this new approach could eliminate emissions of some other pollutant, right? We have to keep the whole picture in mind. Otherwise we just trade one problem for a different problem. We have to be able to reconcile all of that and determine, okay, well, does this approach make sense? Is the pollutant significant? If it's not significant, then maybe it makes perfect sense to go this way. But if it is significant, if all we're doing is trading carbon emissions for nitrogen oxide emissions, then we still have some tough questions we have to answer. Still, anything that is pushing us away from fossil fuels and toward an approach that is less environmentally dangerous, I think, is ultimately a good thing.

And that's it. That's it for this news episode of TechStuff. Hope you are all well. As I said earlier this week, I am working on an end-of-the-year kind of wrap-up of the big news stories that unfolded in tech in 2022. If you have any favorites that you would like me to cover, let me know. Some of the major stuff I'm obviously going to tackle, like Elon Musk. Twitter obviously is going to have to play a part in that. Meta's crisis is gonna play a part in that. But, you know, if there are specific stories that happened within tech that you think are really important, even if they weren't necessarily huge, but you think they have important implications for tech or consumers or anything like that, feel free to let me know. You can get in touch in a couple of different ways. One way is you can download the iHeartRadio app and you can navigate over to the TechStuff page. You just put that in the search field, and there you will see a little microphone icon.
If you click on the microphone icon, you can leave a voice message up to thirty seconds in length. Let me know what you think. If you would like me to play the message in a future episode of TechStuff, just let me know that as well. I will only do it if you tell me it's okay. The other way, if you don't want to talk into a microphone, and I understand if that's how you feel, you can leave me a message on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.