Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? It's time for the tech news for May eleventh, twenty twenty three, a Thursday, if I'm doing my math correctly.

Google is currently holding its I/O event. This is where Google invites app developers to attend various keynotes and workshops to learn about new Google features that the developers can integrate into their own work. So yesterday Google held the opening keynote, which is traditionally when the company makes the bulk of its announcements and sometimes includes, like, impressive displays of technology to get folks excited about what's in development. So here's a quick rundown of some of the things Google announced yesterday.

They unveiled several new devices. One of them is called the Pixel Fold, which, as you might imagine, is a foldable mobile device, essentially a smartphone that can unfold into kind of a tablet. It has an OLED screen. When it's folded, that screen measures five point eight inches on the diagonal, but if you unfold it, it becomes a seven point six inch display. That doesn't sound like it gets much bigger, but it's actually almost like a square tablet rather than the rectangular kind you're probably used to. It is more square in shape. I have to admit I've never been the target market for a device like this, like a hybrid device that can be a smartphone or a tablet. But for those who do want that sort of thing, they will need to save up about one thousand seven hundred and ninety nine bucks to get one when they become available.

Google also showed off a new smartphone model, the Pixel seven A. So the A series of phones, like the five A and the seven A, those tend to be kind of a budget model, a slight update but a budget model of the previous year's newest phone.
So the seven A will cost five hundred bucks for the basic version. There's also a version that will support millimeter wave transmissions, but only certain carriers support that. But if your carrier does, then you can get that one for five hundred and fifty bucks when they go on sale. And then Google also introduced a tablet called the, wait for it, Pixel Tablet. It's an eleven inch tablet. That one will cost five hundred bucks when it hits shelves. These have Google's latest processor in them, and lots of other bells and whistles, but you know, I'm not going to dive into all of that because it would take up the whole episode.

Beyond the hardware, Google also talked about some changes to Google products and services, and a huge one is Google Search. You know, it used to be that search was the only thing we really associated with Google. These days, I don't know what most people think of when they think of Google. I still think of search, but then I was around before Google Search really became a thing, when I was using WebCrawler. So yeah, I don't know if everyone still thinks of Google Search. But that's where the big change is coming. And you know, Google Search has remained relatively unchanged for ages, although you could argue the incorporation of ads has switched things up a few times. But this new feature, which will be an opt in experience, meaning it's not just the standard experience, you do have to elect to be part of it, is called Search Generative Experience. And as that name suggests, if you click into that, it means that your search query could include an AI generated answer at the top of your search results, and that might mean you don't even need to go any further to find out more about whatever it is you're searching for. Like, if you're searching for a fairly simple or, you know, not too in depth topic, then you'll get a little summary and you might not have to click through to a different web page.
Or maybe you use the answer that's provided by AI to ask follow up questions and get more in depth answers and still not click through. This is the kind of stuff that makes web page administrators really nervous, because if no one's clicking through, then there's no traffic. If there's no traffic, then there's no revenue. Unless you're, you know, a subscription based website and people just don't bother to cancel their subscriptions. But otherwise, yeah, it's a scary kind of thing. In fact, this is something that a lot of websites have been concerned about for a while, but it could potentially be helpful to users as long as the answers are reliable. You know, as we have seen, that's not always the case.

The company also has opened up Google Bard. Before, it was in a beta test and you had to be invited to be part of it. But now you can just use it if you want to and maybe see if it's improved. Since Google developers expressed concerns about its dependability and accuracy, I would recommend that if you do use Google Bard, don't, you know, rely on it to give you safe instructions on how to go scuba diving, because developers pointed out that an earlier build of Google Bard gave instructions that could be harmful or even fatal if you followed them. So yeah, I would not depend entirely upon AI responses right now.

Google also announced that it is updating its operating system for wearable devices. This is fittingly called Wear OS, that's W-E-A-R, Wear OS. So the fourth generation is what is currently in development. Just so you know, Wear OS three hasn't yet fully deployed, so it's interesting that they were talking about the next generation when the current one hasn't really fully rolled out yet.

There's lots more too. Google also showed off an AI tool that Pixel users, meaning the smartphone and maybe the tablet as well, will get to use to help them edit photos.
So let's say you wanted to change the brightness of just one part of the photo without affecting the rest. You could do that. Or, and this was really cool, they showed off an example of this: let's say that you took a picture of your friend. Say your friend is standing near a cruise ship, and it's a good picture, but you just wish that you had had your friend step, like, six inches to the right. Well, with this tool, you could shift your friend six inches to the right, and AI would effectively paint in the background so that it wouldn't look as if you had moved the person in the first place. Like, it should seem as if you could move your friend to wherever in the frame you want, and the AI will just take care of the background so that it doesn't look like you messed with it at all, which is really cool. Also kind of scary, because when you get into that real time photo manipulation stuff where AI is kind of covering for you, it can raise a lot of potential misuse scenarios further down the line. But still kind of neat if you're just using it casually for the purpose for which it was intended.

There was a neat privacy announcement. Android devices will be able to detect Bluetooth devices that were intended to track stuff, so like Apple's AirTags, right? The purpose of AirTags is so you can slip one into, say, your luggage, and then you can use the AirTag to find that piece of luggage if you get to your destination and your luggage doesn't, that kind of thing, right? Well, some people obviously have been using AirTags to tag other stuff, not just like luggage or whatever, but sometimes, like, I don't know, a person that they wanted to follow. Like, a stalker would use it to slip the AirTag into, say, a car and end up stalking someone that way. Well, now Android will be able to detect these sorts of devices.
So if it sees that there's some Bluetooth device that is traveling with you, like it's detected, and it's still detected even as you move through your environment, like if you're driving down the street, it will give you an alert so that way you can check to make sure that someone hasn't bugged your car or whatever. I assume that it's only going to do this when it does detect that stuff is moving around with you, so that you don't get crazy numbers of notifications when, say, you go to the gym and everyone's using Bluetooth earbuds or something. Google also made some other announcements, but I think that that's enough for right now, and we could probably dive into more in future episodes of Tech Stuff.

Over in Russia, the Russian government has fined Google's parent company, Alphabet, the staggering sum of three million rubles, which is, let me just convert this into dollars... oh, it's about forty thousand bucks. Okay, that's not a lot. But then you might say, okay, well, why did they fine Alphabet in the first place? What was the infraction? Well, the Russian government says Alphabet failed to remove videos on YouTube that the government has determined contain misinformation, primarily about Russia's war against Ukraine, which has not been going well for Putin, but the Russian government would prefer that not be publicized within Russia itself, and also any videos that contained what the government has referred to as LGBT propaganda. The government classifies any content that quote unquote promotes homosexuality as propaganda, and whoever is responsible for publishing it can face fines. Now, I would make some very judgmental comments about this, about how Russia has targeted these folks.
But I feel like before I can start casting stones at Russia, I gotta aim a whole lot closer to home, because currently we're seeing an increased movement against the LGBTQ plus community in various states here in the US, including some that border my state, and I imagine my state, which has perhaps some sympathetic politicians in charge, could follow suit. So I can't get too up in arms about Russia doing this. Although, obviously, Russia has gone to extremes to suppress anyone in the LGBTQ community there, we're seeing similar things happen here in the States, so I can't really just single out Russia. Anyway, I don't know if Google will bother paying this fine. I mean, maybe they will. It's only forty thousand dollars. I'm sure executives could just root around in their couches and come up with that. But Google, like a lot of tech companies, has already pulled out of Russia. They closed down their offices based in the country, and while Google services are still available in Russia, the company doesn't operate there, like, on a corporate scale.

Speaking of YouTube, there's apparently a new anti ad blocker feature on the service, at least for some users. According to Reddit, a few folks have encountered messages saying that ad blockers are not allowed on YouTube, and if you want to watch YouTube without ads, you need to become a YouTube Premium subscriber, or your other option is just to disable your ad blocker and then you get ad supported YouTube access. Google has seen a lot more people subscribe to YouTube Premium over the past couple of years, so it is possible that this is really meant to push more folks toward that. In fact, the redditors said that the pop up message included a button to make it super simple to subscribe to YouTube Premium, so you could just click there in the pop up window and do it. So far, this message has been part of a limited rollout experiment. It's not going to everyone.
It's just going to a subset of YouTube users. It is possible that YouTube will not expand this to the general user base. But as I've said before, if a company sees that a tactic is effective and is generating revenue, then they're more likely than not going to adopt it wholesale. It's hard to back off of that. And as full disclosure, I've actually been a YouTube Premium subscriber for ages. Although, to be fair, I was actually a Google Music subscriber, but then Google sunset that service, and when they did, they ported my subscription over to YouTube Premium, because there's YouTube Music as well, and I just kept it because I like watching YouTube without ads. But I also understand that, you know, you can't get something for nothing, or else no one can make these things anymore. Speaking of that, this is perfect timing. We're going to take a quick break so that we can listen to a few messages from our sponsors, and then I'll be right back.

Okay, here we go, back with the news. Sam Altman, the CEO of OpenAI, is about to appear before the US Senate Judiciary Subcommittee on Privacy, Technology and the Law. This happens next Tuesday. Senators want to have a conversation with Altman about his perspective on how governments should approach AI. You know, what sorts of laws might need to be updated, or even new laws potentially proposed, in order to allow for the safe and responsible development and deployment of AI while minimizing risks and threats and other types of problems. Now, in the past, Altman has been pretty candid about his opinions, and they aren't always what you would expect from a guy who runs a company that is known for perhaps the most high profile implementation of AI that's out there right now, at least among the general public. He has said more than once that ChatGPT is not as miraculous as some people made it out to be. And remember, he's the guy running the company that made ChatGPT.
He has already met with political leaders about AI safeguards recently, so I think this is a really good step. I don't think it's just going to become political theater where Altman's being grilled for all this stuff. I feel like there's the potential of actual collaboration here. Now, I have never met Altman, I don't know him personally. I get the sense that, I mean, he is trying to walk the line between running an AI company and also trying to manage expectations in a field where the hype cycle can take off without warning. We've seen that recently with reports about how investors are pouring billions of dollars into different companies that are centered around AI, even if those companies don't have, you know, a business plan or something. So I hope that this meeting next week is informative and productive, and that perhaps lawmakers can start looking into things that need tweaking, like, you know, right to publicity and right to personality laws, as an example.

Meanwhile, over in the European Union, lawmakers there are also nearing the final step toward regulating AI. So the EU has a big set of rules about AI that's nearing the point of adoption. And these rules cover a lot of ground, including stuff like the rules around facial recognition, for example, that it should not be used in public spaces, that law enforcement should not be relying upon it, because, as we've seen here in the United States, facial recognition technology is far from perfect. There's a tendency for this tech to have a bias that makes it difficult for facial recognition systems to differentiate people of color, for example, and this leads to disproportionate harassment when law enforcement relies on such technology to identify and track suspects. So the EU doesn't want to perpetuate that, and these rules would forbid law enforcement from relying on facial recognition technology for those purposes.
The EU's approach has been to categorize AI not in terms of the AI's capability, but rather in terms of its perceived risk, which is interesting. Like, it's interesting to think about classifying AI by the perceived risk it poses to people. So if an implementation of AI is perceived to be dangerous, well, then it'll likely get the unwanted label of unacceptable, like the level of risk is unacceptable, so we cannot allow it. It would be against the law to implement AI for that specific purpose. If the AI is meant to, I don't know, just automatically tally up calculations in a spreadsheet, it would probably get a much lower risk assessment assigned to it, although you could argue that even that application of technology can potentially have really nasty consequences. It all depends upon what that spreadsheet is all about, right?

Now, I have yet to actually read the law. The law itself has been in the making for a couple of years. This has been a long process, with lots of different parties having input as the EU lawmakers have tried to structure this proposed bill. So I am not able to give my full opinion about this because I haven't read it. I do think that it's encouraging to see lawmakers get to this stage. It took a long time to get there, but at least we got there. So often when I'm talking about AI and the law, I'm just talking about this nebulous era that we're in right now where very little appears to be happening. Well, things are happening in the EU. This is, by the way, super complicated stuff. You want to make sure that you protect citizens and all that, but you also don't want to enter into a situation where you're outright preventing research and development on tools that could potentially create enormous public good. It is not an easy thing to do. I think we often, myself included, get frustrated at our leaders for taking so long to adopt rules and regulations meant to protect people.
Often we feel like whatever regulations are out there aren't to protect people, but are there to protect other things like corporations, which here in the US are treated as people. But you know, it's refreshing to see some progress on that front, at least in the EU.

We also have a few stories about other places that are adopting rules that relate to AI. For example, lawmakers here in the United States, specifically in the state of Minnesota, are closing in on passing a bill that would make it a crime to create a deep fake image or video that shows someone appearing to have sex without that person's consent. So, in other words, it's a law that would make deep fake pornography illegal if the person being depicted in the video or image didn't agree to it in the first place. Likewise, it would be illegal to use deep fakes in an effort to spread election disinformation. Now, the state's House of Representatives had already voted on a version of that bill, and yesterday the state's Senate followed. Because the lawmakers are in two houses, the House of Representatives and the Senate, both at the state level, and then we have a federal level here in the US as well, this is just for the state of Minnesota. So the Senate unanimously voted in favor of its version of the bill. However, the Senate's version is technically more strict than the House version. The House version had an exception built in with regard to things like free speech and satire and parody, but to get further into that would get far more complicated. Now, the Senate's version of the bill will go back to the House of Representatives. They will then discuss the changes made and vote on whether or not to adopt it. If they do vote to adopt it, which seems like a pretty strong bet, there's a lot of support for this bill, it would then move to the governor to be signed into law. Minnesota is not the first state to do this.
They would follow states like Texas and California that have already put laws on the books that make certain uses of deep fake technology outright illegal.

You might remember a story earlier this month about how IBM's CEO speculated that as many as seventy eight hundred open positions, which is just a small group of a much larger section of open positions that are all under a hiring freeze right now, those seventy eight hundred jobs could ultimately go to AI rather than being filled with human beings. Well, now IBM's Chief Commercial Officer, Rob Thomas, has said that managers who refuse to use AI, or who fail to learn how to use AI in the context of their jobs, will find themselves replaced with managers who do use AI. Now that, at first when you hear it, sounds a little sinister, but honestly, I think this is more in line with how most AI experts have framed the best use of this technology for the last several years: that AI is not meant to outright replace people, but rather to augment employees' abilities so that they can do their jobs more effectively, and also to let them focus on the parts of the job that are more rewarding and are less suitable for automation. Right? So someone who is willing to use AI in order to do that will be seen as a more valuable leader than someone who isn't. While the CEO's revelation about potentially automating thousands of jobs outright is, I think, a controversial one, I think mister Thomas's point is much less controversial. It makes sense as long as you're talking about using AI to augment your ability to do your job. If he in fact is talking about managers using AI to eliminate entire divisions of actual employees, then no, I can't sign on for that. I think that's a bad idea.

Microsoft's quest to acquire Activision Blizzard continues to hit more roadblocks.
The competition regulator of the UK has now passed an interim order that forbids the two companies from purchasing an interest in one another, so they can't, like, Microsoft would not be allowed to purchase any ownership of Activision, or vice versa. While the UK is standing firm against this acquisition, the European Union is expected to actually approve the deal sometime this month. So if you're keeping score, Japan has already approved this deal, the EU is on the way to approving the deal, the UK has voted against the deal, and the US is teetering toward opposing the deal but has not yet firmly come down. So it's really still kind of a coin flip situation. And I honestly don't know how things go if everyone other than the UK approves this deal. Like, I don't know how that ends up working, because global company structures and the rules regarding them confuse the heck out of me. Maybe some people out there know exactly how this would unfold. I am not one of them, so I'm not going to try and guess.

Disney Plus saw a drop in subscribers for the second quarter in a row, with four million people deciding they would let it go. Let it go. However, at the same time, the company reduced the losses that it has experienced running the streaming service. So while more people have left, the actual financial losses have decreased. They're still operating at a loss, they're not profitable, but they're not losing as much money per quarter as they had been. So producing content for streaming is expensive, as you might imagine. Right now in the US, it's impossible, at least for US based operations, it's impossible because the Writers Guild of America is on strike. But on an earnings call, Bob Iger, who is CEO for the second time now, said that the company will unite the Disney Plus and the Hulu streaming services into a single service.
And for someone like me who subscribes to one of those, that being Disney Plus, but not the other, that being Hulu, that sounds pretty great. One thing that's less great is that the plan is also to increase the monthly subscription fee for those who want an ad free experience. So yes, you will have access to more content, but you'll also be paying more to get that content if you don't want ads. I don't know about the ad supported one; that may remain the same. The price hike could get here before the unified streaming platform does. That also raises the possibility that the price could go up again a second time once Hulu and Disney Plus have tied the knot. I don't know. Iger said this unified platform should launch before the end of the year, and I may have to find someone who is an expert in streaming strategies to come on the show so we can really talk about the challenges that streaming companies face when they're running these services. We've seen it time and again, from Netflix to Warner Bros. Discovery to Paramount Plus and beyond. There have been a lot of examples of services trying new approaches to generate revenue and to reduce costs, and in some cases to merge with other services. You know, Warner Bros. Discovery and Paramount Plus have both gone through that, and now Disney is doing it too. So as someone who just can't bring himself to subscribe to everything that's out there, I'm actually in favor of a little consolidation in the field. I don't want it to get out of control, but to reduce the number of different streaming platforms so that I'm not having to, you know, manage eight different subscriptions or something, that sounds good to me. I just don't want to see streaming subscriptions reach the same heights as cable, right, because then we're just back to the model that was disrupted.
Now, it may be that it's necessary to do that in order to fund the platforms so that they can actually create the content that we want to see in the first place. But yeah, this is the complicated nature of streaming. That's why I need to get an expert on the show, so we can kind of talk it out and really understand all the different factors that go into the streaming platform business.

All right, I've got three more news stories I want to talk about that are pretty interesting. But before we do that, let's take another quick break for our sponsors.

Okay, we're back, and this is potentially really exciting news. So Helion Energy, which is aiming to bring fusion power not just into reality, but to make it a viable commercial service, has announced that it has landed its first customer, which is Microsoft. Microsoft has agreed to buy a certain amount of energy from Helion Energy once its fusion reactor goes into commercial service. So fusion power is what the sun runs on. Stars run on fusion. It is the opposite of fission, which is what traditional nuclear power plants rely on. So fission, nuclear fission, is where you take a heavy atom and you expend some energy that causes this atom to break apart. As the atom breaks apart, it releases way more energy, and you harness that to generate electricity. Typically you do it by superheating water into steam, which then turns turbines and generates electricity that way. That's your typical way of creating electricity through a nuclear power plant. It's not that different from using other types of energy to heat water up and turn it into steam so that you can turn a turbine; it's just that it's done with nuclear fuel. Now, fission power plants have some real perception problems. Some would argue that that's all they have, that those are the only problems, it's perception, and that everything else is totally, you know, cromulent.
I'm not quite at that level, but I do think that the opposition to nuclear power is perhaps predicated more on outdated concerns. Not all of them are moot, but some of them are. Anyway, a lot of people associate nuclear fission power plants with nuclear waste, which is understandable. It's scary stuff because it can remain dangerous for tens of thousands of years, and thus nuclear power plants are also often seen as a potential threat to safety, also understandable, because we've had some high profile examples of that happening. Even if you could argue, and you can, you can argue that our reliance on things like fossil fuels has led to far more deaths and medical complications over the course of its history than all the nuclear mishaps and disasters added up together. Right?

So nuclear fusion doesn't have these perception problems, because what it does is you take very light atoms, stuff like hydrogen, and then you use some energy to fuse them together. So classically we would take hydrogen and then, applying a lot of energy, really force these atoms together so that you can overcome the electrostatic repulsion that would otherwise resist fusion. You overcome that, they fuse together, and then you get helium, and that process also releases an enormous amount of energy. So there have been several experimental fusion reactors around the world that have achieved fusion. They have done it, but there are still hurdles in the way of making that a viable means of producing energy. So, for one thing, most of these experiments actually required more energy to fuse the atoms together than they got from the reaction. So if you're expending more energy than you get out of the process, that's a net loss. That doesn't work, right? Like, you might as well be using that energy directly to produce electricity as opposed to using this middleman where you've got a net energy loss.
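To make that break-even point a little more concrete, here is a rough way to write down the bookkeeping. The symbols are just illustrative shorthand for the idea described above, not Helion's actual figures:

\[
Q \;=\; \frac{E_{\text{out}}}{E_{\text{in}}}
\]

where \(E_{\text{in}}\) is the energy spent heating and confining the fuel and \(E_{\text{out}}\) is the fusion energy released. \(Q < 1\) is a net loss, \(Q = 1\) is break-even, and a commercial plant would need \(Q\) comfortably above one, because converting the released heat into electricity is itself far from one hundred percent efficient.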
Even in the cases where you could argue you got more energy out, and there are a couple, they very narrowly focus in on specific parts of the process and ignore everything else. But even if you get past that, you still have the challenge of making this a persistent reaction so that you can continue to produce energy and not just have, like, a spike of energy production and then you have to essentially refuel and do it all over again. So there's that hurdle as well. So this is very, very challenging to do. It's hard to do. However, Helion says it's aiming for commercial power generation by twenty twenty eight. That's crazy soon, way earlier than what most folks have predicted. Now, I would love to see this happen. I have my reservations, I have my doubts, but if Helion can actually pull this off, well, it could serve as an incredible model to totally overhaul our energy infrastructure. Fusion could provide a source of clean energy without stuff like carbon emissions or nuclear waste. The actual fuel we'd be using would be, well, I mean, technically it's the most plentiful stuff on the planet, although you do have to spend a lot of energy to get it. But if we could do that without it taking a huge chunk out of the energy we're producing through the process, it's worth it. So it would mean that we could rapidly transition off of fossil fuels. That would be, like, literally ancient technology in comparison. But we have to hope it all works out. We can't assume it's going to work out. Hoping, I think, is okay, but assuming is not. And I should add, there are experts in the nuclear fusion field who remain skeptical of Helion in general, who have said that it's a company that has shown more on paper than in reality. So whether this company can actually produce energy starting in twenty twenty eight on a commercial level remains to be seen. I hope it works out, because if it does, that would be transformational, absolutely transformational.
But you know, I don't know how sure a bet that is.

Up in Canada, specifically in a suburb of Montreal called Brossard, that community is testing out a new traffic light, and it's a traffic light that's kind of like Santa Claus, except instead of knowing if you're sleeping or if you're awake, it knows if you're obeying the speed limit, and if you're not, no green light for you. So this traffic light is meant to calm traffic, and it monitors a driver's speed as they go down the street and approach the light. If they're going above the speed limit, then the light turns red and the driver has to slow down and come to a stop. But if the driver is within the speed limit, the light stays green and they can just pass right through. So those of you all in Europe might be saying, we've had traffic lights like this forever, but this is the first time it's been used in Canada. And for those who are in, like, Canada or the United States and you're having trouble imagining this, this is not a traffic light that's at an intersection, because that would be crazy, right? If it's a light that would just be green whenever people are going the speed limit and red if they were going too fast, and it's at an intersection, then you'd just end up with a lot of, you know, low speed collisions. It would be the intersection everyone would avoid. No, this light is just right smack dab in the middle, along the side of a street. So I would love to actually see these kinds of things rolled out in my neighborhood, because I sometimes see people zipping down our little side streets way too fast in an effort to bypass traffic that's on the main streets. And there are a lot of kids who live in my neighborhood, and I worry about them, because folks who are frustrated with traffic on a big road are aggressively driving down side streets, where you're not meant to go that fast, in order to just get around it.
So 573 00:35:53,840 --> 00:35:56,440 Speaker 1: I would like to see this incorporated in other places. 574 00:35:56,719 --> 00:36:00,640 Speaker 1: Then again, I also don't drive, so maybe the drivers 575 00:36:00,640 --> 00:36:02,719 Speaker 1: out there are screaming at me, saying no, I don't want 576 00:36:02,719 --> 00:36:07,360 Speaker 1: a light telling me I can't speed. And finally, 577 00:36:08,080 --> 00:36:11,600 Speaker 1: an enterprising young woman who has already created a successful 578 00:36:11,760 --> 00:36:15,360 Speaker 1: career as an influencer is doing something a bit bold 579 00:36:15,480 --> 00:36:18,560 Speaker 1: and perhaps controversial. So her name is Caryn, or 580 00:36:18,560 --> 00:36:22,640 Speaker 1: maybe it's Karin; it's spelled C-a-r-y-n, so I'll say Caryn, 581 00:36:23,000 --> 00:36:26,440 Speaker 1: Caryn Marjorie, and my apologies if I just picked the 582 00:36:26,480 --> 00:36:31,600 Speaker 1: wrong pronunciation. I am old and Caryn Marjorie's like twenty three, 583 00:36:31,760 --> 00:36:35,040 Speaker 1: so we are worlds apart. I have never 584 00:36:35,080 --> 00:36:37,720 Speaker 1: heard of her before, but she has like two million 585 00:36:37,800 --> 00:36:40,879 Speaker 1: followers on Snapchat, so lots of people do know who 586 00:36:40,880 --> 00:36:45,560 Speaker 1: she is. And she teamed up with some developers who used, 587 00:36:45,640 --> 00:36:49,080 Speaker 1: you know, thousands of hours of her content to essentially 588 00:36:49,120 --> 00:36:53,200 Speaker 1: build an AI version of herself. And you might say, well, 589 00:36:53,239 --> 00:36:56,680 Speaker 1: why would she do that? She does it so 590 00:36:56,840 --> 00:37:01,520 Speaker 1: that she can rent out artificial Caryns to folks who 591 00:37:01,560 --> 00:37:05,640 Speaker 1: want to have her as a friend or maybe a girlfriend, 592 00:37:06,480 --> 00:37:10,680 Speaker 1: or maybe a casual fling. So the AI version of 593 00:37:10,760 --> 00:37:15,239 Speaker 1: Caryn has a computer-simulated version of her voice. It 594 00:37:15,320 --> 00:37:18,400 Speaker 1: is Telegram-based, so it's like a voice-chat-based 595 00:37:18,719 --> 00:37:23,840 Speaker 1: AI version of her, and supposedly it has essentially a 596 00:37:23,880 --> 00:37:26,520 Speaker 1: copy of her personality, or at the very least some 597 00:37:26,800 --> 00:37:31,560 Speaker 1: version of her public personality. So interested customers can rent 598 00:37:31,640 --> 00:37:36,759 Speaker 1: time with AI Caryn for about a dollar a minute. And 599 00:37:36,920 --> 00:37:39,480 Speaker 1: she actually launched the service like a week back as 600 00:37:39,520 --> 00:37:42,560 Speaker 1: a beta launch, but this week it, you know, 601 00:37:42,680 --> 00:37:46,200 Speaker 1: emerged from beta, and it's called CarynAI, so 602 00:37:46,400 --> 00:37:50,640 Speaker 1: C-a-r-y-n dot ai. Fortune reports that the beta test 603 00:37:50,719 --> 00:37:54,719 Speaker 1: netted more than seventy one point five thousand dollars in 604 00:37:54,920 --> 00:38:00,920 Speaker 1: one week. Yowza. So users can connect to the AI on Telegram, 605 00:38:00,960 --> 00:38:03,520 Speaker 1: they can talk about whatever they like. It may not 606 00:38:03,760 --> 00:38:05,480 Speaker 1: come as much of a surprise to most of you 607 00:38:05,640 --> 00:38:08,600 Speaker 1: that the overwhelming majority of people who tested it out 608 00:38:08,680 --> 00:38:12,240 Speaker 1: happen to be male. Personally,
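[Editor's note: as a quick back-of-the-envelope check on that figure, assuming the roughly dollar-a-minute rate applied across the board, the beta week works out to on the order of a thousand hours of paid chat.]

\[
\frac{\$71{,}500}{\$1\ \text{per minute}} \approx 71{,}500\ \text{minutes} \approx 1{,}190\ \text{hours of conversation in a single week.}
\]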
I think it's a clever 609 00:38:12,440 --> 00:38:16,239 Speaker 1: business move. Caryn is leveraging her influence as well as 610 00:38:16,320 --> 00:38:23,160 Speaker 1: leveraging the technology's capabilities. She's kind of turning the 611 00:38:23,239 --> 00:38:26,759 Speaker 1: tide on things like deepfakes, right? Because a lot 612 00:38:26,760 --> 00:38:28,680 Speaker 1: of times when we look at deepfakes, they're being 613 00:38:28,800 --> 00:38:32,960 Speaker 1: used to mimic someone without their consent. She's saying, well, 614 00:38:33,000 --> 00:38:35,319 Speaker 1: I want to be in control of this. So I 615 00:38:35,360 --> 00:38:38,719 Speaker 1: am going to actually be the person creating this and 616 00:38:38,840 --> 00:38:42,160 Speaker 1: marketing it and benefiting from it. So instead of being 617 00:38:42,280 --> 00:38:48,719 Speaker 1: a victim, I have agency. I am in charge of it. 618 00:38:50,000 --> 00:38:53,160 Speaker 1: I should add, however, there are experts in ethics who 619 00:38:53,200 --> 00:38:57,719 Speaker 1: are really worried about this sort of thing, not Caryn specifically, 620 00:38:57,760 --> 00:39:04,280 Speaker 1: but generally speaking, a trend toward these humanized AI agents, 621 00:39:04,719 --> 00:39:09,920 Speaker 1: because they worry that interacting with them, especially interacting on 622 00:39:09,960 --> 00:39:14,360 Speaker 1: a more frequent basis, could start to reshape how people 623 00:39:14,440 --> 00:39:18,920 Speaker 1: interact with other actual human beings. And based upon what 624 00:39:18,960 --> 00:39:22,320 Speaker 1: we've seen with the web and how that has influenced 625 00:39:22,360 --> 00:39:25,400 Speaker 1: how people interact with one another, I think 626 00:39:26,560 --> 00:39:30,400 Speaker 1: that's a realistic concern. It's something we should actually really 627 00:39:30,719 --> 00:39:34,440 Speaker 1: think and talk about. So I feel conflicted about this. 628 00:39:34,480 --> 00:39:37,080 Speaker 1: On the one hand, I think that Caryn is absolutely 629 00:39:37,239 --> 00:39:41,720 Speaker 1: right to take charge of her own identity and public persona, 630 00:39:42,680 --> 00:39:45,000 Speaker 1: and on the other hand, I worry that people will 631 00:39:45,000 --> 00:39:47,360 Speaker 1: become too dependent upon this, that it will become a 632 00:39:47,480 --> 00:39:52,000 Speaker 1: kind of crutch that will possibly warp the way that 633 00:39:52,040 --> 00:39:54,840 Speaker 1: they interact with other people, and that, in the 634 00:39:54,880 --> 00:39:57,960 Speaker 1: long term, that can be harmful to themselves and the people 635 00:39:58,000 --> 00:40:03,600 Speaker 1: around them. So yeah, it's complicated, but it is interesting 636 00:40:03,600 --> 00:40:06,920 Speaker 1: to see someone take charge like that. And she doesn't 637 00:40:06,920 --> 00:40:08,759 Speaker 1: need me to say good luck to 638 00:40:08,800 --> 00:40:13,239 Speaker 1: her, because she's doing just fine based upon making more 639 00:40:13,320 --> 00:40:16,839 Speaker 1: than seventy grand in a week. She definitely doesn't need 640 00:40:17,320 --> 00:40:21,080 Speaker 1: any well wishes from me to see success. As 641 00:40:21,120 --> 00:40:24,040 Speaker 1: for the rest of you, I hope you are all well. 642 00:40:24,760 --> 00:40:27,359 Speaker 1: And that's it for the news for this week, so 643 00:40:27,440 --> 00:40:36,840 Speaker 1: I'll talk to you again really soon.
Tech Stuff is 644 00:40:36,880 --> 00:40:41,440 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, visit the 645 00:40:41,480 --> 00:40:45,120 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 646 00:40:45,160 --> 00:40:49,680 Speaker 1: favorite shows.