Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts. And how the tech are you? It's time for the tech news for the week ending September twentieth, twenty twenty-four.

The US Federal Trade Commission, or FTC, issued a release this week titled, well, actually it's a really long title. Essentially, this release says social media and streaming companies are essentially conducting surveillance on both their users and on non-users, while also maintaining inadequate data security and privacy controls. Which, I mean, if you're someone who has paid even just a little attention to these companies over the last several years, you might respond to this news with, oh yeah, yeah, yeah. Also, by the way, water is wet. At least that was my initial reaction, because to me, the revelation that these companies are, one, collecting massive amounts of data about people, and two, pretty crappy about keeping that stuff safe, that's not exactly news. But the FTC was conducting an investigation to determine exactly the scope of data collection, as well as what steps, if any, these companies are taking to, you know, protect private information. These are the sorts of things that you've got to do if, maybe further down the line, you decide to, you know, pressure companies into making changes, or pressure the government into passing laws that will require companies to make changes. Or perhaps by fining the ever-loving socks off these companies. A guy can dream.

Anyway, the release detailed how these companies collect information not just about their own users, I mean that is evident, but also on people who aren't users at all. And you might say, how, you know, how could that happen? How could someone who's not a user have their data collected by these companies? Well, let's give one simple example.
Let's say you've got Uncle Joey, and Uncle Joey's not on Facebook, but you do post about Uncle Joey a lot on Facebook. And, you know, Facebook now knows stuff about Uncle Joey. That's one way. Another is that companies might get access to people's information by dealing with data brokers. Data brokers are companies that buy and sell personal information across the Internet, and a data broker might have quite the dossier on you or on Uncle Joey. That information is gathered from multiple sources and organized into handy dandy packets for any company that really wants to know, you know, what kind of breakfast cereal you like, or what kind of car you drive, or which political party you support, or whether or not you like sports, that kind of thing.

The FTC found most of these companies lack sufficient data security and retention policies, meaning that some of these companies retain personal information indefinitely. Now, back in the corporate world, there are often rules that mandate that companies destroy older documents as time passes. I worked for a consulting company once upon a time, and they had strict rules on how long we could retain files on our customers, for example, and once that time was up, we were obligated to delete electronic files and shred hard copies. This was for the protection of both the customer and the consulting firm itself. But these social networking companies and streaming sites often don't have that kind of policy, and so the information they have on people can stay locked away and grow over time.

The report urges Congress to pass laws that will give citizens more data security, maybe even give citizens a little bit of agency when it comes to how their own personal data can be collected and used. Wouldn't that be nice? And further, that Congress should prioritize protecting younger Internet users in particular, who often end up being exploited due to massive loopholes in these systems.
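As an aside on that retention-policy idea: here is a minimal, purely illustrative sketch in Python of what an automated retention job can look like in practice. It is not from the FTC report; the customer_records folder name and the ninety-day window are assumptions made up for the example.

```python
from datetime import datetime, timedelta, timezone
from pathlib import Path

# Hypothetical values for illustration only.
RECORDS_DIR = Path("customer_records")
RETENTION_WINDOW = timedelta(days=90)

def purge_expired_records(records_dir: Path, window: timedelta) -> int:
    """Delete files whose last-modified time is older than the retention window."""
    cutoff = datetime.now(timezone.utc) - window
    deleted = 0
    for record in records_dir.glob("*"):
        if not record.is_file():
            continue
        modified = datetime.fromtimestamp(record.stat().st_mtime, tz=timezone.utc)
        if modified < cutoff:
            record.unlink()  # once the retention period is up, the file goes
            deleted += 1
    return deleted

if __name__ == "__main__":
    count = purge_expired_records(RECORDS_DIR, RETENTION_WINDOW)
    print(f"Deleted {count} expired record(s)")
```

The point is just that retention under a policy like that is mechanical: once the window passes, the record is gone, rather than accumulating indefinitely the way the FTC says many platforms let personal data pile up.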
We talked about this recently, about how companies like Google, through YouTube, were allegedly working with advertisers to target teenagers, even though that's expressly against the rules. Because what they could do is say, well, let's just target unknown accounts, where we don't have an age associated with the account holder, but the account has behaviors that are typically associated with, say, teenagers. Right? So you're not targeting teenagers. You're targeting people who behave like teenagers, and you don't know how old they are, so how could you be blamed for targeting teenagers? That's the allegation that's going around regarding YouTube. So loopholes like that provide huge opportunities for these companies to exploit people who otherwise are meant to be protected from that sort of thing.

So the FTC actually voted with unanimous support to issue this report. I think that's pretty refreshing for an agency to say, yeah, we're all aligned on this, we all think that this is important. You don't get that division you typically see, where out of sheer principle members will vote against a measure because unity is something to be avoided at all costs. Enough with that commentary.

One huge story that unfolded over this past week involved pagers, or what we used to call beepers when I was a kid, and I'm sure you've heard the disturbing and grisly details. So this week, pagers in Lebanon and Syria began to explode, and they killed at least a dozen people and injured thousands in the process. The following day, another series of explosions rocked the area, this time from walkie-talkie handsets blowing up. The operation is believed to be a semi-targeted attack on Hezbollah that was orchestrated by Israeli spies. Now, I say semi-targeted because it's pretty darn hard to guarantee a device like that is going to be on the actual target that you have in mind, and it's really impossible to guarantee that that target isn't also near innocent civilians when the device actually does explode.
In fact, reportedly some of the casualties have been children, which is obviously truly heartbreaking. It's a horrible thing. We do have some more details courtesy of The New York Times. The article, which is titled "How Israel Built a Modern-Day Trojan Horse: Exploding Pagers," gives more details as to what actually happened. So the pagers contained a small amount of explosives, and that explosive would activate upon receiving a signal. That signal was a message that was sent out at three thirty p.m. local time. The New York Times article reports that Israel had been planning this operation for a long time. Hezbollah, which obviously has been in a violent conflict with Israel for decades now, has more recently attempted to migrate members to lower-tech communications solutions. That's in an effort to remain undetected by Israeli agents. The argument was that smartphones and things like that were helping the Israeli military and spies target people in Hezbollah, so in order to avoid that, switch to lower-tech stuff that doesn't have the same capacity for being tracked. That was the concept. Israel, however, knew about this and apparently took the opportunity to build such devices with explosives directly incorporated into them, and then essentially fed these products to Hezbollah agents through third parties. So Israel just made sure that these other companies had these explosive devices, and then, you know, when Hezbollah agents were looking for these low-tech communications devices, the ones they got were the ones that were built by Israel. The attack sounded like something that would be in a far-fetched action movie, you know, something you might see in, like, one of the Kingsman films or something. And I cannot imagine how horrifying and terrifying it must have been to have been present when those explosions began to happen.
It's a really sobering, I mean beyond sobering, thing to think about, and kind of terrifying, really, just that this was a coordinated effort that was extremely successful in hurting a lot of people. Now, whether those were the quote unquote right people, I don't know. I'm generally against people hurting each other at all, so I'm not a big fan of Hezbollah carrying out acts of violence against people in Israel. That's horrible. Not really keen on the Israeli military and spies carrying out acts of violence on people in Lebanon and Syria. That's horrible. It's all terrible. That's really it, when I get down to it.

Over in Europe, Reuters reports that major tech companies are trying to convince EU leaders to not go so heavy-handed with regard to upcoming laws and regulations relating to artificial intelligence. Now, you might recall that OpenAI's Sam Altman started meeting with various political leaders, like, a couple of years ago now. Those were early discussions about proposed AI regulations, and Altman reportedly was calling for rules. He was saying, yes, we should have regulations for AI. However, critics accused him of trying to lay a foundation that would really benefit OpenAI while simultaneously suppressing potential competition from smaller companies, and all the while the actual rules that were created would do very little to protect the citizens of the EU. Even if the tech companies fail to persuade legislators to use a lighter touch when it comes to creating AI policy, they don't have too much to be worried about, at least in the short term, because according to Reuters, the rules that are being discussed now, once they are adopted, aren't actually legally binding. They would be, as Captain Barbossa might say, more like guidelines. Also, a happy belated Talk Like a Pirate Day. I'm a day late on that one.
Anyway, if a company is found to ignore or violate the guidelines, particularly multiple times, then it could potentially face a more serious challenge from regulators in the future, which ultimately could end up with, like, fines and that kind of stuff. The issues at play here include everything from how companies will gather data to train their AI, like data scraping. In other words, there are concerns about copyright, like, can AI companies indiscriminately train models on information that otherwise has protection from such use? Transparency is a really big part of it, because right now AI largely follows the black box model of transparency, meaning there is no transparency. Stuff goes in, other stuff comes out. We're not allowed to see what process happens in between those two things, that kind of thing. Reuters reveals that many major companies, including Google and OpenAI, are lobbying to be included in working groups that are meant to shape policies, which again gets back to that issue I talked about a moment ago. When the folks who are meant to be regulated are allowed to shape the regulations, what do you think is going to happen? Who do you think that's going to ultimately benefit? Here's a hint: it's not you or me. Okay, we've got more tech news to get through, but first let's take a quick break.

The United Nations has issued a report that recommends the organization of a committee that would be in charge of governing and monitoring the development and deployment of artificial intelligence around the world. The report suggests that this committee would have a similar structure and status to what the Intergovernmental Panel on Climate Change possesses. The recommendations aren't all about putting AI in the spotlight as a potentially dangerous, perhaps catastrophically dangerous, technology. It's also to kind of decide how to leverage AI in ways that provide the most benefit while creating the least potential for harm.
How to ensure that poorer countries are able to realize the benefits from AI while also having a seat at the table when it comes to governance, and thus not be left behind; not to create an AI gap, in other words. As we have seen, artificial intelligence is really complicated, right? It's not just good or bad. There's a lot of complex stuff going on here, and it's far too easy to fall into simplifying things beyond reason. I've done that personally. I mean, just in an effort to try and communicate stuff, I have oversimplified the problem. I recommend Will Knight's article over on Wired. It's titled "The United Nations Wants to Treat AI With the Same Urgency as Climate Change." Knight does a great job explaining not only the recent report that the United Nations issued, but also the existing regulations that are already in place that could potentially shape this intergovernmental body as it takes form. So check that out.

Emily Birnbaum and Oma Seddiq of Bloomberg have a piece that's titled "Microsoft Executive Warns of Election Meddling in Final 48 Hours." The referenced executive is Brad Smith, who told the Senate Intelligence Committee here in the United States that foreign-backed campaigns aimed at interfering with US elections are likely going to peak two days before the election itself. There are examples from other countries in which deepfakes of various candidates made the rounds just before their election day. We've already seen some examples of AI-generated material relating to the election here in the United States, though for the most part it hasn't actually made a huge impact, although one example did prompt Taylor Swift to make a statement about voting and also the candidate she personally supports, partly because an AI-generated image of her appearing to support Donald Trump got some buzz, especially after Trump himself boosted that signal. And Swift's response has inspired a few hundred thousand citizens here in the US to register to vote.
So that's actually a really good thing, you know. Participation in democracy is absolutely vital for democracy's success. But yeah, we're likely to see a lot more of that leading up to election day, according to Smith, and that sounds like it tracks to me. We've already seen multiple stories about how countries like Iran and Russia have attempted to shape the election here, to varying degrees of success, especially this past year, so for that to escalate seems like a pretty safe bet. Unsafe consequences, but a safe bet that it's going to happen.

Simon Sharwood of The Register has an article titled "LinkedIn started harvesting people's posts for training AI without asking for opt-in." And again, I think that probably doesn't come as a surprise to most people. I mean, lots of platforms have done this, where they started to scrape their own platforms for user information to train their AI models without notifying users, let alone asking them for their consent for this to happen. But while it might not be surprising, it's still very upsetting to a lot of people who likely did not anticipate that their online resume and their various posts promoting their work or their companies would ultimately serve as training fodder for AI models.

Ashley Belanger of Ars Technica has a related piece to this. It is titled "How to stop LinkedIn from training AI on your data." Now, Belanger points out right away that there's no way to opt out of any training that has already happened. You cannot say, oh, you know what, remove all my stuff from your training model's brain. That's a no-go. That cat is out of the AI bag already, so there's no way to protect your data that has already been used to train LinkedIn's AI models. However, there are some somewhat limited ways to opt out of future training. LinkedIn will issue an updated user agreement related to this, and at that point users will be able to opt out of future training sessions.
To do that, you will need to go to your data privacy settings and look for the bit that's related to data collection for, quote unquote, generative AI improvement, and make sure that that option is turned off. Now, if you are in Switzerland, or if you're in the EU, where the law by default requires LinkedIn to secure your consent before opting you into this data collection program in the first place, you don't have to worry about this. You will be able to actually respond, no, don't do that, when you are prompted. The rest of the world, we don't get that treatment. But hey, I think that's a great example to bring up when you're pushing the US Congress to adopt stronger data privacy laws, don't you think?

In California, Governor Gavin Newsom signed bills into law that are designed to protect actors from predatory practices that involve, you guessed it, AI. So essentially, these laws create rules that media companies are going to have to follow if they are to create a digital replica of an actor or a performer. One of the two laws says that companies must have, quote, "contracts to specify the use of AI-generated digital replicas of a performer's voice or likeness, and the performer must be professionally represented in negotiating the contract," end quote. So no just sneaking that in. It has to be more transparent than that. Now, the other law requires media companies to first acquire express permission from the estates of deceased actors before those companies are allowed to create digital replicas of the dearly departed actor in question. So, in other words, you can't just go and create a digital replica of Clark Gable to be in your movie without first acquiring legal consent from Gable's estate. And this relates closely to issues that SAG-AFTRA brought up while they were negotiating new contract agreements with the movie studios. That's what ultimately led to the union strikes not that long ago.
Have you ever been on a social platform like X, formerly known as Twitter, and you posted something that you thought was amazingly insightful, or really funny, or just incredibly relevant to the world and what's going on or whatever, only you got absolutely no engagement whatsoever? You know, no likes or responses, nothing. That's a real bummer, right? I mean, here you are spitting gold out into the universe and you're getting nothing back. Wouldn't it be nice to receive lots of really positive responses? Maybe people are riffing on what you said and pointing out other things relating to what you were saying, and creating a real conversation around it. Well, maybe SocialAI is for you then. Though I don't think so. You know, SocialAI is only for you if you do not mind that all the responses you get are generated by AI bots, because that's what SocialAI is. It's an app that, I argue, mimics a social platform, because everyone else on your instance of this app is a bot. Like, you're the only human left on Earth and everything else is run by AI. You can select the types of followers that you'll get responding to your posts. You know, maybe you want people who are funny. Maybe you're looking for nerds to respond to your stuff. Maybe you're looking for insightful observations drawn from what you're posting. I don't know how much mileage you'll actually get out of any of those, because I've been reading some of the responses that various reporters have posted after they tried out SocialAI. They created some posts and then they posted the responses they were getting, and all those responses strike me as hollow and lacking any substance whatsoever.
Like, to me, it reminds me of being in school, and you're in English class and you're supposed to give a book report, and one by one students are going up to give their book reports, and it's very clear who read the book versus someone who only read the back book cover and is just trying to stretch that out long enough to make it seem like they read the whole thing. The responses I read on SocialAI made me think of the students who just read the back book cover and didn't actually do the work. The developer behind SocialAI is a guy named Michael Sayman, and I don't know why he made SocialAI. Honestly, I don't know if this was kind of his way to lampoon how meaningless a lot of online engagement ends up being. Like, if you've ever opened up threads on Threads or on X or whatever, if you just read, like, the comments section under posts, often you come across a lot of stuff where "surface level" is being too kind. There's no depth whatsoever. I don't know if Sayman was poking fun at Elon Musk for buying Twitter and finding out what it's like to have an entire platform filled with bots. You know, Musk had said before he bought Twitter that Twitter was infested with bots and he was going to clean it up, and I think a lot of people now argue that X has more bots, or a higher concentration of bots, than Twitter ever had before Musk took over. Whether that's true or not, I don't know, but that's certainly the perception. Or I don't know if Sayman was doing this in a sincere effort to provide comfort to lonely people who otherwise get very little interaction from others. Right? Like, if it's someone who just feels like they're saying things and no one's hearing them, then that can lead to a pretty despondent day-to-day existence. So having something that makes you feel heard and seen and validated, that could have real value to some people. And maybe that's why Sayman did this.
It's hard to say, because he has given various statements that support each of those motivations, and, you know, maybe he's motivated by a mixture of things, or maybe the motivation has actually evolved over time, where maybe it started either as a joke or as a sincere effort to help people and then slowly evolved into the other. I don't know. But whatever the motivation is, I can't say that I'm impressed with the quality of the engagement you get when you post stuff to SocialAI.

It does actually make me think of Threads a lot, because whenever I do log into Threads, I get the distinct feeling that a lot of the accounts that are being promoted to me are actually being driven by AI bots. I mean, maybe the account is held by a real person, but it's an AI bot that's actually posting the stuff, and it's all in an effort to drive engagement. That's the feeling I get, because there are just too many accounts that are all asking essentially the same questions, right? And they tend to be questions that prompt a quick response, especially if you're linking it to a specific region, like, hi, Austin, Texas, what are some great restaurants to look at? Something like that, right? That's going to get a lot of responses, at least in Austin and probably the surrounding areas, and it drives a lot of engagement, and engagement, in turn, is like currency for influencers. So I get the feeling that a lot of the posts on Threads aren't being made by actual people. They're made by bots that are just trying to get as much engagement as possible, and a lot of that stuff ends up being, you know, pretty simple tricks, to the point where I very rarely respond to anything on Threads unless I actually know the person who's posting and I feel like, oh, that really came from that person, not, this is something that some bot spat out in an effort to make number go up. However, it's possible I'm just getting paranoid.
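As an aside, to make the SocialAI concept a bit more concrete: below is a rough, hypothetical sketch of how an app of that kind could generate persona-flavored bot replies to a post. This is not Sayman's actual implementation; it assumes the OpenAI Python SDK's chat-completions interface, and the persona prompts, the gpt-4o-mini model choice, and the handles are invented for illustration.

```python
from openai import OpenAI  # assumes the OpenAI Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Made-up persona prompts, mirroring the "pick your followers" idea.
PERSONAS = {
    "fan": "You are an enthusiastic, supportive follower. Reply in one short sentence.",
    "nerd": "You are a detail-obsessed follower who adds one technical aside. One sentence.",
    "comedian": "You are a follower who replies with a light joke. One sentence.",
}

def fake_replies(post: str, followers: list[str]) -> list[str]:
    """Generate one bot reply per selected follower persona."""
    replies = []
    for name in followers:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model choice
            messages=[
                {"role": "system", "content": PERSONAS[name]},
                {"role": "user", "content": post},
            ],
        )
        replies.append(f"@{name}: {response.choices[0].message.content}")
    return replies

if __name__ == "__main__":
    for reply in fake_replies("Just shipped my first app!", ["fan", "nerd", "comedian"]):
        print(reply)
```

Each reply in a setup like this is a single, context-free completion, which is consistent with why the responses reporters shared read as hollow, and it's the same dynamic that makes engagement-chasing bots on a real platform hard to tell apart from people.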
Maybe I'm just paranoid and I believe everybody's a bot, and I'm just positioning myself to audition for the next version of The Thing, except it'll be robots, not, you know, aliens. Maybe that's the case. I'm going to take another quick break. When I come back, I've got a little bit more tech news to share with y'all.

Okay, we're mostly done with AI at this point. That was an awful lot of news about artificial intelligence. But to be honest, that was what was really dominating a lot of the tech conversation this week. I mean that, and then obviously the Israeli attack against Hezbollah using pagers and walkie-talkies as explosives. Those were, like, the two big things that were in discussion. But let's talk about X slash Twitter for just a couple of seconds. So this first one's an update. You might recall that a Brazilian Supreme Court judge ruled that the internet service providers in Brazil were to shut down access to X. This was after Musk refused to play ball regarding the removal or censorship of certain accounts on the platform in Brazil. If you're not familiar with the story, the Supreme Court justice was telling Musk, hey, there's this group of accounts that are spouting off misinformation and hate speech in Brazil, and those posts are causing harm either directly or indirectly, and therefore we want you to shut down those accounts. And Musk's response was, no, we believe in free speech. Which, Musk believes in free speech when it suits him. If it's free speech that is critical of Musk, he's not as much in favor of that, honestly. But anyway, he said, we're not going to do that. We're not in the business of shutting down accounts just because they say things you don't like. And the judge said, all right, well, then what we're gonna do is we're gonna shut down access to X in Brazil. And they did.
But this week, on Wednesday, X briefly returned to service in Brazil, not because the Supreme Court in Brazil allowed it to, but rather because they were able to circumvent the issue by using, like, third-party cloud services to restore service in the country. The Supreme Court judge did not just let that go unnoticed. The justice said that if X remains active in Brazil despite the ruling against it, then Brazil will levy a nine hundred thousand dollar a day fine for every day that X remains accessible within the country. As you might imagine, X is now once again down in Brazil. So the battle between Brazilian judges and Elon Musk continues.

Now again, y'all, if you've been listening to the show for a while, you know I am not a big fan of Elon Musk. I do not care for him at all. However, I also don't think that censorship is a reasonable approach either. I think content moderation is important, and I think that's something that X has really, really, really fallen short on. Twitter was never good about content moderation, but X said hold my beer and has taken that to the nth degree. So I do think that X requires better content moderation, which supposedly is something that's actually in the works. But I don't know that censorship is the right answer from, like, a governmental source. So I think this is a story where I don't agree with any party that's involved in it. I mean, I don't value X as a service anymore. I personally do not, like, I don't use it anymore. I deactivated my account ages ago. But at the same time, I recognize that for a lot of people it serves a really useful purpose, and I don't want to see that just get tossed aside. So this is a complicated situation where I don't think anyone is really in the right, and the people who are suffering are the ones who are being driven to stuff like Bluesky and Threads. Not that Bluesky is terrible, but Threads is pretty bad.
I've been using Threads for a bit, and I'm like, why did I bother doing this? I think for about a month I've been using Threads again, and I question each time, like, am I going to come back and use it again today? Or is this it? Am I done? Anyway, enough of that, let's move on.

So here in the United States, X has also officially relocated its headquarters. It originally was headquartered in San Francisco, California. Now it has moved to Bastrop, Texas. I'm sure I've mispronounced that name, but that's because Texans decide to pronounce things in their own peculiar way. Like, in Austin there's a street that's spelled Guadalupe, but if you say Guadalupe, you'll be laughed out of the city, because it's Guadaloop. I can't really criticize that. Here in Atlanta, we have a street that's called Ponce de Leon, but it's pronounced Pon-see de Lee-on. Like, you've got to say Pon-see de Lee-on or people won't know what you're talking about. So anyway, that's beside the point.

The move from California to Texas had a lot of political motivation behind it. Namely, Elon Musk has clashed multiple times with California's political leadership over lots of different topics, including, recently, the state of California passing laws meant to protect transgender students from being outed to their parents without their consent, and Musk apparently can't abide that kind of thing. He just is like, no, that's not cool, I want people's lives to be put into danger, I suppose. If you'd like to learn more about this relocation from California to Texas, I recommend Andrea Guzman's article in the online paper Chron, that's c-h-r-o-n. That article is titled "Elon Musk officially moves X headquarters from California to Texas." That'll be helpful. It also explains that the folks in California don't know yet if they're going to be required to relocate to Texas, or if other offices in areas around San Francisco, such as, like, San Jose, will remain open. They don't know yet.
And a lot of Elon Musk's businesses are now headquartered in Texas, where he finds a lot more political parity with the folks in charge in that state.

So how much power does AI need, as in electricity? We're not totally done with AI after all, it seems. Well, apparently AI needs enough electricity to require a nuclear power plant that had been shut down to come back online, because Microsoft has signed a deal with Constellation Energy, and that deal will require the restart of Unit 1 of the infamous Three Mile Island nuclear power plant in Pennsylvania. Now, way back in the nineteen seventies here in the United States, Three Mile Island became the focus of national news when Unit 2 had a partial nuclear meltdown that also included the release of some radioactive gases and radioactive iodine into the surrounding area. That unit is not going to be reactivated, for obvious reasons. So Unit 2 is not, like, going to go from partial meltdown to back in action. Unit 1 will be restarted. Now, also to be clear, Three Mile Island did not totally shut down after this partial meltdown. It did come back online in a reduced capacity and remained in operation until twenty nineteen, when it shut down. This deal is going to have it come back online by around twenty twenty-eight, and this is in order to supply Microsoft with that sweet, sweet lightning juice that's needed to power all of Microsoft's various robots, or its AI efforts, in other words.

Now, this really reminds me of the fact that we're always going to need access to more energy, and we'll use all the available sources that we have. Because one of the big talking points about fusion is that if we can get fusion to work, then we would meet the world's energy needs pretty handily. In fact, we would meet them many times over, the idea being that we would have plentiful energy, prices would drop, and we wouldn't have to worry about deficits at all.
But I think AI kind of proves that we always will come up with more ways to require more energy. The word "enough" just doesn't come into it. There will never be enough, because we'll just find new ways to require more, which is a sobering thought, and one that we need to remind ourselves of when we get carried away with news that relates to things like fusion. For example, I get really excited when I read about advances in fusion, because it's a super interesting technology. It has the capacity to provide a lot of energy with very few downsides, apart from the fact that it's really hard to get it to work right and we haven't done it yet. But, like, if we do get it to work right, it could be phenomenal. But then stories like this remind me of, yeah, but we are going to have stuff like cryptocurrency miners and AI language models and all this kind of stuff that is just incredibly hungry for electricity. So we will find ways to consume all that excess that we have produced. Fun world.

Finally, Disney is taking out the Slack, and by that I mean Disney plans to transition off of the Slack collaboration tool after hackers managed to access more than a terabyte of corporate information from inside the Mouse House and then leaked that data, or at least some of it, online. The leaked information included stuff about unreleased projects from Disney Entertainment, and it included more than forty-four million messages between Disney employees, or staff, or, I don't know, cast members, whatever. I haven't seen details as to how precisely the hackers got access to Disney's Slack channels. I would gently remind leaders that whatever platform you choose for project management or collaboration, whatever it might be, it is important that that program is secure. But even the most secure system is not going to protect you unless your employees learn and follow good security etiquette.
Otherwise, you can swap out tools until the cows come home, and hackers will still find someone to exploit in order to get at the goods. So I'm not saying that Slack is perfect. I don't use Slack personally. I just think that throwing Slack under the bus is probably shortsighted, unless, of course, the hackers did exploit an actual vulnerability in Slack, which is entirely possible. The information I came across was not clear about that one way or the other. If the hackers were able to exploit a vulnerability in Slack itself, that spells trouble for all of Slack's users, at Disney or otherwise. So that's a much, much, much bigger issue. If, however, this was a case where they were able to get access to Disney's Slack channels not because Slack itself had a vulnerability, but because someone within Disney accidentally handed over the keys to the kingdom, which, by the way, Keys to the Kingdom is a great behind-the-scenes tour at Walt Disney World, that's not really what I was talking about, but yeah, if that's the case, it doesn't matter what platform you're on. I mean, people are still prone to getting tricked by things like social engineering. So just a reminder for everyone out there that getting the so-called best tool doesn't always guarantee success.

Okay, that's it for the news for this week. I hope all of you are well, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.