Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for May second, twenty twenty-three.

Speaker 1: First up, you may have seen some headlines like "Godfather of AI quits Google to warn us about artificial intelligence," or something along those lines. The godfather in question is not Don Corleone. It is Dr. Geoffrey Hinton. He's been a researcher in AI in general and deep learning in particular for decades, and he says that he quit his job at Google so that he could speak out about his concerns centering around artificial intelligence without having those concerns impact Google directly. He says that, generally speaking, Google has behaved in a responsible way with regard to AI research. Now, I'm going to do a full episode about Dr. Hinton and his concerns next week, so be on the lookout for that episode. Rather than dive into all those details here, I'll say that when someone who is deeply entrenched in AI research comes forward to talk about the potential dangers of the technology, it's probably a good idea for the rest of us to listen. Personally, I've been concerned, but not quite at the point of being worried, about AI for a while now. But I'm also worried about the fact that people tend to overgeneralize when they talk about AI, or they use one aspect of AI to represent the entire field. Maybe, however, it's time for me to crank the threat level up a notch or two in my own brain. We'll talk more about that next week, but we do have a few other AI stories today. We're going to get to those in a second, but spoiler alert, several of them fall into the bad news category.
Speaker 1: Before we jump into all that, however, I wanted to give an update on the legal crusade some film studios are on in order to defeat evil content pirates, who clearly are the most substantial threat to the entertainment industry's quest to make all the money in the world. Little editorial note: I might be being a little bit facetious here. The film studios have sued an Internet service provider formerly called RCN, now known as Astound Broadband, saying that this ISP did nothing to stop its customers from downloading more than thirty different films illegally. As part of the studios' quest for justice, they applied for a subpoena against Reddit. Why? Well, because the studios had identified a few user handles that had posted in various piracy-, intellectual property-, and ISP-related subreddits, and the studios wanted names, gosh darn it. They wanted to know who those users were, and they were ready to force Reddit to hand over any data it had about those users. Reddit told the studios to pound sand, saying these people you've identified, most of them aren't even talking about the topic that's relevant to your case, and by any standard, we will not hand over that information. And now a US District Court judge has sided with Reddit, saying that the studios cannot force Reddit to violate the First Amendment rights of those users, that the barrier to doing that has not been met. And again, the judge said the identity of those folks really has nothing to do with the studios' actual legal case. As for the case against the former RCN, well, that's probably going to be a bit of a tricky path as well. As a platform, RCN isn't actually responsible for what its users do, though those protections only extend so far. If a platform is told about illegal activity, then it's supposed to take reasonable action to stop that activity or else potentially lose legal protection in the process. But it's a tricky thing to argue in court, so there's no guarantee that the studios will get the justice they so want.
Speaker 1: Now, I decided to include this story to illustrate how these protections can be important, that they can shield platforms and users, platforms like Reddit or internet service providers, from becoming a big old target for a heavy-hitting industry like the film industry. And while I love film and I think film commerce is a valid business that has real challenges facing it, I also recognize that media companies in general have historically used disproportionate responses to perceived threats to their bottom line, even when the threat ends up being unquantifiable.

Speaker 1: Next up, the Writers Guild of America, or WGA, is now on strike. The WGA represents writers who work for film and television here in the US, and the strike means that members of the union cannot do any work relating to that. They can take no meetings, they can't pitch story ideas, they aren't allowed to communicate with collaborators about a project. As long as the strike is going on, all work relating to writing for TV and film in the US has to stop. The WGA is in negotiations with, hey, those film studios that we were talking about a second ago, and at stake are several elements that relate to tech, which is why I'm covering this story on TechStuff. For one thing, streaming has really changed the economics of media. In the old days, before streaming platforms were really a thing, writers would receive compensation not only in the form of a fee for their work, but they would also receive something called residuals, which are kind of like royalties. So let's say you write an episode of a television show. We're just going to make one up. We'll call it Cyborg John and His Amazing Tech Stuff, and you get paid for your work. You write a teleplay, it's used, you're paid. Then about a year later, a television channel airs the episode a second time. Well, the station is making money by selling advertising against an episode you wrote, so the channel is profiting off of your work.
Speaker 1: Thus it stands to reason that you should get a share of that money, so you get some in the form of residuals. If Cyborg John the TV series hits that magic number of episodes needed to go into syndication, and that number varies depending upon the nature of the show, well, that's even better news, because it means your episode could be shown again and again, and every time it is, you get a little payout. This becomes part of your long-term income. It kind of becomes passive income at that point. But streaming really changed things. Streaming doesn't always depend upon advertising, for one thing, so residuals become a lot trickier to figure out. How do you determine the value of a view if there's no advertising served directly against that view? It really does become tricky. This means that writers are seeing a hit to their long-term compensation as people turn more and more to streaming. Plus, a lot of shows have really slimmed down their pool of writers. They used to have writers' rooms filled with lots of writers, and now you have these sort of micro writers' rooms where a couple of writers have to share the whole workload for a full season of shows. So workload is going up and compensation is not. On top of that, there's a fear that shows will start to lean on AI tools to take over part of the writing process, which will push more creatives out of the industry. So the strike is really about the WGA trying to force Hollywood to adjust to the current environment, because the old economic model just doesn't work due to the changes in how the business itself works. How long this strike will go on remains to be seen. Already it's affecting stuff like late-night talk shows, which can't go on without writers, and we're likely to see more effects the longer the strike goes on. Like, possibly next year some shows will have truncated seasons; they won't have as many episodes because the writers went on strike.
Speaker 1: Also, there are two other major unions in Hollywood: the Screen Actors Guild-American Federation of Television and Radio Artists, aka SAG-AFTRA, and then there's also the Directors Guild of America, or DGA. Both of those unions have their own contracts with the film studios expiring on June thirtieth, and they probably also want to renegotiate, especially with things like streaming being part of the mix. And for the Screen Actors Guild, AI and deepfakes are definitely something that I'm sure a lot of people are concerned about. So I feel like the Writers Guild of America is kind of forging a path that these other two unions are likely to follow, at least in large part. And potentially this could mean we could see all three unions, you know, strike. That's like the worst-case scenario for consumers, because that would mean that pretty much all activity in Hollywood shuts down, period, TV and movies, so it could be a real dry spell until things get hashed out. On the other hand, maybe the writers end up making some progress with the Hollywood studios that the DGA and SAG-AFTRA can then build on. We'll just have to wait and see.

Speaker 1: Now, that bit about AI taking an important part in the creative industry might seem a little premature, but then you just need to look at Bloomberg News and its interview with IBM's CEO Arvind Krishna, and that will change your mind toot sweet. So in that interview, Krishna revealed that IBM plans to put its hiring plan for nearly eight thousand jobs on hold to see if maybe, possibly, AI could do those jobs instead. I mean, why hire real human beings if the robots can do all the work? I mean, robots don't get sick, they don't go on maternity leave, they don't ask for a raise, robots don't organize into a union and threaten to stop working if their demands for stuff like health care and benefits aren't met.
Speaker 1: Krishna was talking about jobs that do not face customers, so these would be behind-the-scenes, in-house roles, stuff like HR jobs. So maybe that HR robot will actually listen when someone brings up a concern, or maybe that robot will just be really, really efficient at minimizing the potential impact to the business while trying to make the problem go away. Not resolve, go away. I am salty today. Must be because yesterday was May Day. Anyway, Krishna's comments are not likely to make anyone feel better about the fear of AI replacing them, of taking our jobs, because that's exactly what IBM appears to be exploring. And y'all, I am not a doomsayer. I don't believe AI is going to replace us all. For one thing, it's just not practical. If AI did replace us, if AI got all the jobs, well, no one would be able to make a living. And if no one's making a living, then they can't afford to buy stuff. And if you can't afford to buy stuff, these companies can't make any money. They just go out of business, because who are their customers? Even businesses that have other businesses as customers, those secondary businesses, they're going to go out of business, right, because there's no one buying anything. It literally becomes a self-defeating strategy in the long run. So it ultimately doesn't matter how much cost you eliminate from your operations if you're also eliminating the ability for people to buy your product or service. Hey, maybe I'm giving companies like IBM too much credit for thinking ahead and considering consequences. Okay, we got a lot more news stories to go today. We're gonna take a quick break.

Speaker 1: Okay, we're back, and we've got more AI stories. So Samsung is now telling employees to avoid using AI. Specifically, Samsung wants its employees to avoid generative AI tools, stuff like ChatGPT, which employees had been using to do stuff like, you know, alter code, create new code, that kind of thing, to help with basic steps along those lines.
Speaker 1: So why is Samsung saying, hey, don't do that? Well, last month Samsung had a couple of embarrassing incidents in which employees had shared company IP on ChatGPT and that information subsequently leaked to the general public. So folks were working on stuff like Samsung's own proprietary code and also sharing stuff like meeting notes in an effort to use generative AI to create stuff like reports and presentations, but then that sensitive information got out, and that's not great for Samsung. If I were to create an analogy, I would say, as a Southern fella brought up in the state of Georgia, this would be like if someone from Coca-Cola had shared Coke's formula with ChatGPT in an effort to brainstorm new Coca-Cola ideas, and then the formula leaked to the public. That would be a disaster. That's kind of what happened to Samsung. So now Samsung is saying that these AI tools aren't secure, and to be fair, they are not, and that employees shouldn't be using them to do work. Now, I think this makes a lot of sense, because until companies have their own sequestered generative AI tools, not necessarily created by the company itself, but ones that are only being used by that company, with no connections to the outside world and no way to send information outside of the organization, well, until that day happens, you have to treat these AI tools as a security flaw. Really, generative AI is highlighting a lot of issues in the way that companies currently do business. We've seen the cloud computing industry grow and evolve and adopt tight security controls in the process as a necessity so that they can continue to do business. If your cloud computing company can't show that your operations are secure on their network, well, they're not going to have any customers. So generative AI needs to follow that same pathway.

Speaker 1: Moving on to Twitter, because heaven forbid we get through a news episode without having to talk about it.
Speaker 1: There are reports that brands are still pretty iffy about posting on Twitter or using it as an advertising platform, because the transition of the blue check mark from being a verification mark into being a paid subscription feature has led to a rise in impostors posing as all sorts of things, including official brands. And with Twitter's moderation team absolutely decimated by rounds of layoffs, and we'll come back to that in a bit too, it is harder and harder for Twitter to respond to violations like that in a timely fashion. Plus, as a fun little bonus news item regarding Twitter blue check marks, some formerly verified Twitter users found out that by changing their bios they could get their check mark back, at least temporarily. Now, originally the word was that if you put "former blue check" in your bio, the actual words "former blue check" as part of your bio, your little check mark would return. And sure enough, a lot of formerly verified users tried this out and it seemed to work. But then later on some folks said this would happen if you just updated your bio in any way, so you didn't have to put those specific words in for it to happen. It would just happen if you made an edit to your bio, and also this change was temporary. I did not try it out myself, but my buddy Tom Merritt tested the "former blue check" method, and sure enough he got his check mark back before he deleted the phrase from his bio and went back to being an unchecked schlub like the rest of us. It really shows that Twitter's systems are held together tenuously at this point, and that again is not a surprise, because there are hardly enough folks at the company to keep things running, let alone implement new features. Yesterday, Twitter appeared to have some technical issues, as many users reported that the service had logged them out of the desktop website and then wouldn't let them log in again.
Speaker 1: So, I know a lot of folks these days access everything via apps, but I'm an old person, so I still go to websites and stuff, including for the rare times when I pop on Twitter. I typically use the desktop Twitter website. Sometimes I use TweetDeck, but either way, I'm using a desktop version. I'm almost never on my phone looking at this, and when I am, I'm typically looking for the TechStuff feed. So for folks like me, getting logged out and then being prevented from logging back in is kind of akin to being exiled from Twitter, which, come to think of it, doesn't sound like that bad of a fate these days. The Verge reported on this issue and pointed out, somewhat snarkily, that there was no telling how long it would take for the problem to get fixed, because we really don't know how many folks are even working on those kinds of things at Twitter these days, and any inquiry to Twitter's defunct PR department comes back with the infamous poop emoji. The Verge did hypothesize that maybe this had something to do with Twitter trying to fix that problem of formerly verified people getting their check marks back, but who knows. I will say that today, when I was checking on Twitter, I had no issues. I was not logged out, I was fine. So whatever it was, I assume it has been fixed, or at the very least hasn't affected me.

Speaker 1: Now, I mentioned that Twitter's moderation team has been nearly eliminated as part of the massive cuts Musk has made to the company's staff. That led to a real doozy of a problem. Over the weekend, some Twitter users were posting the Super Mario Brothers movie to Twitter, the new one, the animated film, not the Bob Hoskins film, although someone might have done that too, I don't know, but I'm specifically talking about the movie that's out in theaters. They posted the whole darn movie start to finish, and these tweets stayed up for hours. They gained millions of views.
Speaker 1: I think the primary one got more than nine million views before it got taken down, and then Twitter finally clamped down on them and even started to suspend accounts as a result of this. Of course, once it goes up somewhere, it can pop up somewhere else. Someone else can do the exact same thing. And I mean, it is understandable that Twitter has now suspended accounts that have been found to do this. But if Twitter failed to act, then those movie studios we've been talking about, they sure would not go easy on Twitter. If Twitter is shown to not respond to stuff like DMCA violations, it can be held responsible for hosting illegal material. Again, platforms enjoy a lot of legal protection, but only if they can show that they're taking their own steps to curtail illegal activity. If they are aware of illegal activity and they're not doing anything to stop it, then sometimes those protections can go away. As for how this happened in the first place, well, part of it is because Musk loosened the restrictions on how long a video posted to Twitter is allowed to be, but a lot of it also has to do with the fact that the company has virtually no one to watch out for stuff like this anymore. So, yeah, Twitter's idiosyncratic woes continue.

Speaker 1: Some environmental activist groups have filed lawsuits against the US Federal Aviation Administration, or FAA. This is with regard to SpaceX's test flight of its Starship vehicle. You might remember that Starship launched off the ground successfully last month, but subsequently malfunctioned when the first and second stages failed to separate. This prompted SpaceX to hit the self-destruct button. But during the launch, the force from those massive engines, the thirty-three engines of Starship's first stage, caused a lot of damage.
Speaker 1: The launch created craters in the launch pad and hurled debris, including large, heavy debris, all around the area, and that area happens to include some really sensitive wildlife habitats, something that SpaceX has had to contend with since setting up operations in Texas. The environmental groups are accusing the FAA of failing to put SpaceX through proper environmental reviews before signing off on the test flights of the Starship. As part of this lawsuit, these environmental groups are seeking to force the FAA to revoke SpaceX's license until a more thorough environmental review can happen, in order to prevent further destruction to these habitats. That area is home to several endangered species, some of them critically endangered, and I do think a thorough review is a reasonable request, particularly in the wake of the damage Starship's engines did to the launch area. No one really anticipated it being that bad. Not that it was, you know, widespread destruction, but it was enough to cause some concern, and I think a review is probably warranted. It may turn out that everything's fine and still within the parameters of the agreement, and that's fine too. SpaceX should be able to continue then. But without a review, we just don't really know. Okay, I've got a few more stories to cover. Before we get into any of that, let's take one more quick break.

Speaker 1: We're back. An inquisitive bitcoin enthusiast has used their knowledge of blockchain to identify nearly one thousand digital wallets held by various Russian governmental agencies, including ones like the Foreign Military Intelligence Agency and the Foreign Intelligence Service. The anonymous detective accuses the owners of these digital wallets of using the accounts to finance stuff like hacker groups. So it's an open secret, the worst-kept secret in tech really, that Russia relies on hackers to conduct espionage, sabotage, and infiltration campaigns on various targets. It's entirely possible that official government agencies are using these digital wallets to fund those efforts.
Speaker 1: A lot of hacker groups lean heavily on digital currency because there's less government regulation and oversight. It's easier to avoid imperial entanglements, as Obi-Wan would say. Further, this secret sleuth claims to have seized control of at least some of these wallets, a claim they have backed up by then burning through money, like literally destroying digital currency. So that suggests this is not an empty boast, because according to Reuters, they destroyed around three hundred thousand dollars' worth of bitcoin while making their claims. And that's putting someone else's money where your mouth is, all right? I mean, it's like you mean business if you're just burning three hundred grand in order to get attention. According to Yahoo Finance and the firm Chainalysis, the vigilante has even taken Russian money held in these digital wallets and then funneled that money to Ukrainian aid groups. Ultimately, this does not speak well of Russia's security practices if they've managed to lose control of all these digital assets.

Speaker 1: Speaking of Russia, by the end of June it's going to be just a bit harder to find love in that country, or maybe not love, maybe, you know, casual flings. You see, the Match Group has announced that it's going to be shutting down operations in Russia on June thirtieth. The Match Group owns, among other things, Tinder, so Russians will soon find themselves unswipeable, at least on Tinder, and at least while they're in Russia. Why is the Match Group pulling out of Russia? Well, basically, it's because Putin's government does terrible things. So the Match Group said that the importance of human rights was the main reason for closing up shop in Russia, that contributing to an economy that supports the violation of human rights isn't something the brand really wants to be associated with. Bad optics, in other words. Lots of other companies have similarly shut down in Russia and pulled out of operations.
Speaker 1: The country has actually converted at least some of the buildings that were left behind into copycat operations meant to fill in the void, so maybe they'll have their own Russian version of Tinder.

Speaker 1: Gizmodo reports that a Microsoft Windows update has created frustrations for some users out there when it comes to setting a default browser. Ideally, you should be able to choose any browser as your default and that's that. But after this update, apparently it has become harder to choose anything other than Microsoft's own browser, Edge, as the default, and switching to, you know, Chrome has presented some users with annoying pop-ups that essentially ask the same question over and over again, like, are you sure you want this as your default browser? The only way to fix this problem was to roll back the system update. Gizmodo goes further, though. So Google had incorporated a button within the Windows version of Google Chrome. It was in the upper left-hand corner of the browser window, and if you clicked on this button, it would let you automatically set Chrome as your default browser, and you wouldn't actually have to go into your computer's system settings in order to make this change. So they were just making it easier for you to select Google Chrome as your default if you wanted to. But that Microsoft Windows update broke this functionality within the browser. So professional users had a whole array of problems, but for consumers, your average people who are using Google Chrome, the button really just stopped working. Gizmodo found that by changing the name of the Chrome app, the problem went away. So suddenly the button worked again just by changing Google Chrome's name, and that was it. Like, they didn't change the code or anything. They just changed the name of the app, and once they did, then this barrier went away.
Speaker 1: That suggests that Microsoft was perhaps purposefully targeting Chrome itself, because if just a name change fixed that issue, and if other browsers weren't having similar problems, it seems like Microsoft might have been targeting Google Chrome, the most popular browser out there. Google, for its part, ultimately turned off that default browser button, which solved these problems, because of course Google was getting lots of complaints from people saying, hey, I set your browser as my default, but I keep getting hassled by these pop-ups. What's up? So they turned the button off. That ended those endless pop-ups that were frustrating users. But again, it suggests that Microsoft was perhaps trying to make it harder for folks to switch to some other browser. And y'all who were online in the late nineties might think this story sounds really familiar. That's because it's not that much different from the accusations that Netscape made against Microsoft in the early days of the web browser wars. The more things change, the more they stay the same.

Speaker 1: Finally, this past weekend, the World Wide Web celebrated the thirtieth anniversary of emerging from CERN's ownership to enter the public domain. CERN, which is the same organization that's behind the Large Hadron Collider, was where Tim Berners-Lee developed the basics of what would become the World Wide Web and web browsers, using stuff like hyperlinks to connect different documents together, even if those documents lived on different servers. And it was on April thirtieth, nineteen ninety three, that CERN made the decision to release the code into the public domain, and they relinquished all intellectual property rights to the code. Now, it still took some time for this web idea to catch on. I was actually in college when this was happening, and I distinctly remember looking at early web browsers in the computer lab in my college library and thinking, good grief, that's slow. Forget that, it takes forever for these web pages to load up.
Speaker 1: I'm just going to stick with FTP and Telnet. Why would you even use a web browser? Now, in my defense, there weren't that many web pages out there when I was first exposed to the web. They were incredibly primitive web pages and they didn't really change much. So you would go to one, and if you went to the same web page a month later, it would be exactly the same. There was really little reason to go to the same web page twice, in other words. But obviously, over time things changed, to put it lightly, and now, thirty years later, we can't talk about things like commerce and communication and professional networking and creative efforts, we can't talk about any of that, without also including the web as part of that conversation. So happy belated Emancipation Day, World Wide Web. And that's it for this episode, the tech news episode for May second, twenty twenty-three. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.