Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for, let's see, Thursday. That's it, September seventh, twenty twenty three. Sorry, holiday weeks always mess me up, but let's get to the news.

First up, we have an update on the European Union's efforts to push back against big tech companies via the Digital Markets Act, or DMA. The European Commission has designated six tech companies as quote unquote gatekeepers of the digital market in the EU. Now, to qualify as a gatekeeper, a company has to have more than forty-five million end users per month and more than ten thousand business users per year, to be active in at least three member states of the EU, and to have either a market cap of at least seventy-five billion euros or a turnover of seven point five billion euros. The six companies that have the honor of being called gatekeepers include Alphabet (that's Google's parent company), Amazon, Apple, Meta, Microsoft, and ByteDance, which is TikTok's parent company, just in case you've forgotten. Now, I've talked a little bit this year about how Amazon has tried to fight this designation as gatekeeper. Amazon's argued that it doesn't really qualify under those terms, but it appears those arguments have fallen on deaf ears. The EU has singled out some specific platforms and services at these companies that will have to comply with new rules under the DMA. Those platforms include stuff like Apple's Safari browser and iOS, Google's Android, Chrome browser, Google Search, and Google Ads, and Meta's Facebook, Instagram, and Messenger services. Those are just examples; that's not all of them. In fact, the EU identified twenty-two services across the big six, including some for the companies I mentioned that I didn't go into. The rules are meant to give EU citizens more control over the services they use.
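If it helps to see all of those thresholds in one place, here's a little sketch of the qualification test written out as code. To be clear, this is just my own illustration of the criteria as I described them, not anything published by the Commission, and the company numbers plugged in at the bottom are completely made up.

```python
def is_gatekeeper(monthly_end_users: int,
                  yearly_business_users: int,
                  eu_member_states: int,
                  market_cap_eur: float,
                  annual_turnover_eur: float) -> bool:
    """Rough sketch of the DMA gatekeeper thresholds described above."""
    size_test = (market_cap_eur >= 75e9            # EUR 75 billion market cap...
                 or annual_turnover_eur >= 7.5e9)  # ...or EUR 7.5 billion turnover
    reach_test = (monthly_end_users > 45_000_000       # end users per month
                  and yearly_business_users > 10_000)  # business users per year
    footprint_test = eu_member_states >= 3             # active in 3+ member states
    return size_test and reach_test and footprint_test

# A hypothetical company with invented numbers:
print(is_gatekeeper(monthly_end_users=60_000_000,
                    yearly_business_users=25_000,
                    eu_member_states=10,
                    market_cap_eur=120e9,
                    annual_turnover_eur=30e9))  # True
```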
So, for example, one rule states that a company like Apple has to make all the preinstalled apps on iOS devices uninstallable and replaceable with third-party applications. In other words, if you get an iPhone, then you should be able to remove all the preinstalled Apple apps and just replace them with third-party apps if you want to. Now, that is, to put it lightly, antithetical to Apple's approach, and there are a lot of other rules; that's just one example. And they really do push back hard against some of the key strategies that tech companies have been employing over the last decade, so this is a big deal. The gatekeepers have six months to comply with these rules. If they are found in violation of the rules, the EU can fine the company up to ten percent of its total worldwide turnover. Remember, we're talking like seven and a half billion euros at least here, so ten percent of that. But if a company continues to break the rules, that actually can go up to a twenty percent fine, which is, you know, twice as much if my math is correct. Also, the Commission asserts that it has the authority to force companies to sell off parts of their businesses if they're unable to comply with the DMA. So I'll be really interested to see how the gatekeepers respond to this. My guess is unfavorably.

Microsoft has released information about an infiltration attack that ultimately had a Chinese-backed hacker group called Storm-0558 gaining access to email systems belonging to more than two dozen high-value targets. We're talking about, like, big companies and also big government agencies. And it sounds like the way this attack happened was a convergence of a whole bunch of stuff that wasn't supposed to happen; it was like a perfect-storm situation for the hackers. So first up is an expired consumer account signing key, kind of like a key to get into a system. I mean, that's kind of what this is.
And the key should not have been able to, but could, create tokens that would allow access to stuff like Microsoft's Azure service. Azure is Microsoft's cloud computing platform. But the beginning of the story actually dates all the way back to April twenty twenty one. So, see, the system that this key was on was meant to be a very heavily protected system. The only person allowed to access this workstation was a specific engineer who had been thoroughly vetted by Microsoft, because they were working in a production development environment that other people were absolutely not supposed to be able to access. That also meant the workstation was not allowed to have several basic applications and services on it, because those are frequently attack vectors for hackers. So there was no email on this thing, no web access on it, collaboration tools couldn't be on it, that kind of thing. Essentially, the workstation was approaching air-gap status, although it still had network connectivity, so it wasn't truly air-gapped. If it had been, then we'd have a different conversation going on here. It also had multi-factor authentication protection. And all of this makes you wonder, well, then, how the heck did the hackers get access to this? If it was such a heavily protected workstation, how did they manage to get this key that had already expired, and how did they then use it to get access to all these different organizations' email systems? Well, again, back in April twenty twenty one, this particular workstation crashed, and so Windows performed a crash dump process. This is when the computer takes the data that's in the computer's memory and then saves that data to long-term storage. And the reason for that is that someone can later look at the data and kind of see what actually happened: what caused the crash? Is there something that needs to be addressed to prevent it from happening in the future?
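Now, I'm a podcaster, not a Windows kernel engineer, so take this as a toy sketch rather than anything like the real crash-dump machinery. But the core idea is this simple, and it shows why whatever happens to be sitting in memory at crash time, secrets included, lands in the dump file:

```python
import json
import time

def write_crash_dump(process_memory: dict, path: str) -> None:
    """Toy 'crash dump': snapshot everything in memory to long-term storage
    so an engineer can inspect the crash later. Note there is no filtering;
    whatever was in memory at crash time ends up in the file."""
    dump = {"timestamp": time.time(), "memory": process_memory}
    with open(path, "w") as f:
        json.dump(dump, f)

# Hypothetical process state at the moment of the crash:
memory = {
    "request_queue": ["req-41", "req-42"],
    "last_error": "NullReference in token_service",
    "signing_key": "-----BEGIN PRIVATE KEY----- ...",  # oops: a secret in memory
}
write_crash_dump(memory, "crash_2021_04.dmp")
```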
Well, it turns out that this expired consumer account signing key was in the computer's memory at the time of the crash, so it was part of this crash dump. Now, it was supposed to not make it through the next step, because Microsoft scans that data to look for things that could potentially be a security vulnerability, and to remove them, before transporting the data over to the debugging team. And that step failed. It did not detect that this key was just sitting there, like a shiny key hidden in a big pile of dirty old data. So they didn't think there was anything problematic there, and they moved it on over to the debugging environment. And sometime this year, hackers were able to compromise a different Microsoft engineer's corporate account, and that gave them access to this debugging environment, which normally wouldn't have anything in it that would be particularly dangerous. But they were able to find that key, that gleaming, shining signing key, in the crash dump data. And then how could they use an expired key, an expired consumer key at that (it wasn't even an enterprise key), to forge tokens that would allow them to access enterprise accounts on Azure? Well, Microsoft revealed that in twenty eighteen the company created a new framework that messed things up. This new framework couldn't actually validate signing keys properly; it couldn't really tell the difference between a consumer key and an enterprise key. And unless system administrators had taken some pretty extraordinary steps, rather than assuming that Microsoft had addressed this and automated the validation process (which is what you would assume; all of this should be taken care of on Microsoft's side, so there was no obvious reason to take extraordinary steps), your system was potentially vulnerable to this attack. And that's in fact what happened.
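If you want a feel for the class of bug being described, here's a deliberately oversimplified sketch. This is my own illustration, not Microsoft's actual code or token format: the validator below checks that a token's signature matches a known key, but never checks whether that key has expired or whether it's scoped to consumer rather than enterprise accounts.

```python
import hashlib
import hmac

# Toy key registry. A real system would use asymmetric keys plus metadata.
KEYS = {
    "consumer-key-2016": {"secret": b"aaaa", "scope": "consumer", "expired": True},
    "enterprise-key-2023": {"secret": b"bbbb", "scope": "enterprise", "expired": False},
}

def sign(key_id: str, claims: str) -> str:
    return hmac.new(KEYS[key_id]["secret"], claims.encode(), hashlib.sha256).hexdigest()

def validate_buggy(key_id: str, claims: str, signature: str) -> bool:
    """The flawed check: the signature must match a known key, and that's it."""
    expected = sign(key_id, claims)
    return hmac.compare_digest(expected, signature)
    # Missing: reject if KEYS[key_id]["expired"]
    # Missing: reject if KEYS[key_id]["scope"] doesn't match the account type in claims

# An attacker holding the old consumer key can mint a token for an enterprise mailbox:
claims = "user=ceo@bigcorp.example;type=enterprise"
forged = sign("consumer-key-2016", claims)
print(validate_buggy("consumer-key-2016", claims, forged))  # True, and it shouldn't be
```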
That was the perfect storm, and a lot of it ends up being the sole responsibility of Microsoft itself. It actually gets way more technical than the overview I just gave; I gave a very, very oversimplified version of what happened. But if you would like to read up on the details, I recommend checking out an article in Ars Technica titled "Microsoft finally explains cause of Azure breach: An engineer's account was hacked." It goes into more detail and explains on a technical level what was happening, so if you want to check that out, I recommend it.

Okay, we have now come up to the section of tech news I like to call AII, because it's all about AI. And leading the charge is G/O Media; that's the parent company of Gizmodo. According to The Verge, on August twenty-ninth, G/O Media fired the entire staff of Gizmodo en Español, which, as I'm sure you've gathered, is the Spanish-language version of Gizmodo. The company replaced the staff with AI translators, so they would translate the English-language articles into Spanish. Now, articles on the Spanish-language site include a bit at the bottom of the page that says something like: contents have been automatically translated from the original; due to the nuances of machine translation, there can be slight differences. Well, readers have reported some slight differences, and some major ones too. They say that this tool is far from perfect. They say there are examples of articles that start in Spanish, but then somewhere partway through, they change to English and stay that way.
As many have pointed out in recent months, the shift toward using AI for content creation, or even auto-translation, isn't necessarily as cost-saving or labor-saving as you might first imagine, because you have issues with factual errors, technical hiccups, and the AI hallucinating and inventing information that is incorrect. And that means humans still have to pore over the AI's work to make sure that the work is right. At some point you have to ask yourself: isn't it just easier and smarter to use humans rather than have to deal with all the instances of robot goof-ups? But then, what do I know? I'm just a dumb human.

Earlier this year, a music creator with the handle Ghostwriter made headlines when they released a song they had written called Heart on My Sleeve. This was the song that was voiced by an AI version of real human artists Drake and The Weeknd, and that prompted a big old brouhaha in the music industry. Music companies argued that it was a copyright violation. That doesn't track for me at all, because Ghostwriter apparently wrote the actual song. So Ghostwriter has the copyright for the song, at least the lyrics and presumably the music, and you can't copyright a person's voice. So I don't know where you could actually argue, you know, in a valid way, that this is a copyright violation. But the issue raised a lot of questions, and it pointed out that there are some big old gaps in IP law as far as AI-generated content goes. And now The Verge reports that, one, Ghostwriter has released a new song called Whiplash that features AI impressions of Travis Scott and 21 Savage, and two, Ghostwriter has posted a message stating that they're going to release a record of music with impersonated voices if the artists who were impersonated sign off on it. So it's not that Ghostwriter is saying, this is going out, you know, I'm gonna sell this whether you like it or not.
Ghostwriter is saying: if you consent, then we'll start selling this and I'll direct royalties to you. So you're gonna make money and you never had to do anything. Now, according to Harvey Mason Jr., head of the Recording Academy, the other thing Ghostwriter wants to do is a distinct possibility, which is to submit Heart on My Sleeve for Grammy consideration. The Recording Academy is the organization that essentially runs the Grammys, and the head of the Academy says, well, from a creative standpoint, they totally can submit this song for consideration, because it was written by a human. It might not have been performed by one, but it was written by one, and that's all that matters. However, there are other metrics that the song has to meet in order to qualify for consideration. These are metrics that involve things like distribution, and because so many platforms pulled the song after being pressured by the music labels, chances are this song does not meet those qualifications. So I don't think it's going to be considered for a Grammy, not because of the creative side, but because of the business side. But according to the head of the Recording Academy, it totally could be Grammy-eligible. Interesting.

Okay, we've got a lot more stories to cover before we wrap up today, but first let's take a quick break to thank our sponsors.

NBC News reports that platforms like Instagram and TikTok are being inundated with sexualized AI-generated content. Essentially, we're talking about chatbots and AI-generated images here. The news agency found thirty-five app developers running more than one thousand ads for these kinds of services on Meta's various platforms, and that's notable for a few reasons. A big one is that Meta has really cracked down hard on sexualized content from human beings, so people like sex workers have been pushed off the platform. Companies that make sexual aids and toys have been denied the ability to run ads on the platform.
And yet here we have this influx of AI-powered, adult-oriented services that aren't just appearing on the platform; they're running ads. Like, this is a paid service; they're working with Meta. And despite the fact that Meta has cracked down on this in those other situations, they don't seem to have done it here. The same thing, or a very similar thing, is happening over at TikTok, though NBC only found fourteen app developers running ads over there. And there's a lot going on here. Obviously, there's this double standard that disenfranchises, you know, human beings who are working in this space, but doesn't do that to AI. And then there's the concern about security and privacy. This is my big concern, because we already know that AI can be real loosey-goosey with your private information, right? Like, if the AI model is taking the stuff you communicate to it into account, then you are feeding that AI model, which can then regurgitate the stuff you fed to it to other people. So maybe you don't want to express your most intimate desires and preferences to an AI chatbot; it just might not turn out so well for you in the long run. Anyway, NBC News rightfully raises the question as to why these AI-powered ads seem to be getting a pass when it would be against the rules for a human to post something similar. I don't know, maybe the robots already got to them.

Starting in November, Google will require political ads that rely on AI for content generation to prominently disclose the involvement of the AI. Specifically, Google will require ads that contain quote synthetic content end quote that appears to show realistic people or events to be labeled as such. So let's say, for example, that you had a political ad, and it shows a certain politician appearing at a certain event, and that event is absolutely overflowing with a huge, enthusiastic, supportive audience.
But perhaps in reality they did appear at an event, but it was poorly attended, so AI has been used to generate this crowd. Well, the ad would need to disclose that it had used AI to augment those images. I don't think it would have to get as granular as to say what exactly was done to the images, but it would have to alert you that AI was used as part of that ad's creation. Now, there is a threshold here. If someone were just using AI to do some minor tweaks, like removing red-eye or something like that, they don't have to disclose that; that doesn't meet the criteria. But if you're doing something like making someone appear somewhere they weren't, or with someone who was not there, anything like that would have to be disclosed.

All right, we are now done with AI for this episode, but I'm not done with Google. YouTube is removing some control options for content creators when it comes to ads. Right now, a content creator has a decent amount of freedom over where they will allow ads to run against monetized content. So, for example, a lot of ASMR artists will allow pre-roll ads (these are the ads that play before a video plays), but they turn off mid-roll ads, which of course play in the middle of a video. Like, the video stops, you get an ad, and then the video keeps going, kind of like the ads in this podcast. Also, they'll turn off post-roll ads; those are the ads that play after a video has finished playing. And the reason why ASMR artists turn off mid-roll and post-roll ads as a rule is because they can be really jarring to listen to if you are using ASMR to try and relax or to lull yourself to sleep, because chances are having some obnoxious ad play right after the video ends will kind of spoil the effect. Anyway,
YouTube's changes mean that creators can select whether or not ads can play on either side of a video, but they don't get to choose whether those ads will be pre-roll or post-roll or both. They can say, yes, I will allow ads to play before or after, but they don't get to say which one; it's just before-or-after collectively, and YouTube gets to decide whether the ads will play before the video, after the video, or both. They also are not going to be able to choose whether or not the ads are skippable or non-skippable. There are also going to be some other changes, to things like mid-roll ads as well, but they're not quite to the extreme that I just mentioned. And for a lot of content creators this may not be that disruptive, but for folks in the meditation or ASMR or relaxation spaces, it's causing a lot of anxiety.

Heads up to Chrome users: Google has been rolling out an enhanced ad privacy feature. Some might say this feature is perhaps a bit misleadingly named, because the name would have you think, ah, this feature will help keep my data private from advertisers. Now, it kind of does that, but it kind of doesn't. This ties into an application programming interface called Topics, the Topics API, and what's going on here is that Chrome uses your browser history to serve you targeted ads. So, based upon the kinds of sites you visit, how often you visit them, and how long you stay there, Chrome will be able to serve you ads that it judges are more relevant to your interests. Topics is supposed to help replace third-party cookies, something that Chrome will stop supporting in the not-too-distant future. Cookies can act as trackers for your behavior across the web. So the benefit of Topics, according to Google, is that it doesn't hand your browser history over to advertisers. It doesn't explicitly say, oh, you went to site X, site Y, and site Z.
Instead, what Google does is hold on to that information, and it just indicates the types of stuff you're interested in. So for me, it might tell an advertiser something about me without going into details. For example, it would not say, oh yeah, Jonathan, he's on the iFixit page like five times a week. It wouldn't say that. Instead, it might say Jonathan is interested in technology, gadgets, DIY repair, et cetera. And you can argue that's an improvement, right? Like, it's not as explicit as listing out all the different specific sites you went to. But the way Google has rolled this out has left some people upset, because users are seeing a pop-up that alerts them to the enhanced ad privacy feature, and what this pop-up really does is opt you into this browser-history-based targeting; you're enabling Google to use your browser history as a way to target ads to you. Like, that's what happens if you just hit "Got it": it opts you in. And people are saying, oh, the name of the feature makes it sound like hitting "Got it" opts you out. The opposite is true: it opts you in. And in order to opt out, you then have to go to your browser settings, go to the privacy settings, and find a selection called ad privacy. There you can go in and turn off this feature, but you have to do it manually at that point, because you've already said "Got it." Well, they got me, because I said "Got it." I probably didn't read it properly. But even if I had, there's a chance I would have just thought, oh, by clicking "Got it," I've opted out of this program, when in fact the opposite was true. And when I went into the settings, that's what I saw: the settings were all turned on, and I needed to turn them off manually.
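To picture the difference between handing over a browsing history and handing over topics, here's a toy sketch of the idea. The site-to-topic mapping and the category names are mine, invented for illustration; Chrome's actual Topics API uses a standardized taxonomy and runs the classification inside the browser.

```python
# Invented mapping for illustration; the real taxonomy is much larger.
SITE_TOPICS = {
    "ifixit.example": ["Technology", "DIY Repair"],
    "gadgetnews.example": ["Technology", "Gadgets"],
    "bakingblog.example": ["Food & Drink"],
}

def topics_for_advertiser(history: list[str], limit: int = 3) -> list[str]:
    """Share only coarse interest categories, never the visited URLs themselves."""
    counts: dict[str, int] = {}
    for site in history:
        for topic in SITE_TOPICS.get(site, []):
            counts[topic] = counts.get(topic, 0) + 1
    ranked = sorted(counts, key=counts.get, reverse=True)
    return ranked[:limit]

# Five iFixit visits plus a couple of other sites come out as broad interests:
history = ["ifixit.example"] * 5 + ["gadgetnews.example", "bakingblog.example"]
print(topics_for_advertiser(history))  # ['Technology', 'DIY Repair', 'Gadgets']
```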
So I just want to let you folks know that, so that if you use Chrome, you can go into those privacy settings, check that ad privacy section, and see if it's on. And, you know, if you have no problem with that, that's fine, just leave it on. But if, like me, you thought you were opting out and it turns out you were opting in, you might want to make some changes.

Okay, I've got some more news stories to come, but before we jump into all of that, let's take another quick break.

Okay, quick story here. The FAA is extending how far UPS is allowed to fly delivery drones. This week the FAA updated its rules and will allow deliveries that go beyond line of sight. Previously, an operator or a spotter needed to be able to maintain sight of a delivery drone as it was dropping off a package, which, of course, you could argue eliminates the benefits of having a delivery drone in the first place. So now, for the first time, companies will not have to employ spotters to keep an eye on a drone as it makes its way to the drop-off. The days of robots delivering our small packages are closer than ever.

Next up: have you ever had the problem of running out of disk space? Maybe it was on a video game console, and it told you that no, you cannot download Starfield because your console's disk drive is already full, so you're gonna have to uninstall some stuff to make room. Or maybe you've encountered it on a work or home PC. Well, if you have, just be comforted in knowing that you are not alone, because last week a disk capacity issue shut down Toyota. Like, all of Toyota's assembly plants in Japan had to shut down, because some of the company's servers detected that there was insufficient disk space to continue operations, and so operations stopped continuing.
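For what it's worth, the kind of check that trips here is pretty mundane. Here's a minimal sketch of the sort of guard a server might run before accepting more work; the threshold and the error handling are made up, and I'm obviously not claiming this is Toyota's actual code.

```python
import shutil

MIN_FREE_BYTES = 50 * 1024**3  # made-up threshold: 50 GiB

def can_continue_operations(data_dir: str) -> bool:
    """Refuse new work when free disk space drops below the threshold."""
    usage = shutil.disk_usage(data_dir)  # named tuple: (total, used, free)
    return usage.free >= MIN_FREE_BYTES

# "." stands in for the server's data directory in this sketch.
if not can_continue_operations("."):
    # In a production system, this is where order processing would halt.
    raise SystemExit("Insufficient disk space; halting operations.")
```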
This raised concerns that perhaps the auto company had been the target of hackers, but in fact it turned out to be a pretty simple problem with a pretty simple solution: Toyota transferred the data to a server that had much more storage capacity, and then things got back on track. But it really does show that the little frustrations that can irritate us as end users can cause much larger problems in other contexts.

The Mozilla Foundation recently conducted a study on the privacy practices of a certain segment of technology, and they found that this particular sector underperforms across the board when it comes to privacy. That segment is, drum roll please, the automobile industry. According to the Mozilla Foundation, ninety-two percent of auto manufacturers give drivers very little or no control over how their private information is gathered and used. More than eighty percent of them will share driver data with outside partners. Out of the twenty-five car companies that the Foundation studied, not a single one of them even met the minimum privacy standards that they had established (they being the Foundation, I should say). Plus, the Foundation said, these companies aren't just collecting and sharing data; they're collecting more information than they need in order to provide services to drivers. This reminds me of how platforms like Facebook used to let developers get access to all sorts of data even if their app didn't need that data to operate. That's part of what led to the whole Cambridge Analytica scandal, in fact, and it turns out a very similar thing is playing out in our vehicles, to an extent anyway. Some other stuff the study discovered includes the fun fact that more than half of auto manufacturers are willing to share driver information with law enforcement upon request.
Contrast that with Internet-based platforms, which are known to exert a pretty decent amount of effort to try and keep user data secure, because otherwise you lose the trust of your user base. The car companies don't seem to have that same perspective. Seventy-six percent of those auto companies claimed that they actually have the right to sell driver information to data brokers; like, they just have that right inherently, and it doesn't require you to consent to it. If you're wondering which company scored the lowest on the test results, that would be Tesla. And I know I have a tendency to criticize Tesla a lot on the show, but it's stuff like this, I would argue, that kind of justifies that approach. As for what you as a driver can do about this, the answer is: not a whole lot. The Mozilla Foundation has called on car companies to make substantive changes to how they collect and utilize data, because this problem is too big for individual drivers to tackle in any meaningful way.

Hey, do you remember Clubhouse? That's the app that made a real big splash in twenty twenty one. It let people create virtual spaces where they could broadcast audio to an audience. So it's kind of like you could be a live-streaming podcaster, or you could even be a DJ if you wanted to be. You could have shows with guests and allow folks to listen in. And initially, Clubhouse had this air of exclusivity. Originally the app was invitation-only, and it was also limited to the iOS platform at first, and there's nothing like an exclusive club to make people want to become a member. But the shine wore off of Clubhouse, and while the app did have sort of a meteoric rise, it faded from conversations not too long after that. Now the company has laid off about half its staff, and it's looking to reframe Clubhouse as more of a messaging app than a live audio app.
So rather than being a bunch of virtual town halls that could be hosted by anybody, you know, from a no-one like me to an actual celebrity, the focus now is for Clubhouse users to form groups with people they actually know, like real-life friends and family. In fact, it sounds a lot like what Twitter was intended to be when it was first launched. It wasn't thought of as, like, broadcasting to everyone; the use case was more that you would follow people you actually know, so that you could keep up with what they were doing. Now users will be leaving audio messages for each other. I'm not really sure how this is at all different from other messaging apps that also incorporate audio elements, and I'm not sure if Clubhouse can actually leverage this approach to regain relevancy in the market. But then, I never actually joined Clubhouse, because I'm not one of the cool kids, so I'm out of the loop here. I don't know what I'm talking about, I guess.

Now, I was going to talk about a secret rocket launch at Cape Canaveral this week, one that isn't part of SpaceX, isn't part of NASA, isn't part of any other, like, space agency. But it turns out this secret launch was scrubbed and didn't happen. Technically it was a secret, for at least a while, but folks figured out pretty quickly that it was a US military operation, and that it was most likely meant to conduct a test launch of hypersonic missiles. As the term suggests, these are missiles that travel many times faster than the speed of sound (hypersonic generally means Mach 5 and up), and with this incredible speed and their maneuverability, they would be more capable of avoiding anti-missile weaponry. But today, or rather yesterday, I should say, the military scrubbed those plans. They also acknowledged that this was meant to test hypersonic missile technology, but the reason for scrubbing was very vague; they just said that during pre-flight checks, they had to make the call to scrub the launch.
This would have been the first surface test of a US hypersonic missile, but other countries like China, which actually leads the world in this technology, and Russia have already deployed, and in Russia's case even used, hypersonic-capable weaponry. We haven't heard the last of this, I don't think. Or, I mean, maybe we won't hear it, we'll see it. I mean, we will hear it, because with hypersonic you'll get a sonic boom afterward, and then an actual boom if it's a missile. I'm getting off track now.

If you were going to remake Alanis Morissette's song Ironic today, you might include this last news item in there. So, Rockstar Games, the company behind popular franchises like Grand Theft Auto, has included some of the same technology the company previously campaigned against in some of its games that are now selling on Steam. This all has to do with digital rights management, or DRM. The purpose of DRM is to prevent or discourage piracy, though critics often argue that pirates will find ways around DRM, so the only people who have to deal with DRM are actual valid customers. That means you're just making the experience worse for people who were already paying for the experience, while the people who didn't want to pay just found ways to get around the protections you had put in place. Anyway, a hacker group called Razor 1911, years and years and years ago, created a bunch of cracks for certain Rockstar Games titles. These were pieces of software meant to get around DRM, really just files that allow you to bypass DRM. And the interesting thing is that Rockstar has now put some of those old titles up for sale on Steam. But Rockstar needed a way to get around the DRM that they themselves had put on these old games, and to do that, what the company apparently has done is include some of the very same cracks made by those hackers.
This is a hacker group that Rockstar Games was very gung ho on going up against back in the day, and now they're using the hackers' own tools, because these old games have DRM on them that otherwise would make the playing experience suboptimal. BleepingComputer uncovered this and has screenshots of code that indicate that, yeah, some of those cracks do appear in certain Rockstar game files, which is pretty wild stuff.

Okay, I've got a couple of article recommendations for you all this week. First up is an article in Wired titled "The Burning Man Fiasco Is the Ultimate Tech Culture Clash." Obviously, this goes into detail about how thousands of people found themselves stranded in the desert at the Burning Man festival while torrential rains moved through the area. But it's also about how a cultural subset essentially appropriated, and, you know, took over, hijacked, an art festival, and how that in turn has changed the festival itself. Well worth a read.

The second recommendation I have is titled "Airbnb bookings dry up in New York as new short-stay rules are introduced." I think this is a really interesting read. It's in The Guardian, and it's only partly tech related; I would argue a lot of people put Airbnb in tech company status, so it kind of fits in that regard. But it's also really about how the tech startup culture that's centered around disruption can have some really negative consequences. In this case, the disruption was meant to be to the hospitality industry, right? That's what Airbnb is taking aim at: hotels and cabin rentals and things like that, in an effort to, you know, democratize it to some extent, and also just to make a ton of cash in the process. But the response from New York was: we need to make rules to curtail this, because people who own properties, rather than selling them, are renting them out for these short stays.
And meanwhile, we have a housing crisis in the city; there's not enough housing for the people who need it, and part of the reason is that there are all these landlords renting out these spaces short-term. So we're going to make it really, really hard for them to do that. The rules are making it very challenging to run an Airbnb kind of business in New York City, in order to address this issue. I just thought it was interesting to see the interplay between disruption and legislative response, so I recommend that one as well.

Okay, that's it for the tech news for Thursday, September seventh, twenty twenty three. I hope you're all well, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.