Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending July fifth, twenty twenty four.

Speaker 1: And y'all, one of the tricky things about designing a website is that you can't know which browsers your users are going to be relying upon when they visit it. So you might build a site that works great in certain current browsers, but if you go back a generation or three, maybe things aren't quite so seamless. But you still have people out there who are relying on those browsers. So what do you do? Do you try and design for the lowest common denominator? Well, for many web designers, one workaround for this problem resided in an online code library called Polyfill. The project is an open source one that would put JavaScript code up on a Polyfill account, and that would allow websites to include just a particular URL link in the website design, and the library would work with older browsers and allow them to display web pages properly, so that, you know, you would still see the way the web page was supposed to be laid out. So it offloaded a lot of work for web designers. You could just include this link and it would do the work for you.
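To make that concrete, here is a minimal sketch of the kind of thing a polyfill does. This is illustrative only, not the Polyfill project's actual code, which bundled many such shims behind a single script URL served from the polyfill.io domain.

```typescript
// A minimal sketch of what a polyfill does: if an older browser is missing a
// modern method, define a fallback so pages that rely on it still work.
// (Illustrative only -- not the actual code the Polyfill service shipped.)
if (typeof Array.prototype.includes !== "function") {
  Object.defineProperty(Array.prototype, "includes", {
    // Basic fallback for engines that predate Array.prototype.includes.
    value: function (this: unknown[], searchElement: unknown): boolean {
      return this.indexOf(searchElement) !== -1;
    },
    writable: true,
    configurable: true,
  });
}
```

The appeal was that a site could pull in one hosted script and get dozens of shims like this applied automatically, which is also exactly why whoever controls that script controls a piece of every page that loads it.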
Speaker 1: But then earlier this year, a Chinese company called Funnull, or Funnel, maybe it's Funnel, Funnel makes more sense, anyway, they purchased both the GitHub account that hosted this library as well as the domain name for the Polyfill site. And last week a cybersecurity company called Sansec alerted the world that what used to be that JavaScript code is now code that redirects visits to other websites, mainly ones related to porn or gambling. That's not great. The security firm also said that the code was designed so that it wasn't redirecting all the time, and this was probably an effort to hide the fact it was doing it at all. Right, like if it was only doing it in certain hours, then it was going to avoid detection longer. But it wasn't long before various web companies began to block the domain entirely, and the guy who first built Polyfill posted a message urging website administrators to remove links to the online code library. Ars Technica's Dan Goodin reports that nearly four hundred thousand sites are still linked to the library despite these warnings, including sites that are connected to the US federal government, which is a big old womp womp. So these aren't just little independent websites out there that are falling victim to this. Some of those websites are connected to massive companies and other organizations, you know, groups that should absolutely prioritize removing malicious links and code from their web pages. But I can't bust the USA's chops too much on this because, as Goodin reveals in his article, more than half of all the websites that are still linking to Polyfill are actually in Germany, which is a big old ach du lieber.

Speaker 1: In the end, this story shows that supply chain attacks can really be effective. That's when hackers aren't targeting end companies, organizations, or individuals. Instead, they target the tools and services that those end targets are relying upon. So you poison the supply chain and you hit a lot of targets. It's also a black mark against Chinese companies continuing to cause chaos online.
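For site administrators wondering whether their own pages still pull in that script, a quick audit is not hard. Here is a rough sketch of one way to do it; the ./public folder and the file extensions are assumptions for illustration, not anything from the story.

```typescript
// Rough sketch: walk a local web root and flag files that still reference
// the compromised polyfill.io domain. Paths and extensions are assumptions.
import { readFileSync, readdirSync } from "node:fs";
import { join } from "node:path";

const SUSPECT = /polyfill\.io/i; // the domain to look for

function scan(dir: string): string[] {
  const hits: string[] = [];
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    const path = join(dir, entry.name);
    if (entry.isDirectory()) {
      hits.push(...scan(path)); // recurse into subdirectories
    } else if (/\.(html?|js|ts)$/i.test(entry.name) && SUSPECT.test(readFileSync(path, "utf8"))) {
      hits.push(path); // this file still links to the library
    }
  }
  return hits;
}

console.log(scan("./public")); // list any pages that still need cleaning up
```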
Speaker 1: Speaking of supply chain attacks, what happens when the company that you count on for added security is the target of hackers? That's a question folks are asking after Authy, a two-factor authentication app, got hit by hackers. More specifically, the company that makes Authy, a company called Twilio, revealed that hackers had managed to access a limited amount of customer information, apparently limited to just around thirty three million cell phone numbers. Now, Authy is an app that generates codes meant to authenticate users as they log into various services. I actually have an Authy account so that I can log into Twitch, for example. So I think there's a certain assumption among users that the service is also secure, because it exists solely to aid in the security of other services. You think if it's a company that's in the security business, it should be pretty safe. And yet Twilio has confirmed that hackers accessed an, quote unquote, unauthenticated endpoint to steal the list of customer phone numbers. While that information has limited value, it does mean that the hackers might rely on the data to conduct phishing attacks, or more likely sell the data for cheap on a black market where other people can use it for phishing attacks and such.
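For anyone curious what an app like Authy is actually doing when it shows those six-digit codes, here is a bare-bones sketch of the standard time-based one-time password math (RFC 6238 style). The secret below is a made-up placeholder; real apps exchange a base32-encoded secret when you enroll a service, and both the app and the service compute the same code for the same 30-second window.

```typescript
// Bare-bones TOTP sketch: HMAC-SHA1 over a 30-second counter, then dynamic
// truncation down to six digits. Conceptual only -- the secret is fake.
import { createHmac } from "node:crypto";

function totp(sharedSecret: Buffer, nowMs: number = Date.now()): string {
  // 30-second time step, encoded as a big-endian 64-bit counter.
  const counter = Buffer.alloc(8);
  counter.writeBigUInt64BE(BigInt(Math.floor(nowMs / 1000 / 30)));

  const digest = createHmac("sha1", sharedSecret).update(counter).digest();
  const offset = digest[digest.length - 1] & 0x0f;   // dynamic truncation
  const code = digest.readUInt32BE(offset) & 0x7fffffff;
  return String(code % 1_000_000).padStart(6, "0");  // six digits, zero-padded
}

console.log(totp(Buffer.from("hypothetical-shared-secret")));
```

Note that the leaked phone numbers don't break this math; the risk is that they make targeted phishing of those users much easier.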
Speaker 1: We're not done with hacking news yet. A group called Shiny Hunters says its attack on Ticketmaster landed the group some really valuable information. I'm talking seriously valuable, like more than twenty two billion dollars valuable. And as such, the group has increased its initial ransom demand, which was originally one million dollars, up to eight million dollars. It has, in the immortal words of Darth Vader, altered the deal. Pray they do not alter it further. So what's going on here? All right, so the hackers breached Ticketmaster's systems back in May, I'm pretty sure I talked about it on a previous news episode, and in the process the hackers were able to access a ton of information, and that includes around four hundred and forty thousand tickets to Taylor Swift shows. You know, she's like the hottest ticket in town, no matter what town it is. And the hackers have all the information they need to do stuff like produce fraudulent but working tickets. They could do that because they have all the data. So imagine that you show up to a Swift concert and then you find out that your legitimate ticket that you purchased months ago no longer works because, you know, someone else has beaten you to it. And this is a ticket you purchased for some ungodly amount of money, because, let's face it, Ticketmaster is a real beast of a company, and it also has beastly convenience and processing fees to boot. But because hackers were able to steal your ticket information, they produced a copy. Maybe they produced a whole bunch of different copies. Maybe they scalped all those tickets to unsuspecting buyers. There could be one hundred other people who bought your ticket information, and they're also stuck waiting outside because whoever got there first is currently sitting in your seat, and they're waiting for Blank Space to start playing. That's actually the only Taylor Swift song I know off the top of my head.

Speaker 1: Anyway, the hackers also have information about all the people who have bought tickets. They have personally identifiable information, and that could mean that they could reach out to the customers and pose as Ticketmaster. They could say, hey, we have recovered your tickets, they were part of this breach, but we have it, we need to secure X amount of money in order to send you the updated information, and they're just exploiting you. That's a possibility. Maybe they sell your information online and other hackers use your information to conduct spear phishing campaigns against you. I mean, if you're the sort of person who has spent hundreds or even thousands of dollars on a concert ticket, then you could end up being a very attractive target for exploitation down the line. According to Hackread, the stolen information includes nearly a billion sales orders and half a billion unique email addresses, plus four hundred million encrypted credit card records with partial details unencrypted.
Speaker 1: Now, the encrypted credit card information at the very least means the hackers don't have immediate access to that information. Encryption is a tough thing to break, particularly if you're using really good encryption, so they might not ever be able to get that credit card information. But this is a really ugly hack that has affected millions of Ticketmaster customers. So what's the company going to do? Well, I do not know, but I bet this is not going to look good in the antitrust lawsuit that the US government has brought against Live Nation, which is Ticketmaster's parent company.

Speaker 1: Now, do you think we're done with hacker stories this week? Don't bank on it, literally. Ransomware hackers targeted the Patelco Credit Union in California, according to Jon Brodkin of Ars Technica. We're going to have a lot of Ars Technica stories for the second half of this episode. But apparently the hackers used a phishing email to trick someone within the organization into activating malware that quickly began encrypting data in Patelco's systems, and it locked that information away from the credit union. Just as the credit card information being encrypted means that hackers can't easily get to the credit cards, well, if hackers encrypt all of an organization's data on their systems and their servers, then the organization has no access to their legitimate information.
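To see why encrypted data is a brick wall without the key, and why that same property is exactly what ransomware weaponizes, here is a toy illustration. It is purely conceptual and has nothing to do with Patelco's actual systems; the sample text and key handling are made up.

```typescript
// Toy illustration: without the key, ciphertext is just noise. The property
// that protects stored card data is the same one ransomware turns against a
// victim who doesn't hold the attacker's key. Conceptual sketch only.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const key = randomBytes(32); // 256-bit key -- imagine only the attacker has this
const iv = randomBytes(12);  // nonce for AES-GCM

const cipher = createCipheriv("aes-256-gcm", key, iv);
const locked = Buffer.concat([cipher.update("member account records"), cipher.final()]);
const tag = cipher.getAuthTag();

console.log(locked.toString("hex")); // unreadable without the key

// Only with the key (plus nonce and tag) does the original data come back:
const decipher = createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(tag);
console.log(Buffer.concat([decipher.update(locked), decipher.final()]).toString());
```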
Speaker 1: So among the many services that have been disrupted by this massive attack is online banking, which is a big one. The actual attack happened on June twenty ninth, and the credit union chose to shut down several of its services, sort of as a protective measure, to prevent the hack from spreading throughout the entire system. So, according to the credit union, that includes stuff like, quote, transactions, transfers, payments, and deposits, end quote. You know, the basic functionality of a bank. So direct deposits were also affected, but according to the bank, cash and check deposits are still working. So that sounds like, for the time being, Patelco customers will have to go to a physical location in order to make deposits or withdrawals. They may also have had their personal information compromised as part of this attack. In fact, Patelco says you should assume that's the case. They have also said that the credit union will work with law enforcement to provide protection to those customers. Now, if I were to guess, I would say that would be things like credit protection and maybe some ID theft protection that will last for like a year. That's a pretty common thing that companies will offer in the wake of a breach like this, but this is a particularly bad one. It does really illustrate the fact that companies need to really drill home the proper security measures that employees need to follow in order to avoid these kinds of attacks. Hackers will take any advantage they can to do this sort of thing, and they will target organizations that are particularly vulnerable, like banking. Medical organizations are another big one, because there's a huge incentive for the company to pay off the ransom and regain access to all that information. But as I've always said, keep in mind, paying the ransom is typically a bad idea. One, there's no guarantee you're going to get everything back, or that the hackers aren't going to keep copies of all the information and then sell it on the black market. Two, paying the ransom sends the message, hey, these attacks work, they make money, and then hackers will just step it up. So paying ransoms is typically pretty bad.
Speaker 1: But at the same time, if it's a mission-critical kind of thing, I get how it's hard to just shrug your shoulders and say, well, we're just going to take a loss on this one. Okay, we're going to take a quick break. When we come back, we've got some more tech news stories to cover.

Speaker 1: Okay, we're back, and we've got some more Ars Technica stories, because there were a ton of good ones this week. So Ashley Belanger of Ars Technica, she's actually got a couple of stories in this week's episode, has a disturbing piece about AI, and it's titled AI trains on kids' photos even when parents use strict privacy settings. So this piece is all about how AI companies with image generators have been using posted photos across the web to train those models, even in cases where the platforms that are hosting these photos have specific rules against data scraping, or platforms where parents have settings where they can opt into denying permission for the use of their children's pictures, so they can explicitly say, I do not want these photos used for anything else. And yet it appears that these image generator models have still been using those kinds of images to train up, and that's awful. It is an enormous violation of privacy. And researchers with Human Rights Watch have discovered that these companies have hundreds of photos of children from vulnerable populations. That makes it even more horrifying. It's not just kids, which is already bad enough, but kids from disadvantaged communities where they don't have access to the kinds of tools or services that others might have to fight this kind of thing. Not that fighting it is that easy in the first place, but it's even harder for these folks. The researchers said that the metadata connected to these images sometimes also includes personal information about the children, which is obviously an even bigger privacy and security risk, and the generator also creates images based off these reference photos.
Speaker 1: Right? Like, image generator companies say that their AI isn't plagiarizing off of other people, just as AI text generator companies say that the text generator doesn't plagiarize. But there have been plenty of cases where people have pointed out, hey, that's not entirely true. Like, you can spot elements that seem to be directly lifted from source material, and if not directly lifted, so heavily influenced by that source material as to constitute a copy. So I think it's really important to read this piece. There's a lot more that Ashley Belanger writes about in her article. I highly recommend reading it. Again, that's on Ars Technica if you want to check out the full story.

Speaker 1: Now, she also has a piece titled Tool preventing AI mimicry cracked; artists wonder what's next. This is kind of related, because it also has to do with AI generation, and specifically image generation. So as the headline indicates, a tool that some online artists use called Glaze has recently been called into question as to whether or not it is a really great defense. So Glaze works by inserting data into images, and that data alters the images in ways that aren't noticeable by humans. In a way, you could say it corrupts the information of the image itself, and a computer that's scanning these images doesn't know that the superfluous data isn't necessary. It thinks it's part of the image, because computers aren't looking at pictures, they're looking at the information that makes up that picture and replicating or working off of that. So if you're poisoning the images by inserting some meaningless information that doesn't really show up in the finished picture when you're looking at it as a human being, then the computer thinks, oh, well, this is a necessary component of this kind of image for this particular style. Like, it's looking at the data and saying, oh, this is the artist who created this image. If someone asks me to create an image in the style of this artist, I will take this data in an effort to produce that kind of image. But because of the poison, right, because of the superfluous data, it might take that noise and boost the noise. So what you'll end up with is an image that does not look like the reference material, which, well, that's the whole point of Glaze. It's to poison the reference material so that artists can retain their unique styles and not worry about computers copying it.
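As a very loose illustration of the general idea, and emphatically not Glaze's actual algorithm, which computes carefully targeted, style-specific perturbations rather than random noise, here is a toy sketch of nudging pixel values by amounts a viewer won't notice but a scraper or training pipeline still ingests.

```typescript
// Toy sketch of image "cloaking": tiny per-channel nudges that a person can't
// see but that change the raw numbers a model actually consumes.
// This uses uniform random noise as a stand-in; it is NOT Glaze's method.
function cloakPixels(pixels: Uint8ClampedArray, strength = 2): Uint8ClampedArray {
  const cloaked = new Uint8ClampedArray(pixels.length);
  for (let i = 0; i < pixels.length; i++) {
    // Shift each channel by at most +/- `strength` out of 255.
    const nudge = Math.floor(Math.random() * (2 * strength + 1)) - strength;
    cloaked[i] = pixels[i] + nudge; // Uint8ClampedArray clamps to 0..255
  }
  return cloaked;
}

// e.g. cloakPixels(imageData.data) on an ImageData object pulled from a canvas
```

The real research question, as the article gets into, is whether such perturbations survive once models are retrained or the images are re-processed, which is exactly where the doubts about Glaze come in.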
Speaker 1: But now there are a pair of problems facing artists who want to use Glaze. One is just that doing so requires going through an approvals process with Glaze, and the demand for the tool has exceeded the team's capacity for keeping up with those requests, so there's a bottleneck there. The other problem is that some researchers have come forward saying that Glaze's methods aren't really bulletproof, and that AI will inevitably evolve to defeat these protections. So it's kind of like a seesaw approach, and we've actually seen that in other security measures. Like, CAPTCHAs are a great example. You know, experts would design a new test that in theory is easy for humans to do but hard for machines, but then eventually the computer scientists train up machines so that they can do these tests as well as or sometimes even better than humans can, and it requires a complete redesign of the CAPTCHA test, and so it goes. The same thing could be going on in image-generating AI and the efforts to foil it. And again, to learn more about this, read Belanger's article on Ars Technica. She does a phenomenal job breaking it all down. Again, that's titled Tool preventing AI mimicry cracked; artists wonder what's next.

Speaker 1: Now, I'm sure all of y'all out there have had the experience of setting up a new television and scrolling through all the options to find out how the heck you can turn off motion smoothing.
Speaker 1: This is that feature that removes motion blur, and that might look great if you're watching a live sports event, but for everything else, well, a lot of people really hate that effect, including me. This is what gives everything that kind of soap opera look. You could argue that the reason why classic films and television look the way they do really comes down to a combination of limitations on the technology as well as the costs of production. But it means that we have certain concepts that we associate with what looks like cinema or looks like TV, and motion smoothing kind of violates that. Well, back in early June, Roku turned on motion smoothing by default and there's no way to turn it off, which has prompted William Joel of The Verge to write a very entertaining piece titled Dear Roku, you ruined my TV. So Joel writes about how Roku has removed the choice from users, forcing on them an experience that many people do not like. Well worth the read. It's over on The Verge. Go check that out. Particularly, you should read it if you happen to be an executive at Roku and you're wondering why your customers are so agitated.

Speaker 1: It's been a year since Meta launched its competitor to X, formerly known as Twitter. Meta's platform is called Threads, which takes its name from earlier abandoned Meta projects, and this week Mark Zuckerberg announced that Threads hit one hundred and seventy five million users, which is impressive, but also shows that Meta users have not been adopting Threads as quickly as they have other platforms like Instagram. Zuckerberg did not go into detail on stuff like daily users or anything like that, and if I were a betting man, I would wager that the reason Zuckerberg did not share those numbers is that they aren't very impressive, because I'm guessing that on a daily basis people just aren't going to Threads that much. Yes, there's one hundred and seventy five million users total, but how many of those are going to Threads regularly?
Speaker 1: So the question is, will Threads gain more purchase and user mind share? And also, how is X doing during all this? Honestly, I have no clue. My perception is that things at X aren't going great, but that's largely down to, you know, these ongoing challenges the company is facing when it comes to convincing advertisers that the ads they are paying for are not going to show up next to hate speech.

Speaker 1: In November twenty twenty three, Amazon launched an ambitious product called Astro, which is a home robot, a little wheeled, cutesy robot that can roll around your house and keep an eye, well, you know, keep cameras and sensors on how things are going. And the company also introduced an enterprise version of that bot, Astro for Business, so a device intended for corporations and such. Now, less than a year after launch, Amazon has announced it is discontinuing the enterprise version. Customers that bought one will receive a full refund, which is around twenty three hundred and fifty bucks, plus a few hundred dollars in credit, because their security system is going to stop working. Once Amazon shuts down the servers on September twenty fifth, it will brick these little robots. The company has said it will continue to develop robotics for the home, so it sounds like the consumer version of Astro will continue to receive support, at least for now, and that Amazon is apparently working on successors to that twenty twenty three model. As for Astro for Business models, they cannot be switched to work as consumer versions, so Amazon is sending customers shipping labels so they can ship these former security robots back off to Amazon, where they can go to the recycling center, which seems like a pretty sad fate for the cute little fellas.

Speaker 1: Okay, that's it for the tech news for the week ending July fifth, twenty twenty four. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production.
Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.