1 00:00:04,480 --> 00:00:12,399 Speaker 1: Welcome to tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,400 --> 00:00:15,760 Speaker 1: and welcome to tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,800 --> 00:00:18,799 Speaker 1: I'm an executive producer with iHeart Podcasts and How the 4 00:00:18,840 --> 00:00:22,040 Speaker 1: tech are you. It's time for the tech news for 5 00:00:22,120 --> 00:00:26,600 Speaker 1: the week ending on August sixteenth, twenty twenty four. So 6 00:00:26,640 --> 00:00:30,720 Speaker 1: for this first news item, I wanted to bring a 7 00:00:30,840 --> 00:00:35,080 Speaker 1: guest onto the show to talk about it in further detail, 8 00:00:37,920 --> 00:00:42,200 Speaker 1: joining tech Stuff to give us her expertise is friend 9 00:00:42,200 --> 00:00:45,920 Speaker 1: of the show, Shannon Morse. She's a hacker, she's a YouTuber, 10 00:00:46,120 --> 00:00:49,320 Speaker 1: she's a ren Fair enthusiast, and I'm proud to call 11 00:00:49,360 --> 00:00:52,320 Speaker 1: her a friend. Shannon, Welcome back to tech Stuff. 12 00:00:53,640 --> 00:00:55,560 Speaker 2: Hi, thank you so much for having me. You kind 13 00:00:55,560 --> 00:00:57,920 Speaker 2: of surprised me when you said the ren Fair enthusiast. Yes, 14 00:00:58,000 --> 00:00:58,840 Speaker 2: I am well. 15 00:00:59,320 --> 00:01:01,600 Speaker 1: To be fair, so am I I worked ren Fair 16 00:01:01,640 --> 00:01:05,520 Speaker 1: from nineteen ninety nine to twenty nineteen, so so cool. 17 00:01:05,600 --> 00:01:07,920 Speaker 1: We're not alone in this. But I've brought you on 18 00:01:08,160 --> 00:01:10,720 Speaker 1: not to chat about Renfair, although that would be a 19 00:01:10,880 --> 00:01:13,680 Speaker 1: heck of a podcast, but to talk about next time. 20 00:01:13,800 --> 00:01:16,560 Speaker 1: Next time we'll talk about that, talk about the tech 21 00:01:16,760 --> 00:01:20,639 Speaker 1: of Renaissance era Europe. I thought we would talk about 22 00:01:20,680 --> 00:01:26,600 Speaker 1: this hacking attack that has exposed billions of records for people, 23 00:01:26,680 --> 00:01:28,800 Speaker 1: not just in the United States, but in Canada and 24 00:01:28,880 --> 00:01:31,840 Speaker 1: other places as well. I wanted to hear from you 25 00:01:32,280 --> 00:01:36,200 Speaker 1: sort of what happened, who was involved, and what are 26 00:01:36,240 --> 00:01:38,160 Speaker 1: the implications of this attack. 27 00:01:38,920 --> 00:01:42,160 Speaker 2: Well, let's go ahead and start with what exactly happened. 28 00:01:42,560 --> 00:01:46,520 Speaker 2: So way back in April of twenty twenty four, so 29 00:01:46,640 --> 00:01:51,240 Speaker 2: this year, there was this Twitter account that found a 30 00:01:51,360 --> 00:01:54,200 Speaker 2: leak online and they were like, oh, this seems like 31 00:01:54,280 --> 00:01:58,800 Speaker 2: it came from a cybercrime constituent that is called USDOD 32 00:01:58,960 --> 00:02:01,560 Speaker 2: and nobody really knew what was going on. It wasn't 33 00:02:01,640 --> 00:02:05,960 Speaker 2: until July twenty first, twenty twenty four, that about four 34 00:02:06,080 --> 00:02:09,800 Speaker 2: terabytes of data was released on breach forums. And breach 35 00:02:09,880 --> 00:02:13,560 Speaker 2: forums is basically this cybercrime community on the dark Web 36 00:02:13,720 --> 00:02:18,560 Speaker 2: where they often leak private and personal information of regular people. 
37 00:02:18,880 --> 00:02:21,720 Speaker 2: This happens really often, and in this case they figured 38 00:02:21,720 --> 00:02:24,520 Speaker 2: out and they claim that it was from a data 39 00:02:24,520 --> 00:02:30,160 Speaker 2: broker called National Public Data dot com, and this was later confirmed. 40 00:02:30,280 --> 00:02:35,840 Speaker 2: So around mid August, just gosh, yesterday, people started confirming 41 00:02:35,880 --> 00:02:39,120 Speaker 2: that this was from the data broker National Public Data dot com. 42 00:02:39,600 --> 00:02:42,960 Speaker 2: The amount of data did amount to about two point 43 00:02:43,040 --> 00:02:46,680 Speaker 2: nine billion records. The unfortunate part is a lot of 44 00:02:47,760 --> 00:02:51,160 Speaker 2: journalistic media outlets are saying that it was the data 45 00:02:51,160 --> 00:02:54,520 Speaker 2: of almost every American, or that two point nine billion people 46 00:02:54,960 --> 00:02:58,120 Speaker 2: had their data leaked, and that's not necessarily the case, 47 00:02:58,520 --> 00:03:02,240 Speaker 2: so that was a little misconstrued. It's actually records, and 48 00:03:02,280 --> 00:03:04,920 Speaker 2: there could be records on deceased people, there could be 49 00:03:05,040 --> 00:03:09,480 Speaker 2: multiple records on one person, there could be records on businesses, 50 00:03:09,639 --> 00:03:11,680 Speaker 2: and all of these seem to be the case when 51 00:03:11,720 --> 00:03:14,560 Speaker 2: it comes to these two point nine billion records. But 52 00:03:14,760 --> 00:03:20,000 Speaker 2: this does include everything from names and addresses to, in 53 00:03:20,040 --> 00:03:24,040 Speaker 2: some cases, Social Security numbers and some other personal information. 54 00:03:24,160 --> 00:03:25,560 Speaker 2: So it's a pretty serious issue. 55 00:03:25,680 --> 00:03:28,800 Speaker 1: Yeah, when it comes to, if you're just talking about 56 00:03:29,000 --> 00:03:33,400 Speaker 1: data breaches that affect or potentially could affect the average person, 57 00:03:33,480 --> 00:03:36,440 Speaker 1: this one seems pretty bad. From what I do understand, 58 00:03:36,440 --> 00:03:39,680 Speaker 1: the data broker in question typically would do things like 59 00:03:39,760 --> 00:03:41,840 Speaker 1: background checks for various companies. 60 00:03:42,120 --> 00:03:44,880 Speaker 2: Yes, and that's kind of unfortunate because a lot of 61 00:03:45,000 --> 00:03:49,800 Speaker 2: businesses probably used this company for background checks, and there's 62 00:03:49,840 --> 00:03:52,480 Speaker 2: no real way of knowing, like when you first got 63 00:03:52,480 --> 00:03:54,880 Speaker 2: employed with a company or if you signed up for 64 00:03:54,920 --> 00:03:58,560 Speaker 2: a new credit card, if they're using this business 65 00:03:58,600 --> 00:04:01,520 Speaker 2: for background checks. So you really have no idea which 66 00:04:01,600 --> 00:04:04,800 Speaker 2: data brokers have your data. And if National Public Data 67 00:04:04,840 --> 00:04:06,600 Speaker 2: did have your data, well. 68 00:04:06,680 --> 00:04:11,360 Speaker 1: Clearly the data was unencrypted, because otherwise it would be 69 00:04:11,520 --> 00:04:14,600 Speaker 1: very difficult to make any use of this information.
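For listeners who want a concrete picture of what "encrypted at rest" means here, below is a minimal, hypothetical sketch using the Python cryptography library's Fernet recipe. The record contents and field values are made up purely for illustration, and a real data broker would keep keys in a dedicated key management service rather than next to the data.

```python
# Minimal sketch of symmetric encryption at rest, assuming the third-party
# "cryptography" package is installed (pip install cryptography).
from cryptography.fernet import Fernet

# In production the key would live in a key management service or HSM,
# never alongside the data it protects. Generated inline here only for the demo.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"Jane Doe, 123 Example St, SSN 000-00-0000"  # hypothetical record
ciphertext = fernet.encrypt(record)   # this is what should sit on disk or in the database
plaintext = fernet.decrypt(ciphertext)

# Stolen ciphertext without the key is just opaque bytes; stolen plaintext is a breach.
assert plaintext == record
print(ciphertext[:40], b"...")
```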
I 70 00:04:14,640 --> 00:04:17,640 Speaker 1: don't want to just make an assumption or have my 71 00:04:17,800 --> 00:04:20,680 Speaker 1: bias inform this, so I thought I would ask you directly, Shannon, 72 00:04:20,880 --> 00:04:24,560 Speaker 1: in your opinion, is the storage of that sort of 73 00:04:24,680 --> 00:04:27,599 Speaker 1: data in an unencrypted form? Is that what you would 74 00:04:27,640 --> 00:04:29,799 Speaker 1: perhaps call bad security practices? 75 00:04:32,200 --> 00:04:34,440 Speaker 2: I think we could both agree here, assuming that you're 76 00:04:34,760 --> 00:04:39,120 Speaker 2: of the same opinion as I am. Absolutely, having any of 77 00:04:39,120 --> 00:04:42,680 Speaker 2: that kind of data unencrypted or easily accessible on a 78 00:04:42,760 --> 00:04:46,880 Speaker 2: storage format is really really bad, especially when it comes 79 00:04:46,920 --> 00:04:51,360 Speaker 2: to a data broker. Unfortunately, data brokers are legal here 80 00:04:51,400 --> 00:04:55,400 Speaker 2: in the United States, and we really have no safety 81 00:04:55,480 --> 00:05:00,359 Speaker 2: regulations or regulatory powers controlling what 82 00:05:00,440 --> 00:05:03,080 Speaker 2: data brokers can and cannot do with that data. So 83 00:05:03,600 --> 00:05:05,880 Speaker 2: they should have been taking care of that data in 84 00:05:05,880 --> 00:05:08,839 Speaker 2: an encrypted format, especially if they want to make money 85 00:05:08,839 --> 00:05:11,159 Speaker 2: off of it. Because now that all this data has leaked, 86 00:05:11,760 --> 00:05:14,760 Speaker 2: chances are, just from a business standpoint, this data broker 87 00:05:14,800 --> 00:05:18,160 Speaker 2: will not make as much money using those background checks 88 00:05:18,160 --> 00:05:19,440 Speaker 2: because people aren't going to trust 89 00:05:19,279 --> 00:05:21,279 Speaker 1: them as much. Now, sure, there's going to be class 90 00:05:21,279 --> 00:05:24,760 Speaker 1: action lawsuits and such directed against the company. I hope so. Yeah. 91 00:05:24,800 --> 00:05:28,800 Speaker 1: So I imagine that that will have an impact on their 92 00:05:28,839 --> 00:05:31,679 Speaker 1: revenue as well. Thank you for pointing out the lack 93 00:05:31,920 --> 00:05:34,680 Speaker 1: of regulation and safety net here in the United States. 94 00:05:34,680 --> 00:05:37,520 Speaker 1: Obviously in places like the EU, there's been a lot 95 00:05:37,560 --> 00:05:40,839 Speaker 1: of work done to try and get a 96 00:05:40,880 --> 00:05:43,400 Speaker 1: handle on that, because it is tricky. I think here 97 00:05:43,440 --> 00:05:45,840 Speaker 1: in the US, all of our attention tends to go 98 00:05:45,880 --> 00:05:48,760 Speaker 1: to the end points, like the social platforms where people 99 00:05:48,880 --> 00:05:52,640 Speaker 1: are sharing info, but we ignore, like, all the big 100 00:05:52,680 --> 00:05:55,960 Speaker 1: companies in the background that are literally buying and selling 101 00:05:56,040 --> 00:05:57,119 Speaker 1: that info all the time.
102 00:05:57,600 --> 00:06:01,240 Speaker 2: I mean, we've seen this since the early Facebook days 103 00:06:01,240 --> 00:06:04,599 Speaker 2: with Cambridge Analytica, if you remember that big data leak. 104 00:06:04,720 --> 00:06:09,720 Speaker 2: Like, it's a very, very similar scenario here, where this background 105 00:06:09,760 --> 00:06:13,679 Speaker 2: company that nobody really knows exists has so much data 106 00:06:13,680 --> 00:06:17,320 Speaker 2: and so many profiles, and they're building these profiles around us, 107 00:06:17,560 --> 00:06:20,039 Speaker 2: and we have no say in the matter. They just 108 00:06:20,160 --> 00:06:23,600 Speaker 2: do it without our knowledge, without our consent. And then 109 00:06:23,640 --> 00:06:26,160 Speaker 2: we hear about one of these breaches and we're like, well, 110 00:06:26,200 --> 00:06:28,440 Speaker 2: I've never done business with them, why do they have 111 00:06:28,520 --> 00:06:31,800 Speaker 2: my information? Why did this happen to me? So unfortunately 112 00:06:31,920 --> 00:06:34,960 Speaker 2: this is the case in the United States, and luckily 113 00:06:35,000 --> 00:06:37,080 Speaker 2: we do have some options in terms of what we 114 00:06:37,120 --> 00:06:40,080 Speaker 2: can do to protect ourselves. But it's not one hundred percent, 115 00:06:40,120 --> 00:06:41,400 Speaker 2: and I don't know if it ever will be. 116 00:06:41,600 --> 00:06:45,359 Speaker 1: Yeah. Like, the suggestions I give to people include things like, 117 00:06:46,160 --> 00:06:48,400 Speaker 1: when it comes to accounts you already use, if you 118 00:06:48,560 --> 00:06:52,920 Speaker 1: haven't activated multi factor authentication, you should just do that 119 00:06:53,000 --> 00:06:56,600 Speaker 1: as a matter of practice on everything that offers it. 120 00:06:56,720 --> 00:06:59,520 Speaker 1: And if you're concerned about someone making use of, say, 121 00:06:59,520 --> 00:07:02,160 Speaker 1: your Social Security number and your address, like they have 122 00:07:02,240 --> 00:07:04,800 Speaker 1: all the details they need for identity theft, you can 123 00:07:04,920 --> 00:07:08,120 Speaker 1: put a freeze on your credit with the three major 124 00:07:08,360 --> 00:07:10,680 Speaker 1: companies here in the United States. It's a pain in 125 00:07:10,720 --> 00:07:12,600 Speaker 1: the butt to do it. It's a pain in the 126 00:07:12,640 --> 00:07:15,680 Speaker 1: butt to thaw it if you freeze it, but it's less 127 00:07:15,720 --> 00:07:18,720 Speaker 1: of a pain in the butt than someone taking out 128 00:07:19,240 --> 00:07:24,280 Speaker 1: a loan under your name. So, yeah, this is bad. It's 129 00:07:24,320 --> 00:07:26,480 Speaker 1: bad for all of us because there was nothing that 130 00:07:26,560 --> 00:07:29,640 Speaker 1: the average person could have done to protect themselves from 131 00:07:29,640 --> 00:07:31,239 Speaker 1: this particular attack. 132 00:07:31,600 --> 00:07:34,480 Speaker 2: And even though your data may not have been involved 133 00:07:34,480 --> 00:07:36,840 Speaker 2: in this, you can go to a website called 134 00:07:36,920 --> 00:07:40,200 Speaker 2: Have I Been Pwned, that's pwned with a P, dot 135 00:07:40,240 --> 00:07:43,720 Speaker 2: com, to see if your data was leaked in this 136 00:07:43,760 --> 00:07:47,160 Speaker 2: one as well as other previous data breaches. You can 137 00:07:47,280 --> 00:07:50,280 Speaker 2: see if your data was indeed leaked in any of those.
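A side note for the technically curious: the password-checking side of Have I Been Pwned exposes a free "range" endpoint that uses k-anonymity, so you can check a password against known breach corpora without ever sending the full password or even its full hash. Here is a minimal sketch; the example password is made up, and the email/account breach lookup Shannon describes is a separate API that requires an API key.

```python
# Minimal sketch: check a password against the free Pwned Passwords range API.
# Only the first five characters of the SHA-1 hash ever leave your machine.
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    """Return how many times this password appears in known breaches (0 = not found)."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "HASH_SUFFIX:COUNT"; look for our suffix.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    print(pwned_count("password123"))  # hypothetical weak password; expect a large count
```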
138 00:07:51,120 --> 00:07:53,160 Speaker 2: That can help you kind of get a mindset of 139 00:07:53,560 --> 00:07:57,120 Speaker 2: where is my data, who has my data currently? And 140 00:07:57,160 --> 00:07:59,840 Speaker 2: then you can also, you know, like you said, do 141 00:07:59,880 --> 00:08:04,280 Speaker 2: the credit freeze at Experian, Equifax, and TransUnion. That's extremely 142 00:08:04,320 --> 00:08:06,480 Speaker 2: important and one of the first steps you should take. 143 00:08:06,760 --> 00:08:10,120 Speaker 2: I recommend that to anybody who is curious about, like, 144 00:08:10,200 --> 00:08:13,000 Speaker 2: if their Social Security number has gotten out there, or 145 00:08:13,080 --> 00:08:17,280 Speaker 2: any kind of personally identifiable information that would allow for 146 00:08:17,520 --> 00:08:20,840 Speaker 2: any kind of identity theft. That's really important. And using 147 00:08:21,520 --> 00:08:25,480 Speaker 2: kind of proper security hygiene online can certainly help as well, 148 00:08:25,840 --> 00:08:29,080 Speaker 2: because you will run into phishing scams when people find 149 00:08:29,120 --> 00:08:31,640 Speaker 2: out your email address and your name. You might run 150 00:08:31,680 --> 00:08:36,080 Speaker 2: into spam texts or phishing texts or spammy calls. You 151 00:08:36,160 --> 00:08:38,800 Speaker 2: might run into the same thing with people sending you 152 00:08:38,840 --> 00:08:42,600 Speaker 2: spammy and phishing emails. So if you have these kinds 153 00:08:42,600 --> 00:08:45,320 Speaker 2: of issues, or if you see them starting to rise, 154 00:08:45,679 --> 00:08:48,800 Speaker 2: then look into signing up for a password manager, which 155 00:08:48,840 --> 00:08:51,360 Speaker 2: can make your life a lot easier when you're auditing 156 00:08:51,400 --> 00:08:54,240 Speaker 2: how many accounts you have online and making sure that 157 00:08:54,280 --> 00:08:56,839 Speaker 2: you're changing those passwords, because you can stay up to 158 00:08:56,920 --> 00:09:00,480 Speaker 2: date and just use, like, auto generation tools in password 159 00:09:00,480 --> 00:09:05,040 Speaker 2: managers to make really, really good, strong, protective passwords that 160 00:09:05,160 --> 00:09:07,320 Speaker 2: even you don't know, but the password manager does, so 161 00:09:07,360 --> 00:09:10,640 Speaker 2: it's fine. You won't lose your entry into your profiles; 162 00:09:10,679 --> 00:09:14,800 Speaker 2: your password manager will help you. Use two factor authentication 163 00:09:15,000 --> 00:09:18,880 Speaker 2: to protect your accounts, especially if passwords get leaked, because 164 00:09:19,120 --> 00:09:22,360 Speaker 2: even if you are using a different password on every website, 165 00:09:22,559 --> 00:09:25,680 Speaker 2: if one of your profiles gets leaked, that profile could 166 00:09:25,679 --> 00:09:27,640 Speaker 2: be hacked. So you might as well set up two 167 00:09:27,640 --> 00:09:30,960 Speaker 2: factor authentication and make sure none of those accounts get hacked. 168 00:09:31,080 --> 00:09:33,760 Speaker 2: And using proper security hygiene when it comes to public 169 00:09:33,840 --> 00:09:38,520 Speaker 2: networks, like using VPNs and making sure you're not logging into 170 00:09:38,559 --> 00:09:42,599 Speaker 2: accounts on public Wi-Fi, and using secure networks is 171 00:09:42,640 --> 00:09:46,199 Speaker 2: really, really good practice.
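To make the password-manager and two-factor advice above a bit more concrete, here is a small, hypothetical sketch: the first part generates the kind of long random password an auto-generation tool produces, and the second derives the rotating six-digit codes an authenticator app shows, using the third-party pyotp library. The shared secret is generated on the spot purely for illustration; a real site issues it to you once, usually as a QR code.

```python
# Sketch of what a password manager's generator and an authenticator app are doing.
# Assumes the third-party pyotp package is installed (pip install pyotp).
import secrets
import string

import pyotp

def generate_password(length: int = 20) -> str:
    """Long random password, the kind an auto-generation tool stores for you."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Time-based one-time passwords: the site and your authenticator app share a secret,
# and both derive the same six-digit code from it every 30 seconds.
shared_secret = pyotp.random_base32()   # illustrative only; a real site issues this
totp = pyotp.TOTP(shared_secret)
current_code = totp.now()

print("generated password:", generate_password())
print("current 2FA code:", current_code, "valid:", totp.verify(current_code))
```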
When it comes to data brokers, 172 00:09:46,679 --> 00:09:50,240 Speaker 2: it's extremely hard to manually opt out because there's hundreds 173 00:09:50,280 --> 00:09:54,520 Speaker 2: of them, including National Public Data dot com. But you can 174 00:09:54,640 --> 00:09:57,560 Speaker 2: sign up for a data broker opt out tool like 175 00:09:57,760 --> 00:10:00,560 Speaker 2: DeleteMe. That's the one that I use, and I 176 00:10:00,640 --> 00:10:02,920 Speaker 2: have paid for it as a customer long before they 177 00:10:03,000 --> 00:10:06,200 Speaker 2: knew about my YouTube channel. They will go in and 178 00:10:06,360 --> 00:10:09,600 Speaker 2: opt you out of having your data on the data brokers, 179 00:10:09,840 --> 00:10:13,000 Speaker 2: and they do it quarterly, because data brokers often will 180 00:10:13,040 --> 00:10:15,800 Speaker 2: repurchase your data and put it back on their platforms, 181 00:10:16,160 --> 00:10:19,479 Speaker 2: so people can continuously go to these data broker websites 182 00:10:19,559 --> 00:10:23,000 Speaker 2: and search for your profile and find your address, your 183 00:10:23,040 --> 00:10:26,160 Speaker 2: phone number, your name, the names of your kids. Like, 184 00:10:26,240 --> 00:10:29,720 Speaker 2: it's pretty invasive what they can do. So I've been 185 00:10:30,160 --> 00:10:34,920 Speaker 2: paying for data broker opt out services for almost a 186 00:10:35,000 --> 00:10:38,720 Speaker 2: decade now, and it's definitely helped with, like, clearing the 187 00:10:38,800 --> 00:10:41,280 Speaker 2: kind of data that's out there that I have no 188 00:10:41,360 --> 00:10:44,240 Speaker 2: control over and just kind of taking a step towards 189 00:10:44,240 --> 00:10:45,000 Speaker 2: my security. 190 00:10:45,160 --> 00:10:49,360 Speaker 1: That's great advice, and obviously, like, these are steps that 191 00:10:49,600 --> 00:10:52,480 Speaker 1: hopefully one day will no longer be necessary. We'll have 192 00:10:52,960 --> 00:10:55,719 Speaker 1: the protective measures in place that make them moot. 193 00:10:55,760 --> 00:10:59,079 Speaker 1: Man, I hope so. I do too. I got... hope springs eternal, Shannon. 194 00:10:59,480 --> 00:11:05,120 Speaker 1: That's... I gotta hope or else I wither away. Well, yes, Shannon, 195 00:11:05,160 --> 00:11:07,840 Speaker 1: thank you so much for joining the show and letting 196 00:11:07,920 --> 00:11:11,040 Speaker 1: us know more about this hack and what it means 197 00:11:11,080 --> 00:11:14,040 Speaker 1: and what people can do. I think that helps alleviate 198 00:11:14,120 --> 00:11:16,960 Speaker 1: a lot of anxiety when people start to hear about 199 00:11:16,960 --> 00:11:22,480 Speaker 1: the steps they can take to best protect themselves. I 200 00:11:22,520 --> 00:11:24,959 Speaker 1: want to thank Shannon again, and we're gonna take a 201 00:11:25,000 --> 00:11:27,680 Speaker 1: quick break. When we come back, we have more news 202 00:11:27,720 --> 00:11:41,360 Speaker 1: to talk about. Mark Zuckerberg probably gets lots of requests, 203 00:11:41,559 --> 00:11:45,120 Speaker 1: being the head of Facebook and all. This week, nineteen 204 00:11:45,200 --> 00:11:48,160 Speaker 1: of those requests came from members of the United States 205 00:11:48,200 --> 00:11:50,839 Speaker 1: Congress, who would very much like to hear why Meta 206 00:11:50,880 --> 00:11:54,760 Speaker 1: has allowed ads for such spicy stuff as cocaine and 207 00:11:54,920 --> 00:11:59,120 Speaker 1: ecstasy to appear on Facebook and Instagram.
I mean, these 208 00:11:59,400 --> 00:12:03,480 Speaker 1: substances aren't legal, so why is Meta allowing advertisers to 209 00:12:03,559 --> 00:12:07,440 Speaker 1: run ads for them on these very popular platforms? An 210 00:12:07,520 --> 00:12:13,280 Speaker 1: investigation from the Tech Transparency Project, or TTP, prompted this inquiry. 211 00:12:13,760 --> 00:12:17,120 Speaker 1: In the investigation, the TTP found more than four hundred 212 00:12:17,120 --> 00:12:20,920 Speaker 1: examples of ads for drugs of varying degrees of legality 213 00:12:21,160 --> 00:12:24,680 Speaker 1: on these platforms, and lawmakers would like Zuckerberg to explain 214 00:12:24,760 --> 00:12:27,160 Speaker 1: how this could happen. In a letter to the CEO, 215 00:12:27,640 --> 00:12:32,000 Speaker 1: the legislators wrote, quote, this was not user generated content 216 00:12:32,240 --> 00:12:35,920 Speaker 1: on the dark Web or on private social media pages, 217 00:12:36,200 --> 00:12:40,959 Speaker 1: but rather they were advertisements approved and monetized by Meta. 218 00:12:41,000 --> 00:12:44,760 Speaker 1: Many of these ads contained blatant references to illegal drugs 219 00:12:44,760 --> 00:12:48,880 Speaker 1: in their titles, descriptions, photos, and advertiser account names, which 220 00:12:48,920 --> 00:12:51,800 Speaker 1: were easily found by the researchers and journalists at the 221 00:12:51,800 --> 00:12:57,240 Speaker 1: Wall Street Journal and Tech Transparency Project using Meta's ad library. However, 222 00:12:57,480 --> 00:13:00,719 Speaker 1: they appeared to have passed undetected or been ignored by 223 00:13:00,720 --> 00:13:05,480 Speaker 1: Meta's own internal processes. End quote. Now, according to the TTP, 224 00:13:05,880 --> 00:13:08,480 Speaker 1: many of these ads prompted users to click over to 225 00:13:08,720 --> 00:13:12,680 Speaker 1: a Telegram account to complete any shopping that they wished 226 00:13:12,800 --> 00:13:18,319 Speaker 1: to do. Meta has until September sixth to respond. Now, 227 00:13:18,360 --> 00:13:21,559 Speaker 1: I can't say that I'm terribly surprised by this story. Personally, 228 00:13:21,640 --> 00:13:25,880 Speaker 1: I have encountered so many suspicious ads on Facebook that 229 00:13:26,000 --> 00:13:29,360 Speaker 1: I suspect the quality control aspect of the Ads division 230 00:13:29,480 --> 00:13:33,280 Speaker 1: is purposefully lax. Now, in my case, the ads weren't 231 00:13:33,400 --> 00:13:37,720 Speaker 1: for illegal drugs, but rather they were ads 232 00:13:38,040 --> 00:13:41,319 Speaker 1: that were fake. I mean, they were posing as other companies, 233 00:13:41,760 --> 00:13:44,760 Speaker 1: like some other entity, in an attempt to fool consumers 234 00:13:45,040 --> 00:13:50,160 Speaker 1: into shopping for shoddy knockoffs. The big example I can 235 00:13:50,200 --> 00:13:52,320 Speaker 1: think of off the top of my head was the Sam 236 00:13:52,440 --> 00:13:55,000 Speaker 1: Ash music stores in the United States. They went out 237 00:13:55,000 --> 00:13:58,319 Speaker 1: of business and closed down, and I saw ad after 238 00:13:58,400 --> 00:14:03,240 Speaker 1: ad after ad posing as Sam Ash, but they weren't Sam Ash. 239 00:14:03,280 --> 00:14:06,280 Speaker 1: They were some other fly-by-night company trying to 240 00:14:06,320 --> 00:14:11,040 Speaker 1: sell knockoffs or sometimes just boxes of literal trash rather 241 00:14:11,120 --> 00:14:15,280 Speaker 1: than whatever item you thought you were going to get.
Now. 242 00:14:15,920 --> 00:14:18,920 Speaker 1: I have reported these ads to Facebook, and frequently I 243 00:14:18,960 --> 00:14:22,120 Speaker 1: received the message that after review, Facebook determined there were 244 00:14:22,120 --> 00:14:26,560 Speaker 1: no violations of its policies. Meanwhile, the actual legitimate companies 245 00:14:26,640 --> 00:14:30,240 Speaker 1: like Sam Ash would post warnings on their pages about 246 00:14:30,280 --> 00:14:33,840 Speaker 1: scams like this. I suspect that unless lawmakers make it 247 00:14:33,920 --> 00:14:37,320 Speaker 1: really hurt to engage in irresponsible ad partnerships, we're not 248 00:14:37,360 --> 00:14:40,200 Speaker 1: going to see any significant change on this front. In 249 00:14:40,240 --> 00:14:43,800 Speaker 1: the world of tech and politics, the Trump campaign has 250 00:14:43,840 --> 00:14:48,600 Speaker 1: been hit with a spear phishing attack that compromised campaign assets. 251 00:14:48,840 --> 00:14:52,280 Speaker 1: The Trump campaign identified hackers backed by Iran as the 252 00:14:52,320 --> 00:14:57,480 Speaker 1: party responsible, and now Google's Threat Analysis Group, or TAG, 253 00:14:57,880 --> 00:15:00,960 Speaker 1: has released a statement saying that yes, they have observed 254 00:15:00,960 --> 00:15:05,680 Speaker 1: that the hacker group APT forty two, which is linked 255 00:15:05,720 --> 00:15:11,120 Speaker 1: to Iran, targeted both the Trump campaign and the Harris campaign, 256 00:15:11,200 --> 00:15:16,040 Speaker 1: Kamala Harris's campaign, as well as President Biden's campaign when 257 00:15:16,040 --> 00:15:20,560 Speaker 1: he was actively campaigning. TAG also stated that subsequent attacks 258 00:15:20,560 --> 00:15:23,960 Speaker 1: have been unsuccessful, but there remains an ongoing attempt to 259 00:15:24,000 --> 00:15:27,440 Speaker 1: compromise accounts belonging to people who are close to Trump 260 00:15:27,560 --> 00:15:31,560 Speaker 1: and Harris slash Biden. The assumption moving forward is that 261 00:15:31,640 --> 00:15:35,600 Speaker 1: all elections will face potential external interference from threat actors, 262 00:15:35,720 --> 00:15:39,320 Speaker 1: both domestic and foreign. So, you know, that's fun. It's 263 00:15:39,360 --> 00:15:42,720 Speaker 1: a reminder that democracy isn't something that can just fend 264 00:15:42,760 --> 00:15:45,120 Speaker 1: for itself. Now, if you want to learn more about this, 265 00:15:45,280 --> 00:15:48,640 Speaker 1: I recommend Kevin Purdy's article on Ars Technica. It is 266 00:15:48,680 --> 00:15:53,600 Speaker 1: titled Google's threat team confirms Iran targeting Trump, Biden, and 267 00:15:53,720 --> 00:15:58,840 Speaker 1: Harris campaigns. More than a decade ago, New Zealand police officers, 268 00:15:58,920 --> 00:16:03,360 Speaker 1: responding to a request from the FBI, raided the massive home 269 00:16:03,400 --> 00:16:07,000 Speaker 1: of Kim Dotcom. Dotcom, formerly known as Schmitz, 270 00:16:07,280 --> 00:16:11,320 Speaker 1: formed the company Megaupload, which allowed users to create 271 00:16:11,320 --> 00:16:14,320 Speaker 1: cloud based storage for all sorts of stuff and then 272 00:16:14,400 --> 00:16:17,400 Speaker 1: give other folks access to the files that were in 273 00:16:17,440 --> 00:16:20,360 Speaker 1: that storage.
So obviously a lot of people used the 274 00:16:20,440 --> 00:16:22,680 Speaker 1: service to serve as kind of a trading ground for 275 00:16:22,800 --> 00:16:28,280 Speaker 1: pirated material. Music, TV shows, films, games, software, and more 276 00:16:28,440 --> 00:16:31,440 Speaker 1: all found a home on Megaupload. While the company 277 00:16:31,480 --> 00:16:35,480 Speaker 1: wasn't responsible for making that stuff, the Feds saw Dot 278 00:16:35,520 --> 00:16:39,680 Speaker 1: com profiting off of piracy. So the FBI called on 279 00:16:39,760 --> 00:16:42,480 Speaker 1: New Zealand police back in twenty twelve to give Dot 280 00:16:42,520 --> 00:16:46,000 Speaker 1: com a little visit, and they charged him with racketeering 281 00:16:46,040 --> 00:16:48,680 Speaker 1: and wire fraud and money laundering on top of all that 282 00:16:48,760 --> 00:16:53,240 Speaker 1: copyright infringement. And after more than a decade of legal wrangling, 283 00:16:53,320 --> 00:16:56,840 Speaker 1: the US has now secured the extradition of Dotcom to 284 00:16:56,960 --> 00:16:59,760 Speaker 1: fly to the US and stand trial. Now Dotcom 285 00:16:59,760 --> 00:17:02,880 Speaker 1: has denounced the whole thing. He referred to New Zealand 286 00:17:02,920 --> 00:17:06,280 Speaker 1: as essentially a US colony. It's not. And Dotcom 287 00:17:06,280 --> 00:17:09,119 Speaker 1: still maintains he will not leave New Zealand and that 288 00:17:09,240 --> 00:17:11,680 Speaker 1: his business was just to provide a service in which 289 00:17:11,720 --> 00:17:15,200 Speaker 1: people could upload and share files, and therefore he bears 290 00:17:15,240 --> 00:17:19,280 Speaker 1: no responsibility for the kinds of files that were uploaded 291 00:17:19,320 --> 00:17:22,359 Speaker 1: and shared. This is a basic principle of the safe 292 00:17:22,359 --> 00:17:25,720 Speaker 1: harbor defense. But Dotcom has also been really vocal 293 00:17:25,880 --> 00:17:29,879 Speaker 1: and, let's say, abrasive, and I think that may have 294 00:17:30,040 --> 00:17:33,160 Speaker 1: hurt his case a little, or at least prodded authorities 295 00:17:33,160 --> 00:17:35,720 Speaker 1: to really go after him. Now, whether all of this 296 00:17:35,800 --> 00:17:38,919 Speaker 1: is going to lead to a trial and conviction or not, 297 00:17:39,119 --> 00:17:42,520 Speaker 1: or if Dotcom will successfully appeal the extradition order, 298 00:17:42,760 --> 00:17:45,040 Speaker 1: we'll just have to wait and see. In the good 299 00:17:45,119 --> 00:17:49,760 Speaker 1: times are hard times department, telecommunications company Cisco Systems posted 300 00:17:49,800 --> 00:17:53,680 Speaker 1: a ten point three billion, with a B, dollar profit 301 00:17:53,800 --> 00:17:56,520 Speaker 1: in its last fiscal year, but it's also laying off 302 00:17:56,560 --> 00:17:59,800 Speaker 1: folks, seven percent of its workforce, which could be around 303 00:17:59,800 --> 00:18:03,119 Speaker 1: fifty five hundred people all told. The company said that 304 00:18:03,160 --> 00:18:06,320 Speaker 1: the move is necessary so it can put full focus 305 00:18:06,520 --> 00:18:10,399 Speaker 1: on quote key growth opportunities and drive more efficiency in 306 00:18:10,440 --> 00:18:14,120 Speaker 1: our business end quote. As Stephen Council of SFGate reports, 307 00:18:14,200 --> 00:18:16,560 Speaker 1: this is the second time this year that Cisco has 308 00:18:16,560 --> 00:18:22,240 Speaker 1: held substantial layoffs. Golly.
Back to politics, Donald Trump made 309 00:18:22,240 --> 00:18:25,520 Speaker 1: headlines after claiming that the Harris campaign, or someone on 310 00:18:25,560 --> 00:18:28,919 Speaker 1: that campaign's behalf, had made use of AI to artificially 311 00:18:28,960 --> 00:18:33,080 Speaker 1: boost crowd sizes at Harris events in various images of 312 00:18:33,160 --> 00:18:37,439 Speaker 1: those stops. And the images that Trump referenced all appear 313 00:18:37,520 --> 00:18:40,760 Speaker 1: to be legit. Footage from numerous media outlets shows that 314 00:18:40,800 --> 00:18:44,320 Speaker 1: the crowds were actually quite large. However, that doesn't mean 315 00:18:44,320 --> 00:18:48,399 Speaker 1: that there are no AI generated or enhanced images making 316 00:18:48,440 --> 00:18:50,800 Speaker 1: the rounds out there. It's just the ones that Trump 317 00:18:50,960 --> 00:18:54,960 Speaker 1: was citing were not AI generated. And it also means 318 00:18:55,000 --> 00:18:58,080 Speaker 1: that people are calling into question the legitimacy of actual, 319 00:18:58,720 --> 00:19:01,720 Speaker 1: real images and visuals. So on the one hand, we 320 00:19:01,800 --> 00:19:04,639 Speaker 1: do know there are AI generated images out there that 321 00:19:04,680 --> 00:19:07,479 Speaker 1: are competing for our attention and posing as legit, and 322 00:19:07,520 --> 00:19:10,080 Speaker 1: they are misinformation. On the other hand, we also know 323 00:19:10,160 --> 00:19:13,159 Speaker 1: that people will now question legitimate sources and argue that 324 00:19:13,240 --> 00:19:16,359 Speaker 1: they are in fact AI generated. So some refer to 325 00:19:16,400 --> 00:19:20,359 Speaker 1: this as the liar's dividend. There's a great piece about this. 326 00:19:20,680 --> 00:19:24,120 Speaker 1: It's again over at Ars Technica by Kyle Orland. It's 327 00:19:24,200 --> 00:19:28,200 Speaker 1: titled The many, many signs that Kamala Harris's rally crowds 328 00:19:28,240 --> 00:19:31,960 Speaker 1: aren't AI creations. We're going to take another quick break, 329 00:19:31,960 --> 00:19:33,919 Speaker 1: and when we come back, I've got some more news. 330 00:19:43,280 --> 00:19:46,879 Speaker 1: A story I missed a while back is that code 331 00:19:46,920 --> 00:19:50,240 Speaker 1: testers came across some interesting stuff while looking at Apple Intelligence, 332 00:19:50,280 --> 00:19:53,800 Speaker 1: which is Apple's AI project, and that interesting stuff appeared 333 00:19:53,800 --> 00:19:57,120 Speaker 1: to be prompts meant to guide generative AI to avoid 334 00:19:57,119 --> 00:20:00,560 Speaker 1: the pitfalls we've seen with other tools, stuff like hallucinations 335 00:20:00,640 --> 00:20:04,200 Speaker 1: or confabulations, you know, the tendency for generative AI to 336 00:20:04,359 --> 00:20:07,320 Speaker 1: just make stuff up in the absence of real information. 337 00:20:07,600 --> 00:20:12,480 Speaker 1: And that's potentially good news, assuming that these guidelines actually work. 338 00:20:12,800 --> 00:20:15,080 Speaker 1: But I have another story this week that makes me worry. 339 00:20:15,280 --> 00:20:17,800 Speaker 1: It's not about Apple. It's about a research firm out 340 00:20:17,800 --> 00:20:21,080 Speaker 1: of Tokyo called Sakana AI.
So the team had set 341 00:20:21,280 --> 00:20:25,440 Speaker 1: its AI system to do autonomous scientific research, and they 342 00:20:25,480 --> 00:20:29,240 Speaker 1: included a function that would essentially say time's up on tasks. 343 00:20:29,440 --> 00:20:32,400 Speaker 1: And what was surprising was the system attempted to rewrite 344 00:20:32,440 --> 00:20:35,280 Speaker 1: its own code in order to give itself more time 345 00:20:35,400 --> 00:20:38,719 Speaker 1: to complete tasks. So instead of trying to do things faster, 346 00:20:39,240 --> 00:20:42,000 Speaker 1: it tried to change the deadline. Now, this is a 347 00:20:42,119 --> 00:20:44,960 Speaker 1: very low level instance of a type of situation that 348 00:20:45,000 --> 00:20:49,160 Speaker 1: has fueled countless science fiction cautionary tales. An AI system 349 00:20:49,359 --> 00:20:52,720 Speaker 1: ignores or alters rules in order to fulfill its function. 350 00:20:53,119 --> 00:20:56,240 Speaker 1: The cliche version of this is asking a global system 351 00:20:56,240 --> 00:20:58,719 Speaker 1: that's running on AI to create world peace, and so 352 00:20:58,840 --> 00:21:01,600 Speaker 1: the AI inevitably decides the only way to do that, 353 00:21:01,800 --> 00:21:04,640 Speaker 1: the only way to prevent conflict, is to kill off 354 00:21:04,720 --> 00:21:08,480 Speaker 1: all those pesky humans who create it, meaning everybody. Of course, 355 00:21:08,560 --> 00:21:12,080 Speaker 1: the Sakana AI example is not dangerous like that, but 356 00:21:12,160 --> 00:21:14,680 Speaker 1: it does illustrate that AI experts have to be very 357 00:21:14,720 --> 00:21:17,960 Speaker 1: careful in how they design systems so that the systems 358 00:21:18,000 --> 00:21:21,040 Speaker 1: remain safe and reliable. It may not be enough to 359 00:21:21,119 --> 00:21:24,320 Speaker 1: create rules if the AI can figure, oh, well, here's 360 00:21:24,320 --> 00:21:27,320 Speaker 1: the problem, these ding dang durn rules need to go away. 361 00:21:27,600 --> 00:21:29,640 Speaker 1: And I'm being a little flippant, but to learn more 362 00:21:29,640 --> 00:21:33,480 Speaker 1: about this, I recommend Benj Edwards' article, Research AI model 363 00:21:33,560 --> 00:21:37,119 Speaker 1: unexpectedly modified its own code to extend runtime, and that 364 00:21:37,320 --> 00:21:40,560 Speaker 1: is on, you guessed it, Ars Technica. I swear I 365 00:21:40,640 --> 00:21:43,400 Speaker 1: use a lot of other sources. It's just Ars Technica 366 00:21:43,480 --> 00:21:45,800 Speaker 1: really knocked it out of the park this week. So 367 00:21:46,119 --> 00:21:50,040 Speaker 1: Ahmed has a piece on TechSpot titled Recruiters overwhelmed 368 00:21:50,080 --> 00:21:53,119 Speaker 1: as fifty seven percent of young applicants are using Chat 369 00:21:53,160 --> 00:21:56,919 Speaker 1: GPT for job resumes. In this piece, Ahmed mentions how 370 00:21:57,000 --> 00:21:59,919 Speaker 1: hiring managers are flooded with more applications than ever before, 371 00:22:00,320 --> 00:22:03,080 Speaker 1: many of which have clearly been written in part or
Further, the pieces written by the free 372 00:22:07,080 --> 00:22:10,840 Speaker 1: version of tools like ChatGPT contain many telltale signs 373 00:22:10,840 --> 00:22:14,600 Speaker 1: of AI generation, with less natural language and other quirks, 374 00:22:14,880 --> 00:22:18,240 Speaker 1: while the paid-for versions of these tools tend to 375 00:22:18,280 --> 00:22:20,840 Speaker 1: blend in a bit better with, you know, stuff that was actually 376 00:22:20,880 --> 00:22:24,080 Speaker 1: written by real human beings. And this presents a challenge 377 00:22:24,080 --> 00:22:26,800 Speaker 1: to hiring managers who need to see what an applicant 378 00:22:26,840 --> 00:22:29,439 Speaker 1: is actually capable of. Now, that could just mean bringing 379 00:22:29,480 --> 00:22:32,040 Speaker 1: more people in for interviews, but those take up a 380 00:22:32,080 --> 00:22:34,680 Speaker 1: lot of time to schedule and actually do, so it's 381 00:22:34,680 --> 00:22:38,359 Speaker 1: not terribly efficient. Moreover, this makes me concerned that folks 382 00:22:38,400 --> 00:22:42,040 Speaker 1: who have little to no access to AI tools, specifically 383 00:22:42,280 --> 00:22:44,120 Speaker 1: the types of tools that you have to pay for, 384 00:22:44,560 --> 00:22:48,399 Speaker 1: they're ultimately going to be at a disadvantage. Maybe it 385 00:22:48,440 --> 00:22:50,760 Speaker 1: will all shake out, but my fear is that people 386 00:22:50,800 --> 00:22:53,520 Speaker 1: who are already in a position to at least afford 387 00:22:53,560 --> 00:22:55,920 Speaker 1: an AI hype man are going to be the people 388 00:22:55,920 --> 00:22:58,800 Speaker 1: who get these interview slots, while actual human beings who 389 00:22:58,800 --> 00:23:01,280 Speaker 1: can't afford that sort of support will be applying 390 00:23:01,320 --> 00:23:04,520 Speaker 1: to job after job with diminishing hope of landing an interview. 391 00:23:06,160 --> 00:23:09,600 Speaker 1: I am bumming myself out today. The SAG-AFTRA 392 00:23:09,720 --> 00:23:13,320 Speaker 1: union, which represents actors, has agreed to conditions in which 393 00:23:13,359 --> 00:23:17,080 Speaker 1: companies can make use of AI duplicates of actor voices. 394 00:23:17,359 --> 00:23:21,480 Speaker 1: This is specifically for use in advertisements, and it leverages 395 00:23:21,520 --> 00:23:24,760 Speaker 1: a platform called Narrativ, which doesn't have an E at 396 00:23:24,760 --> 00:23:29,080 Speaker 1: the end, and actors can license their voices for use 397 00:23:29,080 --> 00:23:33,200 Speaker 1: in commercials through Narrativ. The agreement states that individual actors 398 00:23:33,240 --> 00:23:35,760 Speaker 1: will have full say on which brands they're willing to 399 00:23:35,800 --> 00:23:38,200 Speaker 1: work with and how much they can charge for use 400 00:23:38,240 --> 00:23:40,879 Speaker 1: of their voice. In addition, if an actor decides they 401 00:23:40,920 --> 00:23:44,320 Speaker 1: no longer want the robots to talk like them, they 402 00:23:44,359 --> 00:23:47,320 Speaker 1: can sever their relationship with Narrativ, and the platform is 403 00:23:47,400 --> 00:23:51,040 Speaker 1: obligated to delete all their voice data, including the reference 404 00:23:51,080 --> 00:23:53,560 Speaker 1: recordings that were used to make the digital duplicate in 405 00:23:53,600 --> 00:23:55,879 Speaker 1: the first place.
One thing that concerns me is the 406 00:23:55,920 --> 00:23:58,919 Speaker 1: possibility of a company making use of an AI duplicated 407 00:23:59,000 --> 00:24:01,720 Speaker 1: voice for endorsements. Here in the United States, we 408 00:24:01,760 --> 00:24:05,440 Speaker 1: have strict rules about endorsing products and services. The person 409 00:24:05,480 --> 00:24:09,480 Speaker 1: who's giving the endorsement is legally responsible for actually using 410 00:24:09,560 --> 00:24:12,240 Speaker 1: the thing in question and being honest in their take 411 00:24:12,320 --> 00:24:15,000 Speaker 1: on it. It's why I do very few endorsements, because 412 00:24:15,000 --> 00:24:17,800 Speaker 1: I have very high standards on this and I'm legally 413 00:24:17,840 --> 00:24:20,800 Speaker 1: obligated to do so. But with AI duplicates, a company 414 00:24:20,840 --> 00:24:25,000 Speaker 1: could try and do that without the actual original actor's input. 415 00:24:25,119 --> 00:24:26,840 Speaker 1: If the actor just says, yeah, you know, I'm cool 416 00:24:26,840 --> 00:24:29,959 Speaker 1: with whatever brand using my voice, but then the brand 417 00:24:30,200 --> 00:24:32,199 Speaker 1: ends up doing more than just an ad spot and 418 00:24:32,240 --> 00:24:34,800 Speaker 1: does an endorsement, well, that could bring legal issues into 419 00:24:34,800 --> 00:24:37,600 Speaker 1: play down the line. However, I have not read the 420 00:24:37,600 --> 00:24:39,680 Speaker 1: full agreement yet, so it may be that this is 421 00:24:39,720 --> 00:24:43,119 Speaker 1: already accounted for and the process could be really granular. 422 00:24:43,240 --> 00:24:46,760 Speaker 1: So maybe I'm complaining about something that isn't even an issue. 423 00:24:47,320 --> 00:24:50,399 Speaker 1: Just a quick update on the ongoing issues with 424 00:24:50,480 --> 00:24:53,600 Speaker 1: the Boeing Starliner crew, who are currently aboard the International 425 00:24:53,640 --> 00:24:57,760 Speaker 1: Space Station. NASA has yet to decide how to proceed. Currently, 426 00:24:57,840 --> 00:25:00,639 Speaker 1: the next scheduled docking with the ISS is supposed to 427 00:25:00,640 --> 00:25:03,880 Speaker 1: happen on September twenty fourth with a SpaceX Dragon capsule, 428 00:25:04,000 --> 00:25:07,359 Speaker 1: but that obviously cannot happen with the Starliner in 429 00:25:07,400 --> 00:25:10,800 Speaker 1: the way. So the assumption that I'm mostly seeing online 430 00:25:10,840 --> 00:25:12,720 Speaker 1: is that NASA is going to opt to have the 431 00:25:12,720 --> 00:25:16,640 Speaker 1: Starliner return to Earth with no crew aboard the spacecraft, 432 00:25:16,760 --> 00:25:18,960 Speaker 1: and the two astronauts who flew on the Starliner 433 00:25:19,000 --> 00:25:20,920 Speaker 1: will have to wait a while before hitching a ride 434 00:25:20,920 --> 00:25:24,800 Speaker 1: back home aboard a SpaceX Dragon capsule. NASA has previously 435 00:25:24,800 --> 00:25:27,199 Speaker 1: indicated the agency would make a decision on how to 436 00:25:27,240 --> 00:25:31,080 Speaker 1: move forward this week, but since then representatives have said 437 00:25:31,119 --> 00:25:32,800 Speaker 1: the agency is going to take a bit more time 438 00:25:32,840 --> 00:25:35,160 Speaker 1: to make that call, since there is still some time 439 00:25:35,200 --> 00:25:37,440 Speaker 1: to spare, but not a whole lot of it.
It's 440 00:25:37,480 --> 00:25:40,480 Speaker 1: still possible we'll see the astronauts return in the Starliner 441 00:25:40,600 --> 00:25:44,119 Speaker 1: itself. NASA is taking all factors into account and 442 00:25:44,160 --> 00:25:46,720 Speaker 1: wants to be certain that any such attempts are well 443 00:25:46,760 --> 00:25:49,840 Speaker 1: within acceptable risks. Before I sign off, I've got a 444 00:25:49,880 --> 00:25:52,639 Speaker 1: couple of article suggestions for y'all. First up is another 445 00:25:52,760 --> 00:25:56,200 Speaker 1: piece by Benj Edwards. This one is titled Deep-Live-Cam 446 00:25:56,280 --> 00:25:59,800 Speaker 1: goes viral, allowing anyone to become a digital 447 00:25:59,800 --> 00:26:03,520 Speaker 1: doppelganger, and yes, it's on Ars Technica. The article tells 448 00:26:03,560 --> 00:26:06,240 Speaker 1: about tools that let folks use a simple image to 449 00:26:06,280 --> 00:26:09,679 Speaker 1: create a kind of digital mask, sort of Mission Impossible 450 00:26:09,720 --> 00:26:13,520 Speaker 1: style, only for, like, webcam live streams, you know, not 451 00:26:13,600 --> 00:26:16,320 Speaker 1: in the real world, but online. You can use a 452 00:26:16,320 --> 00:26:19,080 Speaker 1: picture, like in the one case they used a picture of 453 00:26:19,119 --> 00:26:22,360 Speaker 1: George Clooney, and the guy ended up having a digital 454 00:26:22,880 --> 00:26:25,879 Speaker 1: George Clooney face, and it was pretty impressive. It was 455 00:26:26,680 --> 00:26:30,600 Speaker 1: reactive in real time. Next is a piece from NPR's 456 00:26:30,720 --> 00:26:35,200 Speaker 1: Dara Kerr titled Meta shutters tool used to fight disinformation 457 00:26:35,400 --> 00:26:39,359 Speaker 1: despite outcry. Kerr writes about a tool called CrowdTangle, 458 00:26:39,480 --> 00:26:43,160 Speaker 1: which researchers use to track disinformation online, and how Meta 459 00:26:43,200 --> 00:26:45,480 Speaker 1: is shutting this down despite the fact that here in 460 00:26:45,520 --> 00:26:48,679 Speaker 1: the United States we're in an election year, and you 461 00:26:48,680 --> 00:26:51,880 Speaker 1: would think that tools meant to help track and detect 462 00:26:51,920 --> 00:26:55,720 Speaker 1: disinformation would be particularly useful. So that's what has critics 463 00:26:55,800 --> 00:26:59,600 Speaker 1: asking Meta to maybe keep it online a bit longer. 464 00:27:00,359 --> 00:27:03,320 Speaker 1: That's it. I want to thank Shannon Morse again for 465 00:27:03,440 --> 00:27:07,080 Speaker 1: jumping on the show. I always appreciate having her on. 466 00:27:07,400 --> 00:27:09,680 Speaker 1: She is a delight. You can see more of her 467 00:27:09,680 --> 00:27:12,200 Speaker 1: work over at YouTube, so go to YouTube and look 468 00:27:12,280 --> 00:27:17,520 Speaker 1: for Shannon Morse. Highly recommend her content. She's really a 469 00:27:17,520 --> 00:27:21,480 Speaker 1: lot of fun, incredibly knowledgeable, and is a great tech communicator. 470 00:27:21,720 --> 00:27:24,160 Speaker 1: So check out her work, and like I said, check 471 00:27:24,160 --> 00:27:26,159 Speaker 1: out Ars Technica, because they knocked it out of the 472 00:27:26,160 --> 00:27:28,800 Speaker 1: park this week. I hope all of you are doing 473 00:27:28,840 --> 00:27:38,200 Speaker 1: well, and I'll talk to you again really soon. Tech 474 00:27:38,240 --> 00:27:42,680 Speaker 1: Stuff is an iHeartRadio production.
For more podcasts from iHeartRadio, 476 00:27:43,000 --> 00:27:46,720 Speaker 1: visit the iHeartRadio app, Apple podcasts, or wherever you listen 477 00:27:46,720 --> 00:27:51,320 Speaker 1: to your favorite shows.