Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news from March twenty third, twenty twenty three. Oh wow, three two three, two two three. Sorry, got distracted. Okay, let's jump start this episode with more AI news stories, the story of twenty twenty three. So first up, Avram Piltch of Tom's Hardware wrote an article titled Google Bard plagiarized our article, then apologized when caught. So Google Bard is Google's version of, you know, the chatbot features that are built on top of a large language model. It's similar in many ways to ChatGPT, which is incorporated into stuff like Bing. Bard is currently in beta testing, so it's not openly rolled out to everyone. Piltch's article explains that he was testing Bard, asking it to compare two different processors against one another and to recommend which of the two processors would be fastest.
And Bard generated a response, and Piltch saw that some facts looked awfully familiar. It ended up being because the information that Bard was referencing originated in a benchmark test article that was also published on Tom's Hardware just a couple of days earlier. So Piltch asked Bard for the source of the data: where did you get this information? That's when Bard explained it had pulled the info from the Tom's Hardware article. And at that point Piltch essentially asked Bard if that perhaps constituted plagiarism, and Bard kind of said yeah. And this is just one of the concerns folks have about chatbot AI tools like Bard and ChatGPT: that they could pull data from sources without giving credit, and that both denies the original creator of that content any recognition or ability to monetize their work, and it also makes it difficult to fact check the answers. You know, the information has to be coming from somewhere. These AI chatbots are assembling answers based off available information. They're not just inventing it, they're not just lying.
In other words, though, you don't necessarily know where they're pulling that information from, and that means not only that it may or may not be trustworthy, but also that someone somewhere is getting the short end of the stick. They generated that content, and yet, you know, they're not being compensated for it, right? So it reminds me of how content creators were really worried when Google started to include short descriptions on a search result page that could potentially negate the need to click through to an actual page on that topic, thus denying those pages views and ad revenue. Why would you write stuff for the web if that stuff ends up being appropriated by some AI chatbot and then regurgitated to users who never see your actual article, and thus the website never gets any visitors, and eventually the website stops employing you because they can't afford to? Like, it just becomes this kind of self defeating cycle. But anyway, go check out the full article on Tom's Hardware.
Again, it's called Google Bard plagiarized our article, then apologized when caught, because I actually do believe in sending people to the proper sources. James Vincent of The Verge has a different warning relating to AI chat. I can't actually give you the full title of the article, because it ends with some profanity, but I'll give you most of the title. It's called Google and Microsoft's chatbots are already citing one another in a misinformation, insert profanity here. It's a kind of storm, I'll say. In the article, Vincent mentions a peculiar series of responses: when one asked Microsoft Bing if Google had shut down its Bard chatbot, then Bing would say yes. Essentially, Bing would say yes, Bard has been shut down by Google. Now, as evidence, Bing was citing a tweet in which someone said that they had asked Bard when Bard would get shut down, and Bard claimed it already had been. So where was Bard getting this information? Because clearly that's not true, right? Bard was answering the person, so it could not have been shut down.
Well, Bard was pulling its information from a joke that someone left in the comment section of a Hacker News piece, and then someone else had taken that joke and generated a ChatGPT article around it. Like, they actually got ChatGPT to write an article about Bard being shut down by Google. Again, this is all a joke at this point, but Bard cites this joke as if it's an actual news item. This tweet talks about how Bard said that it was already shut down, and then Bing says yeah, Bard's been shut down, because it's citing the tweet. So again, this really makes a very salient point, you know: this goofy little tweet ended up being used as if it were reliable, hard news information, and AI chatbots aren't capable of telling the truth from humor or lies or satire. That you could end up asking these chatbots a question.
It is entirely possible that they might reference a site like The Onion, for example, and The Onion is a satire humor website. Like, it's meant to write articles that are not true for the purposes of humor, and the answer you get from the AI would probably be interesting, but it would not be reliable. Goodness knows there are already tons of sites out there that claim to be satire. Usually this claim is hidden in a little about page somewhere, which makes it really hard to tell at first. Like, I've seen so many. Not as many these days as maybe five years ago, but man, I used to come across them all the time. And in reality, these websites only existed to publish fake news that would then go viral on various social platforms. So if you dug down deep enough, you would find some disclaimer on the website somewhere saying this is meant for entertainment and satire. It wasn't satire, it was just lies, because it wasn't humorous at all. It wasn't presented to be humor or to give any insight. It was just meant to go viral.
Well, AI chatbots don't necessarily know that that kind of content isn't reliable, and could present it as such. So yeah, another example of how chatbots can give us some misleading information. Not all uses of AI are bad, of course. Ubisoft, the video game company, released a video showing off an AI tool called Ghostwriter. Writer, not Ghost Rider as in the biker with a flaming skull for a head. So Ghostwriter aims to make the tedious task of generating background chatter for NPCs in games an automated task. So if you've played any open world style games, you're probably familiar with hearing NPCs holding conversations around you, or maybe even commenting on your appearance as you move into view. Even if you haven't played a lot of games, chances are you've heard people reference the iconic line from Skyrim: I used to be an adventurer like you, then I took an arrow to the knee. Well, someone has to write all these little lines of NPC dialogue like that.
Someone's job is to flesh out a world by writing all these possible lines that people could say in the background, some that players may never even consciously register. It's just chatter in the background. So it can get pretty dull, particularly if you're trying to work in enough variety to make the world feel like it's inhabited by actual people, and you don't have everyone just saying rhubarb, rhubarb in the background. So what Ghostwriter does is help generate variations of dialogue options. So you can kind of put in a line, and it will start to use tools to express the same thought but in different ways, and you can actually go through and edit the responses so that, if there are any grammatical mistakes or anything like that, you can fix them. You can accept or reject suggestions, and over time, Ghostwriter gets better at learning what it is you're trying to do, and it starts to give you better suggestions the next time you use it to generate stuff.
This gives writers the chance to flesh out their game much more quickly, and dedicate more of their efforts and their brain power and creativity to writing the stuff that really matters and helps drive the game's narrative forward. I think it's pretty cool, though. I kind of want to have a game now where your character passes into a world that's just populated by NPCs from all sorts of different games. You know, kind of like if Central Casting had been in charge of everything and they just grabbed anyone they could and shoved them in, and it's all random NPCs. So you've got fantasy and, you know, modern day crime NPCs, all sorts of stuff, all just intermingling and trying to have conversations, and you start to hear these iconic NPC lines from all the different famous games out there as you move through the area. Someone make that for me. Okay, I've got a lot more stories to cover. Before we get to those, let's take a quick break. Okay, we're back.
Let's shift to talk about TikTok. So Shou Zi Chew, the CEO of TikTok, submitted testimony in advance of his appearance before Congress, which is happening as I record this episode. He's currently in front of Congress to answer questions about TikTok as the US government ramps up resistance to the app and the company. So in the testimony he submitted, Chew claims that the average US user of TikTok is, quote, an adult well past college, end quote. This was reported in Insider. The Insider piece also cites a sales presentation within TikTok that leaked in twenty twenty one and said that around seventeen percent of users were between the ages of thirteen and seventeen, and forty two percent were between eighteen and twenty four. And you might think, well, why does this even matter? Well, some of the arguments that politicians have made against TikTok focus on how the app can promote harmful messages, particularly to younger users, and that this can range from misinformation to glorifying self harm to encouraging people to participate in dangerous viral challenges.
Now, I suppose if TikTok were to say, but the people who use our app are actually older than that, that it's not that many kids, it's mostly adults, that, I suppose, removes a tiny bit of the oomph behind the argument that TikTok is bad for kids. But I mean, if it is true, then that is super bad news for Meta, because for the last couple of years, Meta has looked at platforms like TikTok, and also at others like Snapchat, as dangerous competition. That's where the young people were going instead of to Meta. And meanwhile, Meta's user base is aging, but there are fewer young people coming in, which is bad for the long term success of the platform. But if it turns out that TikTok is not the place where young people are going, then who the heck is Meta gonna copy in order to try and get those users? Anyway, I'm certain Congress will have plenty of other concerns they want addressed. In fact, I know they do, because I dipped in just briefly to watch a little bit of the hearing.
They really want to know more about things that, honestly, TikTok has tried to address multiple times in the past, namely the company's relationship to its Chinese parent company ByteDance, and then ByteDance's obligations to the Chinese government, and whether or not TikTok is actually keeping safe, you know, private information and that kind of stuff, or if it's just acting as a data siphon for China. Again, TikTok reps have repeatedly said that they have taken steps to prevent that kind of stuff from happening, but those excuses, or reasons, however you want to look at it, continue to raise skepticism in US government quarters. Yeah, it's a complicated thing, and I'm sure I'll talk more about this, probably next week, once we have heard all the outcomes of this hearing. ABC News reports that the US Securities and Exchange Commission, or SEC, is going after Justin Sun, a cryptocurrency company founder who the SEC claims was transferring large amounts of specific cryptocurrency tokens back and forth between two different wallets that he owned.
So they were both his wallets, and he was just transferring large amounts back and forth again and again. So why would he do that? Well, according to the SEC, it was an effort to inflate the trading volume of the tokens, but to do so artificially. So, in other words, someone from the outside looking in says, oh wow, these tokens are being traded back and forth a lot, this is actually being actively used as a currency. That helps stabilize the value of the tokens, because people have to have confidence in a token for it to hold onto its value, and it builds that confidence if the token is in fact being actively used and not just hoarded or sold off. And it also gave Sun the ability to try and offload stuff without it impacting the value of the tokens themselves. So that's what the SEC was saying, that he was manipulating the system in order to profit off of it. He was fixing the game, in other words. There's another charge as well: that he enlisted the help of celebrities to endorse these various cryptocurrency tokens, but there was no attempt to divulge the fact that they were being paid to do this.
So, in other words, the celebrities were coming across as if they had just personally researched this cryptocurrency, and that they were engaged with it on their own, and they were promoting it because they thought it was really cool, as opposed to, hey, I've partnered with such and such, and they make this thing, and you should check it out, right? So there are very specific rules that are in place for endorsements. You have to divulge the relationship you have with a sponsor. If you are an endorser, if you're being paid to endorse something, you have to make it clear to people. Otherwise it's considered a type of false advertising, because it gives the appearance that you are independently excited about this product, one you might not have even heard of had it not been for this relationship. So that's a big no no here in the US: not acknowledging that there was payment exchanged for that endorsement. Some of the celebrities named in the operation have already agreed to hand back the money that they had been paid to endorse the crypto tokens in the first place.
There are a couple of holdouts, who I expect will discover the government would very much like to have a talk with them. I haven't covered stories about tech companies cracking down on remote work for a while, largely because a lot of the big companies have essentially put in tough restrictions or have just outright denied work from home approaches. But Platformer's Zoe Schiffer reports that Apple is taking steps to keep tabs on employees, to make sure they come in at least three times per week, by monitoring their employee badge activity, so like a security badge, when you tap in or, in some cases, out of a building. I don't know if Apple requires you to tap in and out. I remember back in the day Discovery did, which became a big deal for us when we would visit Discovery back when I was part of HowStuffWorks. HowStuffWorks got acquired by Discovery for a while. HowStuffWorks still had security, but a less thorough security approach, where, let's say I was arriving at the office with my arch nemesis Ben Bowlin, I might tap in and then both of us just walk in.
At Discovery, each person was required to tap in in sequence. Like, it didn't matter if you all arrived at the building in a big group, you each had to tap in. I assume Apple is the same way. So now, according to Schiffer, Apple is tracking that data, and if someone is not tapping in and out three times per week, they get a warning, and if they do it again, they get an escalating warning, which presumably ultimately leads to some form of reprisal. So that's fun. Nothing like being monitored at work. It's the best, really helps drive up productivity. Now, I will say that my guess is that this comes in the current work environment, where you have so many big companies laying off thousands of employees. I mean, I think even Indeed, a company that's meant to help people find the right kind of staff, they laid off a couple of thousand people recently, like fifteen percent of their staff.
When that's the kind of lay of the land, I imagine there are a lot of employees who don't feel comfortable advocating for remote work solutions, and so they will do their best to conform with these kinds of policies where you have to come in a certain number of times per week. But yeah, it's not a good look. But again, the work environment being what it is, I don't know that people feel like they have a lot of alternatives. The United States Federal Trade Commission, or FTC, is really stepping up recently. The reason I say that is that The Verge reports that the FTC is saying subscriptions should be just as easy to cancel as they are to initiate. Now, I'm sure a lot of you have encountered the experience of needing to cancel a subscription service. Maybe it's your ISP, maybe it's a phone plan, maybe it's a streaming subscription, or, heaven help you, it's a gym membership, and you've probably encountered a situation where you had to go through like a wild goose chase just to get out of this stupid subscription.
I'm actually reminded of when Ryan Block tried to cancel his service with Comcast, and he was put through a ridiculous routine, which he recorded and then later shared online back in twenty fourteen. And just as a personal anecdote, when that story broke, I read about it. They didn't initially identify it as Ryan Block, so I read it and was like, oh man, this poor guy, he was really just run up the wall by the sales representative, I can't believe it. And then when I found out who it was, I laughed and laughed, because I don't know Ryan personally, but I've known his wife for like a decade. So when I found out it was happening to someone, you know, that I kind of know, it got particularly absurd to me. Anyway, the FTC wants that kind of stuff to be buried in the past, and for companies to adopt a click to cancel policy that makes getting out of a subscription way less of a hassle. It's supposed to be just as easy to end a subscription as it is to start one.
And since I don't think companies are going 326 00:20:41,920 --> 00:20:45,080 Speaker 1: to want to make it more challenging to sign up 327 00:20:45,320 --> 00:20:47,439 Speaker 1: for a service. You know, the harder it is for 328 00:20:47,480 --> 00:20:49,359 Speaker 1: you to join a service, the less likely you're going 329 00:20:49,400 --> 00:20:51,960 Speaker 1: to do it. Like you might be convinced at first, 330 00:20:51,960 --> 00:20:54,840 Speaker 1: Oh yeah, no, that doesn't sound bad, I'm in. But 331 00:20:54,920 --> 00:20:57,240 Speaker 1: if you start to see there's a curve, like a 332 00:20:57,280 --> 00:21:01,199 Speaker 1: barrier to entry, you might bounce. Those people don't want that. Like, 333 00:21:01,240 --> 00:21:03,320 Speaker 1: they want you to be as committed as you possibly 334 00:21:03,320 --> 00:21:05,600 Speaker 1: can be to the point where you are actually hooked in. 335 00:21:06,200 --> 00:21:10,640 Speaker 1: So they're not going to make signing up more complicated, 336 00:21:10,720 --> 00:21:12,400 Speaker 1: but it does mean that they have to make it 337 00:21:12,800 --> 00:21:16,040 Speaker 1: less complicated to cancel out of something. It would also 338 00:21:16,040 --> 00:21:18,679 Speaker 1: mean that companies that use various incentives to try and 339 00:21:18,760 --> 00:21:22,600 Speaker 1: keep customers on board would have to offer some sort 340 00:21:22,640 --> 00:21:25,760 Speaker 1: of total opt out pathway for people who just don't 341 00:21:25,760 --> 00:21:27,840 Speaker 1: have the time to listen to that kind of pitch. 342 00:21:28,280 --> 00:21:30,439 Speaker 1: So again, if you ever tried to cancel out of 343 00:21:30,440 --> 00:21:33,560 Speaker 1: a phone plan, you probably heard, well, you know, if 344 00:21:33,600 --> 00:21:36,000 Speaker 1: you decide to re-sign with us, we'll give you blah 345 00:21:36,040 --> 00:21:39,480 Speaker 1: blah blah blah blah blah blah. This new set of rules 346 00:21:39,480 --> 00:21:42,080 Speaker 1: says no, no, no. You can just say right off 347 00:21:42,119 --> 00:21:44,520 Speaker 1: the top, I'm not interested in hearing any other offers. 348 00:21:44,560 --> 00:21:48,720 Speaker 1: I just want out. However, this particular set of rules 349 00:21:48,720 --> 00:21:53,679 Speaker 1: would not apply to non-commercial services, so stuff like 350 00:21:54,240 --> 00:21:58,680 Speaker 1: charitable donations or political donations, those would not necessarily get 351 00:21:58,680 --> 00:22:02,080 Speaker 1: covered by these rules. The proposal received a three to 352 00:22:02,240 --> 00:22:05,160 Speaker 1: one vote in the FTC. The one person who voted 353 00:22:05,200 --> 00:22:08,600 Speaker 1: against it is the lone Republican member of the FTC board. 354 00:22:09,359 --> 00:22:12,159 Speaker 1: But it's still going to be open for public comment, 355 00:22:12,359 --> 00:22:15,479 Speaker 1: so people can actually weigh in on what they think first, 356 00:22:16,080 --> 00:22:20,919 Speaker 1: and that'll all happen before the FTC can adopt the rules, 357 00:22:21,440 --> 00:22:25,560 Speaker 1: which they may end up changing before they adopt them. Also, 358 00:22:25,600 --> 00:22:29,119 Speaker 1: the FTC itself would not actually be taking action against 359 00:22:29,119 --> 00:22:31,879 Speaker 1: companies that failed to comply with these rules. Instead, the 360 00:22:31,960 --> 00:22:35,919 Speaker 1: rules would give regulators the ability to enforce them. So essentially, 361 00:22:35,920 --> 00:22:40,919 Speaker 1: it's saying regulators who are already in charge of enforcing 362 00:22:41,000 --> 00:22:44,159 Speaker 1: other rules for companies would just have new rules that 363 00:22:44,200 --> 00:22:48,400 Speaker 1: they could continue to enforce.
So pretty good news if 364 00:22:48,440 --> 00:22:51,520 Speaker 1: you have ever suffered the experience of having to try 365 00:22:51,560 --> 00:22:53,640 Speaker 1: and cancel out of something that was designed to make 366 00:22:53,680 --> 00:22:57,520 Speaker 1: it very hard to do that. Okay, got a few 367 00:22:57,560 --> 00:22:59,760 Speaker 1: more stories to go, but before we get to that, 368 00:23:00,160 --> 00:23:11,480 Speaker 1: take another quick break. Before the break, we were talking 369 00:23:11,560 --> 00:23:15,280 Speaker 1: about the FTC, the Federal Trade Commission. Now let's talk 370 00:23:15,320 --> 00:23:21,040 Speaker 1: about the FCC, or Federal Communications Commission. It is taking 371 00:23:21,119 --> 00:23:25,439 Speaker 1: aim at spam text messages the same way that the 372 00:23:25,480 --> 00:23:29,600 Speaker 1: agency targeted robocalls a couple of years ago. So if 373 00:23:29,600 --> 00:23:32,200 Speaker 1: you're in the US, you might remember that the FCC 374 00:23:32,400 --> 00:23:37,320 Speaker 1: passed rules for telecom companies to shut down robocalls 375 00:23:37,359 --> 00:23:41,480 Speaker 1: whenever possible. It actually led to one network getting shut 376 00:23:41,480 --> 00:23:44,640 Speaker 1: out of the American telecommunications infrastructure, where they weren't able 377 00:23:44,680 --> 00:23:49,040 Speaker 1: to interface with any other telephone network because they were 378 00:23:49,160 --> 00:23:53,920 Speaker 1: failing to shut those down. So that was a fair success. 379 00:23:53,960 --> 00:23:58,320 Speaker 1: I mean, I still get robocalls, so I don't think 380 00:23:58,400 --> 00:24:01,960 Speaker 1: it was a total success, but it definitely has cut 381 00:24:02,000 --> 00:24:06,840 Speaker 1: back on that activity. Now they want to do the 382 00:24:06,880 --> 00:24:09,920 Speaker 1: same thing but for spam text messages.
So the new 383 00:24:10,000 --> 00:24:13,000 Speaker 1: rule says that phone companies will have to block text 384 00:24:13,040 --> 00:24:19,639 Speaker 1: messages originating from quote invalid, unallocated or unused end quote 385 00:24:19,920 --> 00:24:22,640 Speaker 1: phone numbers. So if it's a phone number that has 386 00:24:22,680 --> 00:24:26,919 Speaker 1: been associated with spam, then the phone company should just 387 00:24:27,080 --> 00:24:31,600 Speaker 1: block those text messages as a rule of thumb. The 388 00:24:31,680 --> 00:24:37,040 Speaker 1: vote passed unanimously within the FCC. So that makes sense, 389 00:24:37,080 --> 00:24:39,639 Speaker 1: because there have been a lot of reports of fraud 390 00:24:39,680 --> 00:24:42,280 Speaker 1: connected to spammy text messages, which have been on the 391 00:24:42,359 --> 00:24:44,399 Speaker 1: rise in recent years, and so there's a real need 392 00:24:44,960 --> 00:24:50,320 Speaker 1: to protect the public from scam artists and people who 393 00:24:50,359 --> 00:24:53,960 Speaker 1: are, you know, trying to phish for data, that kind 394 00:24:53,960 --> 00:24:56,760 Speaker 1: of thing. And you know, some people are really really 395 00:24:56,840 --> 00:25:01,160 Speaker 1: vulnerable to that; particularly the older generation tends to be 396 00:25:01,480 --> 00:25:06,360 Speaker 1: more susceptible to those kinds of attacks. So yeah, I'm 397 00:25:06,400 --> 00:25:11,400 Speaker 1: glad to see this happening as well. I honestly remember 398 00:25:11,440 --> 00:25:14,720 Speaker 1: a time when I would get a phone call that, 399 00:25:15,240 --> 00:25:17,479 Speaker 1: you know, obviously I wouldn't answer.
I would go 400 00:25:17,520 --> 00:25:20,840 Speaker 1: online and try and search up the number through a 401 00:25:20,880 --> 00:25:24,360 Speaker 1: reverse search, and there were a lot of resources out 402 00:25:24,400 --> 00:25:26,359 Speaker 1: there that would track whether or not something was a 403 00:25:26,400 --> 00:25:29,680 Speaker 1: spam call. For whatever reason, these days, I can't easily 404 00:25:29,720 --> 00:25:33,720 Speaker 1: find those resources anymore. I don't know if they just stopped, 405 00:25:34,080 --> 00:25:36,199 Speaker 1: or if maybe they're just buried in search results. I 406 00:25:36,200 --> 00:25:40,679 Speaker 1: haven't really dug deep into it, but it got to 407 00:25:40,680 --> 00:25:44,040 Speaker 1: a point where I was getting stressed that I 408 00:25:44,080 --> 00:25:48,240 Speaker 1: couldn't easily see if something that was coming in was 409 00:25:48,400 --> 00:25:52,280 Speaker 1: spam or not. So knowing that there are steps being 410 00:25:52,280 --> 00:25:56,480 Speaker 1: taken to at least shut down the known perpetrators of spam, 411 00:25:57,560 --> 00:26:01,240 Speaker 1: I find that refreshing, because goodness knows I just want 412 00:26:01,359 --> 00:26:04,200 Speaker 1: my device to be useful, and if I'm 413 00:26:04,240 --> 00:26:07,240 Speaker 1: discouraged from using it because of all the robocalls and spam, 414 00:26:08,160 --> 00:26:12,280 Speaker 1: then I just become a hermit, which, you know, some 415 00:26:12,359 --> 00:26:18,560 Speaker 1: days it's an attractive thought. Okay. Scharon Harding at Ars 416 00:26:18,600 --> 00:26:24,240 Speaker 1: Technica has a terrifying article titled Journalist plugs in unknown 417 00:26:24,480 --> 00:26:28,399 Speaker 1: USB drive mailed to him, it exploded in his face, 418 00:26:29,320 --> 00:26:32,080 Speaker 1: and yeah, the headline is scary, but it actually kind 419 00:26:32,080 --> 00:26:36,600 Speaker 1: of gets worse.
So five journalists from Ecuador received USB 420 00:26:36,760 --> 00:26:40,760 Speaker 1: drives in the mail sent from another part of Ecuador, 421 00:26:40,880 --> 00:26:43,840 Speaker 1: so it was within the country that they received these. 422 00:26:44,400 --> 00:26:48,080 Speaker 1: And one of these journalists, a guy named Lenin Artieda, 423 00:26:48,200 --> 00:26:51,760 Speaker 1: inserted the drive into a computer. You know, he plugged 424 00:26:51,760 --> 00:26:55,240 Speaker 1: the USB drive into a laptop or computer, and then 425 00:26:55,280 --> 00:26:59,760 Speaker 1: the USB drive exploded. There's a little capsule sized amount 426 00:26:59,760 --> 00:27:04,960 Speaker 1: of explosive in there that, once it received enough voltage, 427 00:27:05,520 --> 00:27:10,680 Speaker 1: it detonated. Fortunately the injuries that Artieda received were not serious, 428 00:27:11,040 --> 00:27:14,840 Speaker 1: but I'm sure it was a terrifying experience. So in 429 00:27:14,880 --> 00:27:18,119 Speaker 1: other cases, people received these drives, but they hooked them 430 00:27:18,200 --> 00:27:22,600 Speaker 1: up through adapters that did not provide the voltage needed 431 00:27:22,640 --> 00:27:25,960 Speaker 1: to detonate the device, and they discovered that, in fact, 432 00:27:25,960 --> 00:27:30,520 Speaker 1: there were other explosive devices. As I said, it's been 433 00:27:30,560 --> 00:27:32,720 Speaker 1: five people so far, at least according to the Ars 434 00:27:32,720 --> 00:27:35,359 Speaker 1: Technica piece. And you might wonder, well, why the heck 435 00:27:36,080 --> 00:27:40,639 Speaker 1: are journalists in Ecuador receiving explosive devices? And details are 436 00:27:40,720 --> 00:27:45,320 Speaker 1: really scarce on that. Like, there's a lot of speculation 437 00:27:45,560 --> 00:27:49,800 Speaker 1: about what could be the root reason for this.
It 438 00:27:49,920 --> 00:27:53,280 Speaker 1: seems reasonable to conclude that this is an attempt to 439 00:27:53,440 --> 00:27:58,640 Speaker 1: intimidate and silence journalists. But honestly, outside of Ecuador, there's 440 00:27:58,640 --> 00:28:01,840 Speaker 1: not a whole lot of information about who is responsible 441 00:28:01,840 --> 00:28:04,680 Speaker 1: for this or what the purpose is. Like, what are 442 00:28:04,720 --> 00:28:08,680 Speaker 1: they being silenced about? I am not sure, and neither is 443 00:28:08,760 --> 00:28:12,359 Speaker 1: Harding at Ars Technica. But Harding does remind us that 444 00:28:12,400 --> 00:28:15,320 Speaker 1: we should never plug in an unknown USB device to 445 00:28:15,359 --> 00:28:20,480 Speaker 1: a computer. If you happen across a USB device, don't 446 00:28:20,560 --> 00:28:23,560 Speaker 1: attach that to a computer system. You never know what's 447 00:28:23,600 --> 00:28:26,600 Speaker 1: on it. Now, normally I would say don't do it 448 00:28:26,640 --> 00:28:29,480 Speaker 1: because there could be malware on that USB drive and 449 00:28:29,560 --> 00:28:33,480 Speaker 1: you might introduce that malware. You might inject it into 450 00:28:33,520 --> 00:28:37,720 Speaker 1: your computer and overall into like a network system. Heck, 451 00:28:38,200 --> 00:28:44,560 Speaker 1: that's how Stuxnet infected centrifuges in nuclear facilities in Iran. 452 00:28:44,680 --> 00:28:50,760 Speaker 1: Stuxnet being some malware that presumably was developed by 453 00:28:51,440 --> 00:28:54,680 Speaker 1: Israel and possibly the United States, probably some sort of 454 00:28:54,720 --> 00:29:01,240 Speaker 1: combination there, that then got introduced to otherwise air gapped systems 455 00:29:01,960 --> 00:29:07,400 Speaker 1: within Iran.
Not easy to do unless you're able to 456 00:29:08,120 --> 00:29:10,640 Speaker 1: hide it on, say, a USB drive and convince someone 457 00:29:10,680 --> 00:29:14,200 Speaker 1: to connect that drive to the otherwise air gapped systems. 458 00:29:14,600 --> 00:29:17,320 Speaker 1: So yeah, that's one reason you would never want to 459 00:29:17,320 --> 00:29:21,360 Speaker 1: plug a USB drive that came from some unknown 460 00:29:21,400 --> 00:29:24,480 Speaker 1: source into your computer. But another is that it might 461 00:29:24,560 --> 00:29:28,959 Speaker 1: just explode. Finally, last night at Cape Canaveral, an aerospace 462 00:29:29,000 --> 00:29:33,240 Speaker 1: startup called Relativity Space became the first company to launch 463 00:29:33,360 --> 00:29:37,520 Speaker 1: a three D printed rocket successfully. Now that's the good news. 464 00:29:37,840 --> 00:29:42,560 Speaker 1: The bad news is that the rocket, designated Terran one, 465 00:29:42,800 --> 00:29:46,800 Speaker 1: failed during its second stage separation, so it wasn't able 466 00:29:46,840 --> 00:29:51,480 Speaker 1: to achieve low Earth orbit. The three D printed components 467 00:29:51,520 --> 00:29:54,880 Speaker 1: of this rocket made up about eighty five percent of 468 00:29:54,920 --> 00:29:58,600 Speaker 1: the launch vehicle. So this isn't like someone just hit 469 00:29:58,680 --> 00:30:02,840 Speaker 1: print on their laptop and then many, many hours later 470 00:30:03,240 --> 00:30:07,520 Speaker 1: there was a fully built rocket standing there. But instead 471 00:30:07,560 --> 00:30:11,000 Speaker 1: it was lots of components that were three 472 00:30:11,120 --> 00:30:14,040 Speaker 1: D printed, including metal components that were three D printed, 473 00:30:14,360 --> 00:30:19,160 Speaker 1: to build this rocket. This approach could really bring down 474 00:30:19,240 --> 00:30:25,720 Speaker 1: launch costs.
It ends up simplifying the design and manufacturing 475 00:30:25,720 --> 00:30:29,800 Speaker 1: of rockets, which could really make it more cost effective 476 00:30:29,840 --> 00:30:33,080 Speaker 1: to send stuff to space, which is pretty cool. And 477 00:30:33,160 --> 00:30:36,240 Speaker 1: the fact that the rocket held together for the launch 478 00:30:36,520 --> 00:30:39,800 Speaker 1: is by itself a great achievement. Sure, the second stage 479 00:30:39,840 --> 00:30:44,160 Speaker 1: separation did not go off as planned, which is unfortunate, 480 00:30:44,840 --> 00:30:48,959 Speaker 1: but as we have said many many times on this show, 481 00:30:49,280 --> 00:30:53,560 Speaker 1: rocket science is really hard, y'all. The company, meanwhile, has 482 00:30:53,600 --> 00:30:57,640 Speaker 1: aspirations of developing rockets that in the future are as much 483 00:30:57,680 --> 00:31:01,560 Speaker 1: as ninety five percent printed, and it's just really exciting. 484 00:31:01,560 --> 00:31:06,200 Speaker 1: It's really cool. I think it has the possibility of 485 00:31:07,040 --> 00:31:10,920 Speaker 1: taking on some of the duties of launching smaller payloads 486 00:31:10,960 --> 00:31:15,959 Speaker 1: into space at a much reduced cost, which comes with 487 00:31:15,960 --> 00:31:19,000 Speaker 1: its own challenges. Obviously, you don't want to launch too 488 00:31:19,080 --> 00:31:21,600 Speaker 1: much stuff, because then you've got space junk just orbiting 489 00:31:22,280 --> 00:31:26,160 Speaker 1: the planet and potentially creating obstacles that you have to 490 00:31:26,200 --> 00:31:30,320 Speaker 1: plan around when you're doing future space missions.
But also 491 00:31:30,440 --> 00:31:35,720 Speaker 1: it might mean that we could really take advantage of 492 00:31:35,760 --> 00:31:39,720 Speaker 1: some cool opportunities that otherwise would be too expensive for 493 00:31:39,800 --> 00:31:43,800 Speaker 1: us to pursue, and that to me is really exciting. 494 00:31:44,200 --> 00:31:48,000 Speaker 1: All Right, that's it for the news for Thursday, March 495 00:31:48,040 --> 00:31:51,200 Speaker 1: twenty third, twenty twenty three. Hope you're all well, and 496 00:31:51,280 --> 00:32:00,880 Speaker 1: I'll talk to you again really soon. Tech Stuff is 497 00:32:00,880 --> 00:32:05,440 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, visit the 498 00:32:05,480 --> 00:32:09,120 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 499 00:32:09,160 --> 00:32:09,880 Speaker 1: favorite shows.