Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. And it is time for the tech news for Thursday, March fourth, two thousand twenty one. And before I begin, I want to address something from the news episode I published earlier this week. I talked about how the social network Gab, known for housing a lot of far-right users, became the target of hackers. But one thing I did not mention, and this was a failing on my part, is that while I don't agree at all philosophically with the average Gab user, I also condemn hacking in an effort to gain access to systems and steal, you know, people's data. I think intrusion experts are important. They can help companies discover and patch security vulnerabilities, and sometimes folks will do that all on their own. They'll probe systems to see if they're really secure, and typically they'll reach out and tell a company if they found a vulnerability, unless, of course, they're going the full black hat hacker route.
But this was a case of someone scraping data off of Gab's systems and then sharing that data externally. And while I might vehemently disagree with the general political viewpoint over at Gab, I also don't condone stealing data. Anyway, no one told me to say any of that. In fact, no one's even brought it up. Maybe I'm just being, you know, overreactive or whatever, but it occurred to me after that show went live that I really should say something about it, as I think it's really the ethical thing to do. Okay, let's get on to the news. Our first story falls into the category of irony. The U.S. Court of Appeals for the Federal Circuit has found in favor, at least partly, of a German company called Bitmanagement, regarding the company's claim that the U.S. Navy copied software from Bitmanagement without permission. So this is a copyright case. But another way to say it is that the U.S. Navy has been found guilty of piracy, which, as I understand it, is the opposite of what navies are supposed to do. All goofs aside, this is just the latest development in a problem that started a decade ago.
The U.S. Navy purchased software called BS Contact, which is a virtual reality suite from Bitmanagement. Actually, they went through a third-party company, and then the Navy copied that software onto its internal network. But according to Bitmanagement, the purchase of the software did not include a license for the Navy to do that. The Navy ended up installing the software on more than half a million computers. The company filed suit against the Navy in twenty sixteen on the matter, demanding that the Navy pay for all those copies, which amounts to hundreds of millions of dollars. So why did it take so long for the case to reach its current status? Well, a big part of the problem is that the Navy was working with a third-party retailer that acted kind of like a go-between, and it meant that Navy officials actually believed that they did have permission to copy the software, while Bitmanagement maintains the Navy did not, in fact, have that permission. Now, the initial court case found in favor of the U.S. Navy against Bitmanagement, but then Bitmanagement appealed the matter.
The Court of Appeals partly upheld the earlier ruling in favor of the Navy, but it did say that while there was an implied license between Bitmanagement and the Navy, the Navy failed to track usage of the software across its network, which would have been part of the terms of this implied license. As such, the Navy is liable for copyright infringement. Now the case will go back to Federal Claims Court to decide what damages should be paid to Bitmanagement, and for its part, Bitmanagement says those damages should be somewhere in the neighborhood of half a billion dollars. Yikes. At Microsoft's virtual event Ignite, the company unveiled Mesh, a cloud-based platform for software developers interested in making applications that incorporate mixed reality.
And just in case that phrase doesn't mean anything to you, mixed reality is sort of a spectrum of experiences. It ranges from stuff like virtual reality, where the majority of what a user experiences through their senses comes courtesy of a computer, so the stuff you see and hear is all computer generated, to augmented reality, which involves enhancing our perception of the world around us by incorporating digital experiences on top of, or integrated with, that world around us. So while Mesh sounds promising, and it could lead to some really interesting implementations of mixed reality, software engineer and technical fellow (that's, by the way, a title, not a description) Alex Kipman said that the HoloLens, that's the mixed reality headset from Microsoft, won't be hitting consumer shelves any time in the immediate future. Kipman said that the HoloLens has a way to go before it is in a form factor that consumers will really appreciate and enjoy. I think that's actually a really refreshing thing to hear. I've never used a HoloLens myself, but I understand the experience is really compelling. It's really interesting.
However, the headsets right now are a little bulky. They can become uncomfortable after you wear them for a bit. And Kipman's point is that a consumer product really needs to be, quote, "comfortable enough and immersive enough and socially acceptable enough," end quote, and that the HoloLens just isn't there yet. For that reason, the HoloLens is still aimed at businesses rather than, you know, the general consumer. The company Brave, which makes a web browser that focuses on user privacy, has announced that it has acquired a search engine. That search engine is called Tailcat, which in itself is the product of a similarly privacy-focused company called Cliqz, but Cliqz actually called it quits in April of last year in the wake of the pandemic, citing fundraising problems as being one of the big reasons the company had to shut down. Tailcat will transform into Brave Search, and it will be the default search engine for the Brave browser.
Now, I have to admit I had not heard of Brave before reading up on the news this week, but there are around twenty five million active Brave users, according to The Register. Brave blocks website trackers and ads, but it also has its own ad network, which has prompted some critics to question the ethics behind the company's revenue model. I'm going to have to look into this in a more thorough manner and maybe do a full episode about Brave, because I find it interesting, and their approach does seem to be a bit complicated. Now, it wouldn't be a news episode in twenty twenty one if we didn't spend a little time talking about social networking platforms and their contribution to misinformation and radicalization. So first up is YouTube, which has once again suspended lawyer and former politician Rudy Giuliani's account. This makes it suspension number two in as many months. The ban will last for two weeks, during which time Giuliani will not be allowed to upload new videos to his channel. And you might ask, what was the reasoning behind this ban? Well, it was technically twofold.
According to a YouTube spokesperson, quote, "We removed content from the Rudy W. Giuliani channel for violating our Sale of Regulated Goods policy, which prohibits content facilitating the use of nicotine, and our presidential election integrity policy," end quote. Now, if Giuliani gets a third strike within ninety days of this ban being lifted, his channel will get a permanent ban. Giuliani has spent a lot of time undermining the results of the twenty twenty U.S. election, saying repeatedly that there was widespread fraud without, you know, providing any actual evidence that such fraud happened. In fact, he's been reprimanded for that very thing multiple times in multiple courts.
And in semi-related news, Wired ran a story titled "TikTok Played a Key Role in MAGA Radicalization," MAGA being Make America Great Again. They pointed out that while a lot of the public's focus on the subject of radicalization and extremism has been on platforms like Facebook, Twitter, and Parler (or "par-lay," if you prefer), TikTok has mostly been overlooked. And Cameron Hickey, the author of the piece, makes the point that TikTok has way more active users than Parler ever did, and that radicalizing content on TikTok often falls into a categorization that makes it tricky for TikTok to actually moderate the content, simply because TikTok hasn't really formulated policies that cover that type of messaging. And the meme-generating culture at TikTok also means that these types of messages get augmented as more and more people interact with them and add to them and share them. It's also a reminder that in a world where people are more isolated than they usually are, technological radicalization is playing a much larger role than it previously did.
And since I mentioned Parler: last month, the company filed an antitrust lawsuit against Amazon after Amazon Web Services, or AWS, gave Parler the boot, kicking it off the servers. This made the website homeless for a short time until it found new hosting services. The court ruled that Parler needed to file an amended complaint by February sixteenth, which Parler failed to do. It got a deadline extension, which ended on Tuesday, March second, and it still didn't have an amended complaint. Instead, Parler withdrew the lawsuit, though it did file a different lawsuit against Amazon in the state of Washington, claiming defamation and breach of contract. Whether this suit will fare better remains to be seen. In the meantime, Parler is still banned from both the iOS and Android app stores, so there's no Parler app for those platforms, though users can still visit the web-based version of the site.
The iOS app version of Netflix now has a feature called Fast Laughs, and as the name suggests, this feature delivers short videos of comedic content to the viewer, with some lasting as little as fifteen seconds. So you get in, you get your joke, and you get out, as they say. Clips come from movies, television series, and comedy specials, and it includes a way for users to share their favorites with their friends, assuming their friends are also Netflix subscribers. The company plans to deliver around one hundred clips each day for people to kind of sift through. The feature also offers a way for users to navigate straight to the source of where the clip came from if they want to watch the whole thing, which is really the whole reason for this feature to exist in the first place. I don't have an iPhone, so I have not had a chance to test this out myself, which is really too bad, as I could use a good laugh. And our final story, and one that I'm really excited about because I'm a nerd, is that researchers, including some at MIT
Libraries, have created a pretty nifty way to read unopened letters from hundreds of years ago. All right, first, let's present what the problem actually is. So back before there were envelopes, before you could get an envelope, put a letter in it, and seal it, there were just a couple of different ways that you could send a written message to someone that included some sort of protection against tampering. One of those ways was that you could seal the message with a wax seal. You've probably seen TV shows or movies where this happens, where you put a blob of wax down to seal a letter shut, and then you use, like, a signet ring or something to stamp it, so that you know that the message you get is legitimate and that no one has opened it. If the seal is broken when the message arrives, you know that somebody somewhere has opened the darn thing, and that could mean not just that the message had been compromised and potentially read, but it also opens up the possibility that someone altered the message.
But another way to send a letter and try to keep it from being snooped on was a practice called letterlocking, and it's actually a pretty simple idea. So you write your letter on paper, you know, one side of each piece of paper, put your stack of paper together, and then you would fold your paper in such a way that to unfold it, you would have to tear the paper a little bit in the process. There'd be no way to unfold the letter without tearing the paper a little bit. Not tear it enough to rip the letter in half or anything, but there would be a tear in the paper. So if you got a letter and you saw that there was already a tear in the paper before you could even unfold it, you would know someone else had opened it before you got it. And then: intrigue. Now flash forward a few hundred years. There are a lot of unopened letters from ages ago. And while we could open these letters in the intended way, it would actually cause damage to the documents. And arguably, the actual folding method itself is part of the information we want to preserve.
So how else are we supposed to get our beady little eyes on the contents of these letters? Well, these researchers used some really cool technology to read the letters without opening them. First, they used an X-ray scanner to thoroughly scan an unopened letter. The scanner, which was designed to be used in dentistry, would create a three-dimensional X-ray scan of the letter, which included where ink was on the surface of the paper and where the folds were. Then, using a custom-built algorithm, the team had a computer virtually unfold this three-dimensional model into a two-dimensional reconstruction of the original letter. They were virtually unfolding the scan of the letter, at which point they could actually read it. And that whole process is so amazingly cool to me. It's a really nifty way to engineer around a tricky problem. Now, beyond that, I could see this general sort of approach being used to develop strategies for other types of machine learning and automated systems.
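The geometric trick that makes virtual unfolding possible is that paper doesn't stretch: distances measured along the folded surface stay the same when the sheet is laid flat. As a toy illustration only (this is my own minimal sketch, not the researchers' actual algorithm, and every name in it is invented), here's that idea applied to a single cross-section of a folded sheet, traced as an ordered list of scan points:

```python
import math

def unfold_cross_section(points):
    """Virtually 'unfold' a folded sheet seen in cross-section.

    points: (x, z) vertices tracing the paper surface through its
    folds, in order along the sheet. Since paper barely stretches,
    arc length along the surface is preserved when the sheet is
    flattened, so each vertex maps to its cumulative distance from
    the first vertex on the flattened sheet.
    """
    flat = [0.0]
    for (x0, z0), (x1, z1) in zip(points, points[1:]):
        # Accumulate the length of each little surface segment.
        flat.append(flat[-1] + math.hypot(x1 - x0, z1 - z0))
    return flat

# A sheet folded once near its midpoint: two roughly 1-unit
# segments doubled back on each other in the scan.
folded = [(0.0, 0.0), (1.0, 0.0), (0.0, 0.1)]
print(unfold_cross_section(folded))
```

Each vertex lands at its running arc length, so a sheet doubled back on itself maps to one straight, readable strip. The real pipeline does the analogous thing over a full three-dimensional surface, carrying the detected ink along to produce the flat image of the letter.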
Creating sets of rules so that machines like computers and robots can achieve a task is a big part of artificial intelligence, and while this application might have limited uses in its current form, you could see how a similar approach might be used to create a rule set for a robot when, say, it encounters a door. We humans typically know how to open doors, though I have to admit I'm pretty good at pushing on pull doors and vice versa, but robots find it all much more tricky, particularly because we have a lot of different kinds of doors with different ways to open them. Anyway, I thought the story was particularly interesting, and besides, I'm a medievalist at heart, so it had to go in. And that wraps up the news for Thursday, March fourth, two thousand twenty one. If you have suggestions for topics I should cover on TechStuff, please let me know. Reach out on Twitter; the handle is TechStuffHSW, and I'll talk to you again really soon. TechStuff is an iHeartRadio production.
For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.