Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, January twenty-sixth, twenty twenty-three, and we've got another couple of interesting bits of ChatGPT slash OpenAI slash chatbot news. First up, The Verge reports that academic publishing company Springer Nature has updated its policies such that AI writing tools like ChatGPT cannot be credited as an author of a scientific paper. And considering the reports of how tools like ChatGPT can and do get stuff wrong on occasion, not to mention some plagiarism issues we will talk about in just a second, all of this makes sense. Springer Nature does say that these tools can be used in the process of creating ideas for research, or even assisting in writing, as long as the paper is transparent about this, which potentially could even mean specifying which bits of the paper were AI generated. But ultimately these things cannot be credited as an author, and that really does make sense. These tools tend to have a lack of transparency; you don't know how they get to the point of generating the text they create. You also don't get things like citations for references and that kind of thing. And so without knowing where the author is drawing information from, one, you can't be certain that the information being presented is reliable, and two, you can't be sure that the author's conclusions are valid, because you don't know where the conclusions were drawn from. It sounds like Springer Nature is optimistic regarding how AI can play a part in scientific research and writing, but that it's going to take a lot of work to foster the positive use of AI, as well as a lot of work to prevent the negative use cases.
Speaker 1: A couple of weeks ago, I talked about how the website Futurism had uncovered CNET's use of AI to write articles in the CNET Money channel, and you only knew about it if you clicked on the little dropdown that would appear if you highlighted the CNET Money staff byline. Well, Futurism keeps coming up with great reporting, with more updates on that. They have pointed out how several of the articles written by this AI included factual errors and, just as concerning, also included examples of plagiarism. Now, in some cases, the AI appears to have lifted language almost entirely untouched from other sources, including from CNET itself. As Futurism reporter Jon Christian puts it, quote, "A new Futurism investigation found extensive evidence that the CNET AI's work has demonstrated deep structural and phrasing similarities to articles previously published elsewhere, without giving credit. In other words, it looks like the bot directly plagiarized the work of Red Ventures competitors (Red Ventures being the owner of CNET), as well as human writers at Bankrate and even CNET itself," end quote. And as I mentioned a moment ago, these chatbots fail to credit their sources, and so it all comes across as the AI passing work off as its own when it's not. And that sort of stuff can get you fired or banned from publications. So this really is a big problem, particularly when you consider that AI in this context is being relied upon to generate articles in place of a human being doing the writing. So if it turns out your robot is copying humans, not only are you denying a person a chance to write the article, so you're denying a person a job; your robot is also stealing work from other humans. It's kind of like a double dipping of bad AI practices. Now, to be fair to CNET and to Red Ventures, they've stopped using AI to generate articles, at least for the time being. I highly recommend you read the full article that's on Futurism.
Speaker 1: It has the title "CNET's AI Journalist Appears to Have Committed Extensive Plagiarism." In another AI fail news item, you might remember I mentioned a couple of weeks ago that DoNotPay, the company that helps normal everyday folks try to do stuff like cancel their streaming subscriptions all the way up to getting out of parking tickets, was going to participate in a court case next month. Namely, a person who was charged with a traffic citation in California was going to wear some glasses equipped with cameras and a microphone, and wear an earpiece, and then DoNotPay's AI assistant would attempt to help the accused defend themselves in traffic court. Those plans are now off the table, because DoNotPay's CEO Joshua Browder says the company has scrapped the idea after receiving threats from quote "multiple state bar associations" end quote, and that he and the company could stand accused of the unauthorized practice of law, which at least in some states can land you in the pokey for up to six months. For those not familiar with the phrase "the pokey," that's jail. So AI will not be pulling a Matlock this February. I'm dropping a lot of old people references in this episode; give me a shout if you're picking them up. Meanwhile, state bar associations across the United States are evaluating how to handle situations like this, which just a year ago would have seemed like a silly fantasy. Also, DoNotPay was in the news because it offered up a million bucks to any lawyer scheduled to argue before the Supreme Court, saying all you have to do is use an earpiece that will connect to our AI tool, and the AI will tell you what to argue in front of the court, and you'll get a million dollars. Most folks seem to portray Browder as the kind of executive who likes to get a lot of publicity through really splashy stunts, and in this case, it's kind of hard for me to dispute that particular portrayal.
Speaker 1: On Tuesday, the New York Stock Exchange had a rough morning. Stock prices for normally stable companies began to swing as much as twenty percent in a short amount of time. According to Bloomberg, more than two hundred fifty companies were affected. So why is this in tech stuff? What was causing these extreme stock fluctuations? Was it economic instability? Was it news that the Doomsday Clock is now sitting at ninety seconds? Was it fears about how Russia is going to react to Germany and the United States sending tanks to Ukraine? No, the wild morning was due to an NYSE employee forgetting to turn off a backup system properly. All right, so clearly, for something as important to the global economy as the New York Stock Exchange, that is, the NYSE, a backup is necessary, right? There needs to be a way to switch on a backup quickly in the event of a failure of the primary system, or else trading ends up being affected, and in a worst case scenario, it could prompt a real financial crisis. So of course the NYSE has backups, and employees are supposed to check up on these backup systems regularly. Typically this involves turning the backup on after normal trading hours, when the stock exchange is closed, then verifying that the system is working the way it's supposed to, and then shutting the system down again. But on Tuesday, someone failed to do that last step of shutting it down again, and when the primary system came online, it performed as if it had been going for a while. Like, instead of being the beginning of the trading day, it was acting as if it had gone down and come back up while the backup was in operation.
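To picture that failure mode, here's a minimal hypothetical sketch, my own simplification and absolutely not the NYSE's actual software, of how a start-up routine that checks for a live backup could end up skipping its normal start-of-day work:

```python
# Hypothetical sketch of the failure mode, not the NYSE's real code.
# If the backup was never shut down, the primary assumes it is recovering
# mid-session instead of starting a fresh trading day.

def start_primary(backup_still_running: bool) -> str:
    if backup_still_running:
        # The primary believes trading is already underway, so it skips the
        # opening auctions that establish each stock's starting price.
        return "resume mid-session: opening auctions skipped"
    # Normal morning path: run the opening auctions, then start trading.
    return "fresh open: run opening auctions first"

print(start_primary(backup_still_running=False))  # the usual morning
print(start_primary(backup_still_running=True))   # Tuesday's morning
```

The whole sketch boils down to that one branch: if the start-of-day routine gets skipped, prices begin trading without their usual anchor.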
Speaker 1: And this threw everything into a tizzy, because there were all these auctions that were supposed to go off first thing in the morning, and it kind of bypassed all of this, and that's why prices started to fluctuate like crazy: things were not going the way they were supposed to. So then the New York Stock Exchange had to spring into action and address this problem, and now the NYSE is in the process of reversing the trades that happened during the whole confusing process, and that's gonna be a pretty expensive fix. So this is a case where "have you tried turning it off and on again?" is the wrong approach. It's just "have you tried turning it off?"

Speaker 1: Reuters reports that, according to Standard Media Index, advertising spend on Twitter was down seventy percent in December twenty twenty-two. This was bad news, particularly after the rough November, which reportedly saw a fifty-five percent decline in ad spending. And if you're ready for another percentage, Twitter relies on advertising to make up around ninety percent of its revenue. So this is really bad news for Twitter, and there have been a lot of factors that could have contributed to ad money drying up. First up, there's this general global economic situation we're all in, where companies across the board are cutting costs all over the place, which includes ad spend. Marketing typically is one of the first areas hit when companies want to buckle down and tighten their belts. Then there's the concern that Twitter is turning into a more chaotic, toxic, and frankly dangerous platform, especially for brands, with Elon Musk's rather mercurial leadership style leading the way. Musk himself has laid a lot of the blame on activist groups lobbying companies to pull ad money off Twitter. But keep in mind, one of the reasons Musk gave when he was still trying to back away from buying Twitter months and months and months ago, back in mid twenty twenty-two, was that he saw an economic downturn coming, and that could mean that he was already worried about drops in revenue for Twitter.
Speaker 1: Also, Musk recently announced that Twitter would offer a subscription tier that would cost a little bit more than the normal subscription but would come with an ad-free experience, and he said Twitter already has too many ads on it. It's weird that he's saying this while he's also simultaneously trying to bring advertisers back on board with Twitter, using various incentives to bring them back into the fold. It's getting to the point where you really don't know what to believe, as a user or as an advertiser. Okay, while we sort all that out, let's take a quick break. We'll be back after these messages.

Speaker 1: We're back. Next up, we've got some stories about censorship and how tech companies are reportedly bowing to it. Apple customers in Hong Kong reported that Safari blocked, at least for a while, access to certain sites like GitLab last year, and apparently this was because Apple has worked with Tencent in China, and Tencent has a blacklist of sites that Chinese users are not supposed to be able to access. Usually these directives come from the government and are sent to these big companies, and Apple partners with Tencent in order to operate within China, so that's where this was apparently coming from. But Hong Kong citizens have enjoyed less authoritarian oversight in the past, and it appears as though Tencent's blacklist was now extending to Hong Kong users, something that Apple appears to have confirmed at some point late last year. And the reason we have to be vague about this is that it's not that Apple came out and released a statement about it. Rather, Apple at some point late in twenty twenty-two updated its Safari privacy notice in Hong Kong, and it took a while for folks to notice that had happened, which is why we're only just now hearing about this in the West. Apple has essentially refused to comment on this. Apple has said, you know what, you need to go talk to Tencent.
Speaker 1: It's like "ask your mom." Apple is saying go talk to Tencent, and Tencent is refusing to comment on it too. So that's where we are with that. Now, on a related note, and in another Twitter story, The Intercept reports that Twitter has bowed to political pressure in India and is actively blocking links to a BBC documentary that accuses India's Prime Minister Narendra Modi of taking a prominent role in carrying out genocide in Gujarat in two thousand two. The Indian government has denounced this documentary, calling it propaganda and demanding that companies like Twitter and YouTube block links to it within India. Several people, including actor John Cusack, have reported that their tweets that included links to this documentary have been banned and blocked in India, and Cusack has actually gone on to say that his whole Twitter account is banned in India, full stop. The Intercept is pretty darn critical of Twitter and of Musk. In the article I read this morning, The Intercept rather cheekily points out that Musk has been framed as a free speech absolutist, and yet here's an example of his company kowtowing to authoritarian pressure. It's very much not a free speech thing; it's the opposite. It's government censorship. Musk, for his part, says he was not even aware of the issue, and that all his attention has been divided between Twitter, Tesla, and SpaceX, which is a pretty dangerous thing for him to even tweet, because there are a ton of Tesla investors who have been angry with him for that specific reason, that his attention has been divided. They've seen Twitter as this big diversion that has in turn caused Tesla to suffer. And if you look at the stock price for Tesla, it's been in a rough place for the last year. Anyway, it is concerning to see a government be able to ban any news that it doesn't like, whether Musk himself was aware of the issue or not.
Speaker 1: And again, I should also add, Google agreed several times in the past to demands from India's government to censor certain links and videos, so it's not like Twitter is alone here. It's one prominent example of a tech company appearing to comply with an authoritarian government's demands to censor materials so that citizens aren't able to access them, but it is by no means the only one doing it.

Speaker 1: Last year, I mentioned how Bloomberg reported on the Army's experience with the Integrated Visual Augmentation System, or IVAS. This is essentially a militarized and ruggedized version of the Microsoft HoloLens mixed reality headset, one that's specifically developed and tuned for the Army. The Army's goal was to work with Microsoft and develop a system that helps soldiers be more effective in the field, including support for stuff like target acquisition and better marksmanship. But as it turns out, soldiers didn't much care for the technology. Reports said that soldiers experienced health issues ranging from neck strain to motion sickness, and that they were more accurate and efficient when they were relying on their older equipment. So a lot of soldiers just opted to not use the headset at all. While some of this could be chalked up to the adjustment period anyone encounters when they have to incorporate new equipment into their daily tasks, the message seemed to be that IVAS was performing well under expectations. Task and Purpose published a piece earlier this week saying Microsoft still allegedly had plans to develop the platform further and release updates to IVAS, and that the Army still intends to deploy this very, very expensive headset, because it had already purchased thousands of the things. And each headset, by the way, cost the Army around forty thousand dollars apiece. Forty grand for one of these headsets. And I thought the upcoming Apple headset was going to be expensive at three K.
Speaker 1: But the Army could be waiting a really long time before getting that new and improved IVAS headset, because one area within Microsoft that was hit very, very hard by recent layoffs and cutbacks is the mixed reality divisions within the company. I said one area; I guess areas would be more appropriate, because it's separate departments that were all working within this general field. Gizmodo has an article titled "Microsoft Cuts VR Staff and Leaves Questions About Its Metaverse Ambitions," written by Kyle Barr, and a lot of other outlets have also reported on this as well. It's just that I relied on Gizmodo as my prime source for this particular news item. Essentially, it sounds like the mixed reality division inside Microsoft was positively gutted, in some cases to the point, for all intents and purposes, of being eliminated. AltspaceVR, which Microsoft acquired in twenty seventeen, announced it will be shutting down in March. The few folks from the AltspaceVR team who are still at Microsoft are going to be shifted to work on Microsoft Mesh, which is Microsoft's project to bring VR into its Teams product. Other mixed reality departments were either severely downsized or outright jettisoned. It's possible that the HoloLens has actually seen its final days, and lots of news outlets are suggesting that Microsoft may be pulling back on, or even abandoning, plans to develop the metaverse. And it was going to be a really important partner with Meta for that particular project. And y'all, you know I've been super critical about the hype around the metaverse. I still question the usefulness and importance of the metaverse, but I definitely did not want to see it all come crashing down like this, especially at the expense of people's jobs. I would much rather have seen people get repurposed into other projects that could potentially have greater benefit. I really question the benefit of the metaverse, but yeah, I hate seeing folks getting laid off like this.
303 00:18:52,320 --> 00:18:54,959 Speaker 1: Our next story takes us to New York City and 304 00:18:55,080 --> 00:18:59,879 Speaker 1: Madison Square Gardens, So Masson Square Gardens or MSG, is 305 00:19:00,119 --> 00:19:02,920 Speaker 1: one of the famous venues here in the United States. 306 00:19:02,920 --> 00:19:05,119 Speaker 1: I think of Mass and Square Gardens the same way 307 00:19:05,160 --> 00:19:07,880 Speaker 1: I think of things like the Hollywood Bowl. A lot 308 00:19:07,880 --> 00:19:11,320 Speaker 1: of sporting events and concerts and stuff take place there. 309 00:19:11,400 --> 00:19:14,320 Speaker 1: And whenever I think of Mass and Square Gardens, honestly, 310 00:19:14,800 --> 00:19:18,280 Speaker 1: I just imagined professional rassling, maybe with a couple of 311 00:19:18,280 --> 00:19:21,439 Speaker 1: immortals ready to have a big hold sword fight with 312 00:19:21,480 --> 00:19:24,400 Speaker 1: one another. Give me a shout out if you actually 313 00:19:24,920 --> 00:19:28,840 Speaker 1: understand that reference. But recently, MSG has been in the 314 00:19:28,840 --> 00:19:34,440 Speaker 1: news for a controversial practice, namely using facial recognition technology 315 00:19:34,560 --> 00:19:38,159 Speaker 1: at security checkpoints in order to identify and then deny 316 00:19:38,600 --> 00:19:43,520 Speaker 1: entry to certain folks, namely people who have been identified 317 00:19:43,560 --> 00:19:46,439 Speaker 1: as working for law firms that are currently or have 318 00:19:46,560 --> 00:19:52,840 Speaker 1: previously been involved in lawsuits against MSG Entertainment, even if 319 00:19:52,880 --> 00:19:56,040 Speaker 1: that person had was not an active part of that 320 00:19:56,119 --> 00:20:00,239 Speaker 1: particular lawsuit. In other words, imagine that you show up 321 00:20:00,680 --> 00:20:04,320 Speaker 1: to go see a concert and you get to security 322 00:20:04,600 --> 00:20:06,840 Speaker 1: and you find out at the security checkpoint that you're 323 00:20:06,880 --> 00:20:11,280 Speaker 1: not allowed inside because the owner of the building doesn't 324 00:20:11,320 --> 00:20:14,920 Speaker 1: like the company you work for. That's the only reason. Oh, 325 00:20:15,000 --> 00:20:17,680 Speaker 1: you work for X, Y or Z, you can't come in. 326 00:20:18,160 --> 00:20:20,880 Speaker 1: We don't let you in here. Go home. That would 327 00:20:20,920 --> 00:20:24,119 Speaker 1: really stink, right, and it has New York's Attorney General 328 00:20:24,200 --> 00:20:28,160 Speaker 1: going after MSG Entertainment investigating if the company is violating 329 00:20:28,200 --> 00:20:31,680 Speaker 1: civil and human rights by relying on facial recognition tech 330 00:20:32,119 --> 00:20:34,600 Speaker 1: and denying entry in this way that it could be 331 00:20:34,680 --> 00:20:38,280 Speaker 1: retaliatory and against the law. Even if it turns out 332 00:20:38,320 --> 00:20:41,960 Speaker 1: it's not technically against the law, it sure does sound scummy, 333 00:20:42,080 --> 00:20:45,120 Speaker 1: doesn't it. By the way, Masson Square Gardens, in case 334 00:20:45,160 --> 00:20:48,359 Speaker 1: you were wondering, I look just like Ben Bolan of 335 00:20:48,400 --> 00:20:51,160 Speaker 1: stuff they don't want you to know. 
Speaker 1: Okay, I've got a couple more stories to finish up on that are kind of cool and science-y, but we'll get to those after this quick break.

Speaker 1: Okay, let's get all science and stuff for our final two news items for today's episode. First up is an article from SciTechDaily about how research published in the journal Physical Review Research details how artificial intelligence is helping the quantum computing discipline. It can help maintain coherence within quantum systems. This is something that's really tricky and incredibly important in quantum computing, but it does require a bit of explanation. So quantum systems are incredibly complicated, and they are also very delicate. You've probably heard about quantum experiments where you are able to get really cool results until you observe the system, and then it all breaks down. The classic example is light behaving like a wave and a particle at the same time; but when you observe the system, it breaks down into one or the other. It cannot be both at the same time once you observe it, because the system has broken down into a classical system. Well, quantum computers can be used to tackle problems where there are perhaps thousands of potential solutions and the system is trying to determine the best solution out of all those thousands, and it's essentially doing that simultaneously. It's essentially solving the problem with every potential solution at the same time. However, if the quantum system collapses, it can't do that anymore. It becomes a classical system, and a very underpowered classical system at that. And the killer of it is, these quantum systems are so delicate. A couple of things can end up causing problems. There's this issue called damping, which is when, essentially, internal motion within the system starts to slow down and can potentially come to a standstill. Another is collapse from environmental noise: something external to the system causes the system to collapse.
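To make "losing coherence" a little more concrete, here's a minimal sketch of my own; it uses the textbook amplitude-damping model for a single qubit, not anything from the paper itself, and just shows the coherence term shrinking step by step:

```python
# Toy amplitude-damping illustration (my own sketch, not the paper's method).
import numpy as np

gamma = 0.1  # damping strength per step, an illustrative value

# Kraus operators for the standard amplitude-damping channel
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]])
K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

# Start in the superposition (|0> + |1>)/sqrt(2), written as a density matrix
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
rho = plus @ plus.T

for step in range(10):
    rho = K0 @ rho @ K0.T + K1 @ rho @ K1.T
    # The off-diagonal entry is the coherence; damping drags it toward zero.
    print(f"step {step + 1}: coherence = {abs(rho[0, 1]):.3f}")
```

That off-diagonal term is, roughly speaking, the "quantumness" that the precisely timed control pulses described in the research are trying to protect.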
Speaker 1: So maintaining the quantum system within a quantum computer is of vital importance, and it's very hard. But this research shows that AI can be used to help keep the system healthy, and it does that through the judicious use of things like intense light and changes in voltage, applying different voltages to the system. And it has to be done precisely, at just the right time and just the right intensity, which is why AI is being used to do this. The AI is creating the system in which these different effects can be applied to the quantum computer to keep it operational, which is crazy science fiction kind of stuff to me. And I love that AI can help us make another equally fascinating technology more effective by zapping that technology at just the right time and with just the right amount of power.

Speaker 1: And finally, researchers at the Chinese University of Hong Kong have developed a quote unquote robot out of a magnetoactive phase transitional matter, or MPTM. And that sounds really complex, but when you start to break it down, it's actually not that hard to understand. It's hard to do, but you can at least understand what's going on. So "magnetoactive" tells us that this material reacts to magnets and magnetic fields. You can conclude from that that this is how controllers can manipulate the robot: they can move it around and manipulate it in various ways using magnets and magnetic fields. All right, got it. So it's got some sort of magnetic material inside of it. But what about "phase transitional matter"? What does that mean? Well, it means that the matter in question can move between different phases, such as liquid and solid, and that in fact is what we're talking about in this specific case. So this is a "robot," in air quotes, that can go from solid to liquid and back to solid again, and it can be manipulated by magnets, which is super freaking cool. So the researchers relied on gallium.
Speaker 1: This is a metal that has a very low melting point. Gallium will go from solid to liquid at around eighty-five and a half degrees Fahrenheit, or just under thirty degrees Celsius. Now, this means that you can do some really neat stuff with gallium, because gallium will be solid at room temperature, but it will melt if you hold a piece of it in your hand, because your body temperature is higher than eighty-five point five. And I've seen lots of videos of people playing with gallium where they've poured liquid gallium into a mold of, like, a spoon, for example, so it looks like a regular metal spoon. But then if you were to use it to, say, stir some hot tea, the spoon would just dissolve. It would melt into the tea. You wouldn't want to drink that tea, by the way; gallium is non-toxic, but you still don't want to be drinking it. But yeah, it's just cool, like a cool special effect kind of thing, and it's really trippy to see it in action on video. Gallium has actually been used in high temperature applications, for thermometers, for example, instead of mercury. That only works if the temperatures you're measuring are above the melting point for gallium, but it is really useful because it's also non-toxic, unlike mercury. The researchers then took this pure gallium and doped it with magnetic particles, so they essentially implant magnetic particles inside the gallium, and when it solidifies, those magnetic particles are locked in place. And so in solid form, it was really easy to manipulate the gallium using magnets, because those magnetic particles, again, are locked where they are, so you can use a magnet and just scoot that little solid piece of gallium all over the place. When it was in liquid form, it was much more difficult to make the gallium move through magnetic force. You could get some movement, but it was really tough to do.
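As a quick aside of my own, those two melting-point figures line up under the standard Fahrenheit-to-Celsius conversion:

```python
# Quick sanity check on the gallium numbers using the standard conversion:
# Celsius = (Fahrenheit - 32) * 5 / 9
fahrenheit = 85.5
celsius = (fahrenheit - 32) * 5 / 9
print(f"{fahrenheit} F = {celsius:.2f} C")  # 85.5 F = 29.72 C, just under 30 C
```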
Speaker 1: But the research includes a really fun video, and the video shows a little figurine, shaped kind of like a Lego minifig, and it's inside a black plastic cage. The researchers use a magnet to move the figure around inside the cage, so the figure paces back and forth, except it's not actually walking; its legs aren't moving, it's just scooting across the surface it's on, I guess it's a table. It's hard to see, because it's a really close-up video. But then they use magnetic induction to heat the material up, so it goes beyond the melting point, and so then it turns into this silvery globe, and it slowly pours out through the bars of the cage and then flows into a mold that's in the shape of the original robot figure. It cools down and then resolidifies, and then boom, we have a very cute recreation of a famous scene from Terminator 2, where the T-1000 goes into liquid form and passes through the bars of a jail cell. Now, admittedly, this is a very primitive robot. When it's in solid form, there are no real moving parts, there are no circuits, there are no processors or anything inside of this thing. It is completely manipulated by external forces, be that a physical magnet or something generating magnetic fields. And because it has this very low melting point, there aren't a lot of practical applications for this particular approach to robotics just yet. However, this could be the first step to doing some really cool work in creating robots that can change form depending upon the situation and the need, and that could be useful in all sorts of different applications, including medical procedures. So it is really neat, but we have to keep in mind this is like a demonstration of a concept, and we're very far away from having a practical version of this that could do useful stuff in the real world. Still fun to watch the video, though. And that's it for this episode, the news for Thursday, January twenty-sixth, twenty twenty-three.
Speaker 1: I hope you are all well. If you have suggestions for topics I should cover in future episodes of TechStuff, please reach out to me. You can do that in a couple of different ways. One is you can go over to Twitter and send me a tweet. The show handle is TechStuffHSW, so just tweet at that and I'll see it. Or, if you prefer, you can download the iHeartRadio app. It's free to download, free to use. You can type TechStuff into the little search field and go right to the TechStuff page. You'll see there's a little microphone icon on that page. If you click on that, you can leave a voice message up to thirty seconds in length and let me know what you would like to hear. And I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.