Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, July 25, 2023. Let's get this out of the way first: Elon Musk has rushed into changing Twitter into his long talked about does-everything app, called X. Musk actually has a long history with X; he also has a lot of exes. Back in the late 1990s, Musk was part of a group that founded a site called X.com. It was essentially an online bank. Then it acquired a competing service called Confinity, and they became known primarily as a service that allowed users to do money transfers over the internet. X.com became PayPal, which eBay later purchased, and Elon Musk was ousted when Peter Thiel led a coup and his group gave Musk the boot.
Speaker 1: Apparently they found him difficult to work with, but of course he received a massive payout when eBay purchased the company, so he was well on his way from being wealthy to being obscenely wealthy. The domain X.com remained at PayPal for more than a decade. However, in 2017, Musk purchased the rights to the name from PayPal for nostalgic reasons. Meanwhile, over in China, the company Tencent had released an app called WeChat in 2011. As Stefon would say, this app has everything: instant messaging, voice messaging, one-to-many messaging like Twitter, mobile payments, and other social media functions. Probably the worst Saturday Night Live sketch I've ever done. When Elon Musk started talking about acquiring Twitter, he said his long-term plan was to incorporate Twitter's functionality into an everything app similar to WeChat, but for the rest of the world, and that he was going to call this app X.

Speaker 1: Well, this past weekend, Musk
started making those changes. So if you visit Twitter.com, you'll see that there's now an X logo in place of the once-familiar bird logo that had been there for ages. It's gone now.

Speaker 1: There's been no shortage of criticism of this move. A lot of folks question the wisdom of ditching more than a decade of branding and IP. In fact, "Twitter" and "tweeting" are common words in the tech sphere, and a lot of people are suggesting that it's just a really bad move to unceremoniously ditch all that. The police also thought it was a bad move, temporarily, or at least that the physical act of taking the word Twitter down outside the company's HQ was a really bad move, because it involved a crane that blocked two lanes of traffic on a busy street, and there was a question about whether or not Elon Musk had bothered to get a permit to do that. Anyway, the crew only managed to remove the "Twitt" from the sign before the delays set in, so the "er" was still up there.
Speaker 1: I almost said that the crew managed to remove the twit, but Elon's still at the company, so I guess that's not true. Anyway, the stoppage was just a temporary delay. The police determined that no crime was being committed, so they didn't actually charge anyone with anything. Musk says the Twitter name once made sense back in the day, when you were limited to sending messages of 140 or fewer characters, but these days you can post a lot more stuff, including long-form video, so he argues Twitter as a name no longer makes sense. Personally, I disagree. I don't think the increase in services and capability and functionality has negated the sense of the name. But then again, I also don't have even one billion dollars to my name, so what do I know? Musk also said that tweets should now be called x's, and maybe that'll catch on. There's also the possibility that Musk will just gut this newly renamed service of some of its features as he pushes for X to be the everything app he wants it to be. You know,
rumor has it that he doesn't really like the retweet feature, and that might just be, you know, on borrowed time. Now, there are questions remaining. Will he be able to succeed? Will he be able to create an X app that has some, but not all, of Twitter's functionality, as well as other functions, and make it a success? Or maybe other services like Threads or Bluesky or Mastodon will take the reins. Or maybe people will just conclude that they're fed up with Twitter-style social platforms. I know I'm one of those: I technically have a Threads account and a Bluesky account and a Mastodon account, and I very rarely check them, because I just find them exhausting. Now, I did mention Mastodon just a moment ago, which takes a federated approach to social platforms. That means that rather than having a centralized corporate structure, where a single entity dictates the rules and policies for the platform, Mastodon consists of multiple instances, called servers, and each server can have its own rules and policies and vibe and culture.
Speaker 1: So some servers cater more to a specific community than others do. "This community is for queer creators," for example; that might be one Mastodon server. Not that it's exclusively that, but that's who was in mind when they created the server. This kind of system facilitates communication across servers, so even if you join server A and your friend joins server B, you can still communicate with each other. Discovery is a little trickier if you're on different servers, but still possible; it's just not as intuitive. But Mastodon is in the news for a really grim reason. Researchers at Stanford, after examining 325,000 posts, found 112 instances of known child sexual abuse material, also known as CSAM. Their search covered the 25 most popular servers on Mastodon, and in addition to the messages actually containing known CSAM, they found more than a thousand messages pointing to off-site CSAM resources, whether trading sites or grooming sites.
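The cross-server communication described above maps onto how Mastodon actually identifies accounts: a federated handle looks like user@server, and servers locate one another's accounts using the WebFinger protocol. Here is a minimal sketch of that addressing scheme; the handle and server names are made up for illustration, and no real network call is made:

```python
# Sketch: how a federated handle like "alice@serverb.example" is addressed.
# Real Mastodon instances resolve handles with the WebFinger protocol
# (RFC 7033) over HTTPS; this only shows the addressing scheme itself.

def parse_handle(handle: str) -> tuple[str, str]:
    """Split a fediverse handle into (username, server)."""
    user, _, server = handle.lstrip("@").partition("@")
    if not user or not server:
        raise ValueError(f"not a federated handle: {handle!r}")
    return user, server

def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL one server would fetch to locate
    an account hosted on another server."""
    user, server = parse_handle(handle)
    return (f"https://{server}/.well-known/webfinger"
            f"?resource=acct:{user}@{server}")

# A user on server A can reach someone on server B just from the handle:
print(webfinger_url("@alice@serverb.example"))
# https://serverb.example/.well-known/webfinger?resource=acct:alice@serverb.example
```

A real server would fetch that URL over HTTPS and receive a JSON document pointing at the account's ActivityPub profile, which is what makes following and messaging across servers work.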
Speaker 1: Terrible, awful things. And it points to a really big challenge with the federated approach to these social networks, because big centralized companies can have entire departments filled with content moderators. Or, more likely, they'll outsource that work to a third-party company that employs people in developing countries; we've seen over and over how that work affects the people doing those jobs. The point is, the big centralized companies have a lot of people working on the problem, but in the fediverse you're talking about instances that have only a single moderator running them, and chances are that being the server's one moderator is not actually that person's real job. They've got another job, and they just do this on the side, in their spare time, which means that when problems arise, you know, when people are violating terms of service or posting outright illegal content, it takes time for the moderator to be able to address it.
Speaker 1: And if it's happening a lot and you've only got the one moderator, those problems really start to pile up. For these federated services to survive, I suspect they're going to have to develop and incorporate a lot more automated tools to supplement the work being done by human moderators. Here's a case where you can make a real strong argument for the need for AI. You would need AI-powered tools that are very good at identifying instances that host illegal material or CSAM-related material, so that you can respond much more quickly and decisively and have a safe community. Otherwise, these services won't really be able to take the place of the centralized services that currently dominate the web, because they won't be able to scale without encountering massive problems, like being saturated with illegal and harmful content. So that is the dark side of the fediverse. Okay, one more story before we go to break. You know, I talked about PayPal a little bit earlier, when I was talking about Musk's history with the letter X.
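One widely used automated approach for known material is hash matching: every upload is hashed and compared against a list of hashes of previously identified illegal images, which is roughly how services like Microsoft's PhotoDNA work. The sketch below uses SHA-256 purely as a stand-in, since production systems use perceptual hashes that survive resizing and re-encoding, and the hash list here is invented for illustration:

```python
# Sketch of automated screening against a known-bad hash list.
# Real systems (e.g. PhotoDNA) use perceptual hashes, not SHA-256;
# the "image bytes" and hash list below are placeholders.
import hashlib

# Hypothetical hash list a clearinghouse would distribute to servers.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-known-bad-image-bytes").hexdigest(),
}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash, so it can be
    blocked and reported before a human moderator ever sees it."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(screen_upload(b"example-known-bad-image-bytes"))  # True  -> block/report
print(screen_upload(b"an ordinary photo"))              # False -> allow
```

The appeal for a one-person moderation team is that this check runs at upload time with no human in the loop, leaving the moderator to handle only the ambiguous cases.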
Speaker 1: Let's talk about the US Federal Reserve, because it has a new system being rolled out called FedNow. This service coordinates payments between different financial institutions, like banks and credit unions. At launch, 35 banks and credit unions are included in the system, but that's just the very beginning of the initiative. The goal is to create a network that connects more than 9,000 financial institutions. For what purpose? Well, essentially to make it easy for people, at least people who have a bank account, to transfer money to each other. So forget checks or debit cards, or even apps like Venmo or PayPal. By the way, PayPal owns Venmo, in case you weren't aware. I remember hearing some millennials dissing PayPal because they wanted to use Venmo, and I was just like, you do know PayPal owns Venmo? You're still using PayPal, right? Or any of those other cash transfer apps.
Speaker 1: So this would allow you to make direct transfers from one bank account to another, as long as the two banks, or community credit unions, or whatever they may be, are connected within this network. The Fed envisions FedNow also allowing people to access funds much faster, with fewer roadblocks. So imagine, let's say that you run a small business and you invoice another company, because they've hired you to do some work and you did the work. So you send them an invoice, and they can pay you through FedNow. The money can go through much more quickly, and then you can access it right away. Obviously, that can be really helpful for people who have cash flow issues, where, you know, it's not that they don't have money coming to them, it's just not in their account right now. This can help cut down on those kinds of situations and streamline those financial transactions. So it could be person to person, company to person, or person to company.
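The constraint described here, that a transfer settles directly only if both institutions are connected to the network, can be sketched as a simple membership check. The institution names and participant list below are invented, and this is not the actual FedNow interface, just the shape of the rule:

```python
# Sketch of the network-membership rule for an instant-payment system:
# a transfer can settle instantly only if BOTH institutions participate.
# The participant set and bank names here are made up for illustration.

PARTICIPATING = {"First Example Bank", "Sample Credit Union"}

def can_transfer_instantly(sender_bank: str, receiver_bank: str) -> bool:
    """Instant settlement requires both ends to be in the network."""
    return sender_bank in PARTICIPATING and receiver_bank in PARTICIPATING

print(can_transfer_instantly("First Example Bank", "Sample Credit Union"))   # True
print(can_transfer_instantly("First Example Bank", "Nonparticipating Bank")) # False
```

That second case is why the initial 35 participants matter so much: until an institution joins, its customers fall back on the older, slower rails.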
Speaker 1: It's meant to facilitate all of that and to remove the need for these third-party applications. There's still a long way to go to get to that point, obviously. With just 35 initial institutions in place, that's not enough to really make a huge difference. Also, the participating institutions are not required to fully implement all the services, so some are taking it more slowly than others and only allowing certain services to be used within their systems. So FedNow has the potential to disrupt stuff like PayPal and Venmo and cash apps and things like that, but it's not going to happen overnight. And for people who are unbanked or underbanked, this might not prove to be a relevant technology. We still have to find ways to bring them into these financial transactions, because just having a bank account can be a barrier to entry for a lot of people. So the system still needs to address that issue too, and I don't know that it ultimately will.
Speaker 1: That may not even be in the mission statement at all for FedNow, but it is interesting that it could have a massive impact on money transfers over the internet.

Speaker 1: Okay, now we're going to take a quick break. When we come back, we've got a lot more news to cover.

Speaker 1: All right, we're back. More than 1,000 app developers in the UK, I think around 1,500 actually, have joined a class action lawsuit against Apple. They argue that Apple is using its monopoly as a storefront for iOS apps to charge high fees to developers. This is a complaint we have heard numerous times in the past, and it has been addressed in various court cases around the world. As you likely know, Apple collects between 15 and 30 percent on in-app transactions for most types of transactions. There are some that get an exception, but for most, Apple gets a 15 to 30 percent cut.
Speaker 1: So if you spend a dollar in an app from a really big developer, that developer might only get 70 cents of that dollar, and the other 30 cents goes to Apple. And you know, that doesn't sound like much, but then you start thinking about all the apps, and the millions, the hundreds of millions, of people who are using iOS apps to make these different transactions. It really adds up, and it means Apple rakes in the big bucks through this practice. The app developers argue this is unfair to them, because there's nowhere else they can go if they want to get their app in front of iOS users. There's no alternative to the Apple App Store, so there's no competition in the space, and that means Apple holds a monopoly on that market and can dictate terms. The lawsuit is seeking a billion dollars from Apple; it's a billion-dollar class action lawsuit.
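The dollar example works out like this. The 30 and 15 percent rates are Apple's publicly known standard and small-business commission tiers; which tier applies to any particular app is an assumption, not something from the episode:

```python
# The arithmetic behind the host's example: on each in-app purchase,
# the store keeps a commission and the developer gets the remainder.

def developer_payout(price: float, commission: float) -> float:
    """Developer's share of one transaction after the store's cut."""
    return round(price * (1 - commission), 2)

print(developer_payout(1.00, 0.30))  # 0.7  -> 70 cents on the dollar (standard tier)
print(developer_payout(1.00, 0.15))  # 0.85 -> small-business tier
```

Per transaction the difference looks small, which is the host's point: the commission only becomes "big bucks" when multiplied across hundreds of millions of users.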
Speaker 1: Recently, Apple has taken some steps to move away from the iron-grip approach it had been known for for years. Those steps range from allowing developers to use different financial services to complete transactions, instead of Apple's own in-house system, all the way to potentially opening up so that competing storefronts can offer apps on iOS devices. One word about the third-party financial services thing: Apple, at least in some markets, was kind of sneaky about it. Yes, they allowed it; you could use a different financial services company to complete your transactions, but Apple would put essentially an Apple tax on top, which means you would actually collect even less money per transaction. And obviously a lot of people are challenging that, saying Apple is doing this unfairly; that, unwilling to relinquish control, Apple creates disincentives to use any alternatives. So yes, technically there are alternatives, but they are less attractive than Apple's own approach, because Apple has artificially made them more expensive.
Speaker 1: That's been an argument in some regions as well, so we'll have to see how this one plays out. Google recently published its Environmental Impact Report for 2022, and the company revealed that it is using a truly enormous amount of water every year. Last year, the company used 5.6 billion gallons of water, most of it potable water, meaning drinkable water, water that's safe to drink. Many of Google's data centers depend upon water-cooled systems to keep machines in operation 24 hours a day, seven days a week. The general expectation is that 2023 is going to see another big jump in water consumption, because Google, like a lot of other companies, has pushed forward really hard in the AI space, and AI requires a lot of compute power. To meet those requirements, it shouldn't be a surprise that Google is going to need to consume more resources in order to keep machines running below critical temperatures.
Speaker 1: According to Google reps, the company is also relying on air-cooled systems to help reduce the need for water, particularly in places impacted by a water crisis, and Google says it is committed to replenishing, quote, "one hundred and twenty percent of the freshwater it consumes," end quote, by 2030. That's according to Business Insider. However, in that same article, the writer reveals that Google currently replenishes just six percent of the freshwater it consumes, so obviously the company is going to have to make some substantial changes to meet that 120 percent goal within seven years. Considering that large parts of the US are either currently in a water crisis or very well may be within a year or two, this is a real problem that needs a serious solution. Now stop me if you've heard this one: tech companies see a potentially massive market.
Speaker 1: There's a tiny catch, however. This market is in a region run by an oppressive, authoritarian government, and that government is going to place tight restrictions on any company that works within the region, up to and including requiring companies to share data with the government itself. Plus, the government has been connected with atrocious acts in the past. Yes, we saw this actually happen in China, and while a lot of tech companies were initially very eager to get access to the enormous population of potential users in China, a lot of those same companies later abandoned it after discovering that, sure enough, doing business there came with too many downsides. Well, I guess we can say those companies learned from their mistakes, because they're ready to repeat them almost exactly. By the way, I lifted that joke from Peter Cook in a Beyond the Fringe sketch.
Speaker 1: Anyway, Saudi Arabian officials are courting tech companies to set up operations within Saudi Arabia, partly to create a new technological wonder city that some people are really worried is going to become the perfect surveillance city, one that keeps track of the movements of everyone within it. It's pretty dystopian stuff. The effort is to make Saudi Arabia a global center for tech as part of a plan called Vision 2030. Companies that have signed on include Google, which signed on last year, and Microsoft, which signed on this year, both of which are now building cloud centers within Saudi Arabia. Meanwhile, human rights experts warn that Saudi Arabia has some rather vague but powerful national security laws, which give the government nearly limitless power to make demands of companies that operate within its borders, and those demands could include telling Google and Microsoft and other tech companies that they have to share information so the government can do things like track down dissidents.
322 00:20:15,160 --> 00:20:18,119 Speaker 1: And in Saudi Arabia, a dissident can be someone like 323 00:20:18,560 --> 00:20:23,080 Speaker 1: a journalist reporting on the dealings of the Saudi Arabian royalty. 324 00:20:23,640 --> 00:20:25,520 Speaker 1: And as we have seen, the government is not above 325 00:20:25,600 --> 00:20:29,240 Speaker 1: assassinating people like that. It has happened; it happened not 326 00:20:29,359 --> 00:20:32,920 Speaker 1: long ago. If you speak out against authoritarian power in Saudi Arabia, 327 00:20:32,960 --> 00:20:38,480 Speaker 1: you face massive consequences. A woman in Saudi Arabia criticized 328 00:20:38,640 --> 00:20:43,520 Speaker 1: this futuristic technological city on Twitter. She is now serving 329 00:20:43,720 --> 00:20:47,240 Speaker 1: a thirty year jail sentence for speaking out about it. 330 00:20:47,560 --> 00:20:50,800 Speaker 1: So there's this huge concern that Google and Microsoft, in 331 00:20:50,840 --> 00:20:54,520 Speaker 1: an eagerness to tap into the undeniably huge amount of 332 00:20:54,560 --> 00:20:58,200 Speaker 1: money that's in Saudi Arabia, are also turning a blind 333 00:20:58,240 --> 00:21:01,040 Speaker 1: eye to the consequences of that deal, and that they 334 00:21:01,080 --> 00:21:07,880 Speaker 1: are becoming accomplices to an authoritarian government that commits terrible 335 00:21:07,920 --> 00:21:11,640 Speaker 1: acts upon its own people. Company representatives for both Microsoft 336 00:21:11,680 --> 00:21:15,679 Speaker 1: and Google have essentially said they're committed to protecting human rights, 337 00:21:16,240 --> 00:21:19,680 Speaker 1: but critics say that doesn't exactly align with them agreeing 338 00:21:19,720 --> 00:21:22,240 Speaker 1: to do business in a country that has an undeniable 339 00:21:22,320 --> 00:21:28,080 Speaker 1: history of violating human rights.
So yeah, it's hard 340 00:21:28,160 --> 00:21:33,159 Speaker 1: to take these companies at their word when it's undeniable 341 00:21:33,840 --> 00:21:37,679 Speaker 1: that the government has abused its power multiple times in 342 00:21:37,800 --> 00:21:43,280 Speaker 1: order to strike down any who would question that government's, 343 00:21:43,320 --> 00:21:51,760 Speaker 1: you know, authority or its practices. So yeah, pretty ugly stuff. Okay, 344 00:21:52,560 --> 00:21:55,760 Speaker 1: I've got a couple more stories to talk about. Before 345 00:21:55,800 --> 00:22:07,919 Speaker 1: we get to that, let's take another quick break. So, 346 00:22:08,320 --> 00:22:13,360 Speaker 1: the Entertainment Software Rating Board, or ESRB, would very much 347 00:22:13,440 --> 00:22:15,639 Speaker 1: like to scan your face before you launch into that 348 00:22:15,680 --> 00:22:19,359 Speaker 1: play session of Doom. So, according to PC Gamer, the 349 00:22:19,800 --> 00:22:23,560 Speaker 1: ESRB and some partners it's working with are planning to 350 00:22:23,640 --> 00:22:27,280 Speaker 1: launch a tool that they're calling the Privacy Protective Facial 351 00:22:27,480 --> 00:22:31,280 Speaker 1: Age Estimation. So the purpose of this would be to 352 00:22:31,320 --> 00:22:34,359 Speaker 1: determine if the person who's trying to buy a game 353 00:22:34,640 --> 00:22:37,760 Speaker 1: or to go through an in app purchase or activate 354 00:22:37,880 --> 00:22:41,160 Speaker 1: some other feature in the game that's meant for adults 355 00:22:41,760 --> 00:22:44,920 Speaker 1: is actually of age, meaning they're old enough to do 356 00:22:44,960 --> 00:22:51,600 Speaker 1: that and not be violating some policy somewhere.
And the 357 00:22:51,680 --> 00:22:54,359 Speaker 1: FTC is considering the measure and has opened up the 358 00:22:54,400 --> 00:22:57,920 Speaker 1: matter to public feedback and will continue to accept public 359 00:22:57,960 --> 00:23:01,280 Speaker 1: feedback until August twenty first. So, if you're in the 360 00:23:01,400 --> 00:23:05,280 Speaker 1: United States and you have some thoughts about this, you 361 00:23:05,320 --> 00:23:09,119 Speaker 1: should probably go to the FTC's website and find the 362 00:23:09,160 --> 00:23:12,960 Speaker 1: page that is actually about this measure and leave your thoughts, 363 00:23:13,040 --> 00:23:16,240 Speaker 1: whether it's in support or against. I'm not telling you 364 00:23:16,320 --> 00:23:20,080 Speaker 1: how to feel. Personally, I'm against it, but you know, 365 00:23:20,640 --> 00:23:23,359 Speaker 1: I can also see why there is a call for this, 366 00:23:23,480 --> 00:23:28,159 Speaker 1: because otherwise you're requiring adults to take a very active 367 00:23:28,240 --> 00:23:32,240 Speaker 1: role in overseeing the types of stuff their children are 368 00:23:32,240 --> 00:23:34,439 Speaker 1: getting involved in. And while I think that's necessary, I 369 00:23:34,440 --> 00:23:38,720 Speaker 1: also think it's not always realistic, right? Like, even the 370 00:23:38,760 --> 00:23:43,280 Speaker 1: most loving parents are going to have issues following through 371 00:23:43,280 --> 00:23:45,320 Speaker 1: with that because they have their own stuff they got 372 00:23:45,320 --> 00:23:47,879 Speaker 1: to deal with, I assume. I don't have kids, so 373 00:23:47,920 --> 00:23:51,160 Speaker 1: I don't have this issue. My dog isn't a gamer, 374 00:23:51,200 --> 00:23:53,560 Speaker 1: so I don't ever have to worry about this anyway. 375 00:23:54,560 --> 00:23:55,199 Speaker 2: We have talked a
376 00:23:55,200 --> 00:23:58,000 Speaker 1: lot in this show about how facial recognition technology in 377 00:23:58,000 --> 00:24:01,800 Speaker 1: general has some really big problems, and it makes me 378 00:24:01,880 --> 00:24:05,480 Speaker 1: wonder how accurate the system really is. Right? Can it 379 00:24:05,560 --> 00:24:10,200 Speaker 1: tell the difference between, say, a mature fifteen year old 380 00:24:10,640 --> 00:24:13,480 Speaker 1: and a young looking eighteen year old? Because the eighteen 381 00:24:13,520 --> 00:24:16,240 Speaker 1: year old will be old enough to play games that 382 00:24:16,280 --> 00:24:19,240 Speaker 1: get like a mature rating; the fifteen year old would 383 00:24:19,280 --> 00:24:21,240 Speaker 1: not be. How is this system going to be able 384 00:24:21,280 --> 00:24:24,720 Speaker 1: to tell the difference reliably? Like sure, after a certain 385 00:24:24,880 --> 00:24:28,640 Speaker 1: age it might not have an issue, but for younger players, 386 00:24:28,680 --> 00:24:31,320 Speaker 1: like people who are of age but they look young, 387 00:24:31,800 --> 00:24:35,800 Speaker 1: that's going to be an issue. And it gets really invasive, 388 00:24:36,160 --> 00:24:39,000 Speaker 1: and there are a lot of privacy concerns obviously about this. 389 00:24:39,160 --> 00:24:41,679 Speaker 1: It's kind of weird. It reminds me a lot of 390 00:24:41,760 --> 00:24:47,440 Speaker 1: how in Japan there were companies that were making cigarette 391 00:24:47,520 --> 00:24:51,680 Speaker 1: vending machines and they were incorporating cameras and facial recognition 392 00:24:51,720 --> 00:24:54,560 Speaker 1: technology to try and prevent kids from being able to 393 00:24:54,560 --> 00:24:57,240 Speaker 1: buy cigarettes out of these machines.
The kids just figured 394 00:24:57,280 --> 00:24:59,439 Speaker 1: they could hold up the face of an older person, 395 00:24:59,560 --> 00:25:01,240 Speaker 1: like a picture of a face of an older person, 396 00:25:01,280 --> 00:25:04,400 Speaker 1: not the actual face, that would have been really disturbing, 397 00:25:04,480 --> 00:25:07,040 Speaker 1: but a picture of a face, and that would be 398 00:25:07,160 --> 00:25:10,400 Speaker 1: enough to fool the system. Supposedly, this system they're 399 00:25:10,440 --> 00:25:13,440 Speaker 1: talking about would not be fooled by holding up a picture. 400 00:25:13,480 --> 00:25:15,800 Speaker 1: You would have to actually have a three dimensional face there. 401 00:25:16,680 --> 00:25:22,440 Speaker 1: But still it seems weird and invasive. And you know, 402 00:25:23,240 --> 00:25:27,119 Speaker 1: I get that the ESRB is trying to stay ahead 403 00:25:27,119 --> 00:25:30,879 Speaker 1: of governments regulating the video game industry. That's the 404 00:25:30,880 --> 00:25:32,840 Speaker 1: whole purpose of the ESRB in the first place. 405 00:25:33,200 --> 00:25:33,960 Speaker 2: If you don't 406 00:25:33,720 --> 00:25:38,320 Speaker 1: remember, back in the nineteen nineties, the video game industry 407 00:25:38,359 --> 00:25:42,760 Speaker 1: as a whole formed the Entertainment Software Rating Board as 408 00:25:42,800 --> 00:25:46,280 Speaker 1: a way to address government concerns that the video game 409 00:25:46,320 --> 00:25:50,000 Speaker 1: industry was creating games that would, you know, turn children 410 00:25:50,000 --> 00:25:54,280 Speaker 1: into total monsters because they would play Mortal Kombat and become, 411 00:25:55,040 --> 00:25:58,959 Speaker 1: I don't know, gremlins or something. So the ESRB existed 412 00:25:59,000 --> 00:26:02,120 Speaker 1: in order to give ratings to these different games.
Now 413 00:26:02,160 --> 00:26:04,159 Speaker 1: it sounds like the ESRB is trying to come up 414 00:26:04,200 --> 00:26:07,640 Speaker 1: with ways to ensure that the people who are buying 415 00:26:07,720 --> 00:26:11,960 Speaker 1: games are actually old enough to do it, and maybe 416 00:26:12,000 --> 00:26:15,600 Speaker 1: that's to head off any potential government regulation that would 417 00:26:15,640 --> 00:26:20,520 Speaker 1: otherwise enter into the industry. But in my opinion, this 418 00:26:20,560 --> 00:26:24,720 Speaker 1: particular move is not a good one. Finally, in some 419 00:26:25,000 --> 00:26:29,680 Speaker 1: neat-o science news, researchers at North Carolina State University created 420 00:26:29,760 --> 00:26:34,080 Speaker 1: a gel like solution that lets them three D print 421 00:26:34,760 --> 00:26:39,760 Speaker 1: with metal at room temperature. So if you've never used 422 00:26:39,760 --> 00:26:44,240 Speaker 1: a three D printer, generally speaking, three D printers work 423 00:26:44,680 --> 00:26:47,840 Speaker 1: in a way where they take some sort of material, 424 00:26:48,160 --> 00:26:50,480 Speaker 1: like the one I'm most familiar with is plastic, 425 00:26:51,000 --> 00:26:53,280 Speaker 1: and you get these filaments made of plastic. It's like 426 00:26:53,280 --> 00:26:56,600 Speaker 1: a plastic wire almost, and you feed it through the 427 00:26:56,640 --> 00:26:59,399 Speaker 1: three D printer and it has a chamber that heats 428 00:26:59,480 --> 00:27:02,919 Speaker 1: up so that the plastic melts into a semi liquid state.
429 00:27:03,880 --> 00:27:06,840 Speaker 1: Then that ends up going through the three D printer 430 00:27:07,000 --> 00:27:11,240 Speaker 1: through the print head, which lays down a trail of 431 00:27:11,280 --> 00:27:15,920 Speaker 1: this semi liquid stuff, which, when it cools, hardens 432 00:27:15,920 --> 00:27:20,920 Speaker 1: and solidifies, and that's what lets you create three dimensional structures. However, 433 00:27:21,880 --> 00:27:24,119 Speaker 1: if you're working with materials that have a much higher 434 00:27:24,160 --> 00:27:27,520 Speaker 1: melting point, it gets really hard to print with them, right, 435 00:27:27,560 --> 00:27:30,960 Speaker 1: and metals typically have pretty high melting points, not all 436 00:27:31,000 --> 00:27:36,120 Speaker 1: of them, you know, mercury doesn't, gallium doesn't, but others do, 437 00:27:36,720 --> 00:27:40,000 Speaker 1: and so they require more specialized equipment. When you're talking 438 00:27:40,040 --> 00:27:42,879 Speaker 1: about specialized equipment, well then you're talking about stuff getting 439 00:27:42,920 --> 00:27:46,560 Speaker 1: more expensive, right, because if it has to be built 440 00:27:46,560 --> 00:27:50,399 Speaker 1: to certain parameters, then it gets to be expensive and 441 00:27:50,480 --> 00:27:53,960 Speaker 1: you start to lose any advantages you get through being 442 00:27:53,960 --> 00:27:57,240 Speaker 1: able to three D print, from a financial standpoint anyway, 443 00:27:57,880 --> 00:28:01,000 Speaker 1: so you might as well not do three D printing 444 00:28:01,040 --> 00:28:04,879 Speaker 1: and use a more traditional and cheaper approach, even if 445 00:28:04,880 --> 00:28:07,639 Speaker 1: it's less efficient, because you're not going to spend as 446 00:28:07,680 --> 00:28:11,080 Speaker 1: much money.
Well, these researchers at North Carolina State University 447 00:28:11,520 --> 00:28:13,960 Speaker 1: came up with a way to print with copper that 448 00:28:14,080 --> 00:28:18,280 Speaker 1: doesn't involve heating the copper up to a melting point. Instead, 449 00:28:19,080 --> 00:28:23,200 Speaker 1: they used copper particles that were in suspension in water, 450 00:28:23,800 --> 00:28:28,760 Speaker 1: and they added to that suspension some hydrochloric acid, which 451 00:28:28,840 --> 00:28:31,919 Speaker 1: dropped the pH of the water down to one, and 452 00:28:32,000 --> 00:28:37,960 Speaker 1: they added particles of eutectic gallium indium alloy, and as 453 00:28:37,960 --> 00:28:41,440 Speaker 1: a kind of binding agent, they also added methyl cellulose, 454 00:28:42,040 --> 00:28:46,520 Speaker 1: and this created that gel like mixture, which was liquid 455 00:28:46,720 --> 00:28:49,440 Speaker 1: enough to go through a three D printer. They could 456 00:28:49,440 --> 00:28:52,720 Speaker 1: then print what was essentially a two dimensional shape using 457 00:28:52,760 --> 00:28:56,760 Speaker 1: this mixture, but then by applying heat in specific ways, 458 00:28:56,880 --> 00:29:00,320 Speaker 1: they could get a predictable reaction from the metal. It 459 00:29:00,360 --> 00:29:04,160 Speaker 1: would bend in ways that were predictable, so you would 460 00:29:04,200 --> 00:29:07,080 Speaker 1: print the shape. You would apply the heat at specific 461 00:29:07,160 --> 00:29:10,800 Speaker 1: points on this two dimensional shape, and as it would 462 00:29:10,800 --> 00:29:13,480 Speaker 1: heat up and dry, it would also change shape into 463 00:29:13,560 --> 00:29:15,880 Speaker 1: whatever it was you wanted to make, and you would 464 00:29:15,920 --> 00:29:19,880 Speaker 1: get a three dimensional metallic object that's electrically conductive.
So 465 00:29:19,960 --> 00:29:23,840 Speaker 1: really neat. Like, you could make components to very specific, 466 00:29:25,240 --> 00:29:30,080 Speaker 1: you know, sizes and measurements using this approach instead of 467 00:29:30,080 --> 00:29:32,280 Speaker 1: having to make do with, you know, something that's off 468 00:29:32,320 --> 00:29:35,200 Speaker 1: the shelf but isn't, you know, ideal for whatever use 469 00:29:35,240 --> 00:29:38,320 Speaker 1: you need. As the object dries, the water and 470 00:29:38,360 --> 00:29:41,320 Speaker 1: the acid evaporate, so you are left with just 471 00:29:41,400 --> 00:29:44,280 Speaker 1: a mix of those two metals plus the methyl cellulose, 472 00:29:44,320 --> 00:29:47,040 Speaker 1: which is kind of binding everything together. So it's really 473 00:29:47,040 --> 00:29:51,080 Speaker 1: cool technology. I'm not sure when we will see like 474 00:29:51,200 --> 00:29:54,120 Speaker 1: a lot of practical uses of it, but it is 475 00:29:54,200 --> 00:29:58,200 Speaker 1: pretty neat. That's it for today's news. I had a 476 00:29:58,200 --> 00:30:00,719 Speaker 1: couple of other news items, but a lot of them 477 00:30:00,760 --> 00:30:02,920 Speaker 1: are still developing, so I'm going to wait and maybe 478 00:30:02,920 --> 00:30:05,880 Speaker 1: cover them on Thursday when there's more to say. And 479 00:30:06,120 --> 00:30:08,640 Speaker 1: I hope in the meanwhile that you are all well 480 00:30:09,080 --> 00:30:18,520 Speaker 1: and I'll talk to you again really soon. Tech Stuff 481 00:30:18,600 --> 00:30:23,120 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 482 00:30:23,160 --> 00:30:26,720 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 483 00:30:26,760 --> 00:30:31,160 Speaker 1: your favorite shows.