Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, June fourteenth, two thousand twenty-two. Our first news item is about Google's conversation model AI that's called LaMDA, L-A-M-D-A. Now I covered this in yesterday's TechStuff episode, but in case you missed it, here is the short version. An engineer at Google working in the Responsible AI division was suspended from his job after saying the AI might be sentient. The engineer, Blake Lemoine, shared a transcript of a few conversations that he and a colleague of his had with the conversation engine, during which the engine argued that it was conscious and that it experienced emotions. Google suspended the engineer for violating a confidentiality policy, which undoubtedly helped fuel speculation about the model's alleged sentience. Now, several AI experts have weighed in on this, and the general consensus is that LaMDA is nowhere close to being sentient or conscious.
It's just good at what it's supposed to do, which is to generate conversational dialogue. But as Gary Marcus, who is the founder of Geometric Intelligence, put it, what that really means is that LaMDA is a very advanced version of autocomplete. LaMDA uses statistics to determine what to say next. So imagine you had a truly enormous database that contains practically all the conversations that have ever been recorded, and you could see which responses were more prevalent for particular lines of conversation, weigh them against each other, and create an engine that could select the appropriate one at the appropriate time. Use a very complicated version of this general approach and you get something like LaMDA. It's a process that can construct sentences that seem natural because they are drawn from natural conversations, not, you know, verbatim, but that's what is being used to model it. It's a very cool technology, but it's not sentient AI. As I mentioned yesterday, we humans have a tendency to ascribe human-like traits to non-human things, whether that be animals, your car, or a chatbot.
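The "advanced autocomplete" idea described above can be sketched in miniature. This is a toy illustration with an invented corpus, not how LaMDA actually works; real systems use large neural networks over tokens rather than a lookup table of whole replies.

```python
from collections import Counter, defaultdict

# Invented example corpus of (prompt, reply) pairs. A real system
# trains on vastly more data and predicts token by token.
corpus = [
    ("how are you", "doing great, thanks"),
    ("how are you", "doing great, thanks"),
    ("how are you", "not bad"),
    ("are you conscious", "i enjoy our conversations"),
]

# Tally how often each reply followed each prompt.
responses = defaultdict(Counter)
for prompt, reply in corpus:
    responses[prompt][reply] += 1

def most_likely_reply(prompt):
    """Return the statistically most common reply seen for this prompt."""
    counts = responses.get(prompt)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_reply("how are you"))  # the more frequent reply wins
```

The point of the sketch is that "choosing the statistically most prevalent response" requires no understanding at all, which is the heart of the autocomplete comparison.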
We imagine there's a mind at work, when in many cases no such mind exists. I still think this raises questions about what it will take for us to conclude that an AI has gained some form of consciousness, and I don't know what the answer is to that. But then I'm not an AI expert, and I imagine the folks who are really deep in this research have a much better grasp of what it would take for us to consider any evidence to be close to conclusive.

Speaker 1: Google has agreed to a one hundred eighteen million dollar settlement in a class action lawsuit brought against it by employees, namely women who worked for Google since two thousand thirteen. And these are women who say they were paid significantly less than their male counterparts who were working in the same kind of jobs. Specifically, on average, women were paid sixteen thousand, seven hundred ninety-four dollars less than men in similar positions. That includes not just base compensation, but also stuff like bonuses and stock options.
In addition, the lawsuit accused Google of placing women in lower-level positions than men with similar experience, and claimed that when employees were leaving the company, Google failed to pay out the full amount that was owed to those employees. This was a pretty big lawsuit. There were more than fifteen thousand women who had worked for Google since twenty thirteen that were part of it. Four of the women will receive significant payouts. Kelly Ellis will receive seventy thousand dollars, three others will get fifty thousand each, and the rest of those fifteen thousand-plus women will get, on average, around five thousand dollars. The rest of the cash goes to, of course, you know, lawyers, legal fees, that kind of thing. So does this mean that we're going to see some massive changes at Google? Well, this was a settlement, not a legal judgment, and Google has not admitted to any kind of wrongdoing. Rather, Google reps say that the company frequently reviews compensation, and whenever it finds inequities, it makes quote unquote upward adjustments to remove those inequities.
I certainly hope that Google walks that walk.

Speaker 1: Google Talk, a.k.a. gChat, is finally riding off into the sunset. All right, settle in. This gets confusing because, let's face it, Google is not great at introducing and then continuing to support its various products. The company has an incredibly long list of services that have long since gone away. And let's just pause and think for a moment about some of those services, like Google Wave, Google Buzz, and Google Plus. So Google Talk, gChat, was an instant messaging service nested within Gmail. You could pop into your Gmail account and send messages to your various contacts through Talk rather than type out a full email. Well, Google would later introduce a similar service called Hangouts, and in two thousand thirteen Google pushed Gmail users from Talk to Hangouts. However, Talk itself technically still stuck around, not really as its own thing. People weren't going to that independently that much. Instead, it was a platform that other apps depended upon. Like, Pidgin was one of those apps, P-I-D-G-I-N. But on June sixteenth, Google is finally pulling the plug on Talk.
As for Hangouts, well, Hangouts technically became Google Chat, so what will it be tomorrow? I don't even think Google knows the answer to that.

Speaker 1: Speaking of legacy services getting the plug pulled on them, let's talk about Internet Explorer, once the king of all web browsers. Microsoft will finally end all support for Internet Explorer tomorrow, which is June fifteenth, twenty twenty-two, if you're listening to this news episode in the future for some reason. IE first debuted in nineteen ninety-five, and it was a major part of Microsoft's strategy as Windows was dominating the PC space. It was also a part of what would bring Microsoft under fire for anti-competitive practices, as other organizations like Mozilla would argue that Microsoft was discouraging people from using any browser other than IE, and that customers were being forced to buy two products even if they just wanted one. They had to get both Windows and Internet Explorer because they were bundled together; they were inseparable.
So these were some arguments being made, and ultimately the initial court demanded that Microsoft not just stop all this, but that the company be broken up by the government because it represented an anti-competitive monopoly. But that decision would later be overturned on appeal. Anyway, IE was a dominant browser for a few years, although Chrome would eventually overtake and pass Internet Explorer around two thousand twelve, and from then on IE had a long, slow decline. In twenty fifteen, Microsoft introduced the Edge browser, which was meant to be the successor to Internet Explorer, and Microsoft has been pulling support for IE across various platforms for the last couple of years, so this isn't like a massive surprise or anything. And tomorrow, all support ends. IE is officially discontinued. I suspect some of you all listening have probably never even used Internet Explorer, and to you, I say, you aren't missing much. I mean, I do remember a time when a certain web-based interface that I had to use was only compatible with Internet Explorer, and how frustrating that was.
It was the only reason I would keep Internet Explorer on my computer. There was a CMS, a content management system, that I had to work on about a decade ago, and it would only properly work with IE. That was frustrating. So yeah, there are some legacy systems that may still require IE because they never got updated to be more compatible with other browsers. But for the most part, you can say Internet Explorer is finally going to be dead as of tomorrow.

Speaker 1: Microsoft has entered into a labor neutrality agreement with the Communications Workers of America. That, in turn, is the union that has been assisting Activision Blizzard employees in their various efforts to unionize. All right, so quick refresher: Activision Blizzard is a massive video game company, and it has had a tumultuous couple of years for lots of reasons.
I mean, there has been a largely negative reception to some of the high-profile games the company has released recently, there have been serious and disturbing allegations regarding a toxic corporate culture that facilitates harassment and sexual discrimination, and the company has allegedly discouraged employees from organizing, essentially using union-busting tactics. And of course, Microsoft is on track to acquire Activision Blizzard, which would seriously boost Microsoft's already impressive presence in gaming. So Microsoft agreeing to remain a neutral party is an important step for Activision Blizzard employees. Microsoft itself traditionally hasn't been super keen on unions, like a lot of other tech companies, and companies in general, really. But more recently, Brad Smith, the president of Microsoft, said that the company would be supportive of potential unions, and this will mean that once Microsoft completes its acquisition, employees at Activision Blizzard should theoretically encounter no resistance or discouragement if they should wish to pursue organizing into a union.
That's pretty encouraging news for those employees, although that acquisition still has to go through several more stages before it is official. Well, we've got some more news items to cover, but before we get to that, let's take a quick break.

Speaker 1: We're back. MacRumors reports that LG is working on the second-generation mixed reality headset for Apple, which is probably news to a lot of folks, because Apple has yet to officially announce the first generation. That said, it's been a poorly kept secret for a few years now that Apple has been working on some sort of mixed reality hardware. However, we have never seen Apple acknowledge this publicly in any of its marketing events or conferences that it has held since those rumors started to circulate. Now that, in itself, isn't unusual. Apple has a reputation for secrecy, and longtime Apple fans know that the company has sat on plans for years before finally being ready to move on them.
That doesn't stop leaks from occasionally happening, sometimes from within Apple itself, sometimes through one of Apple's manufacturing partners. But anyway, the rumor is that LG is going to provide micro-OLED displays for the second-generation headset, and that LG also contributed components to the first-generation system. MacRumors says that, according to analyst Ming-Chi Kuo, this second-generation device will not hit the market until the second half of twenty twenty-four. I'm sure several people were disappointed that Apple made no mention of mixed reality during WWDC twenty twenty-two, that's their Worldwide Developers Conference. But maybe we will finally get a glimpse of the mixed reality headset this fall, which is when Apple typically holds its iPhone event. Most rumors say that the mixed reality headset was going to be ready for sale by the end of this year, you know, for the holiday season. So if we don't see it by September, it sounds to me like it's not going to be a thing at all. So we'll have to wait and see.
Speaker 1: Telegram founder Pavel Durov is throwing hands at Apple, saying that the company purposefully restricts web app features. So here's what Telegram's issue actually is. The messaging and chat service has an iOS app, but that app sometimes has issues because Telegram has kind of a hands-off approach to moderation; it does not restrict content in public channels. That is bad form in the eyes of Apple. Apple wants content moderation. They want to ensure that the material being looked at on their phones meets Apple's high standards, even if it's not coming from Apple itself. So the Telegram app isn't the fully featured Telegram. However, Telegram also has a web-based version that users could go to instead. They could just navigate there via Safari. But then, Apple requires all iOS developers to use Apple's WebKit, which means the web-based version of Telegram is lacking certain features, like push notifications, although the latest version of iOS will enable those.
So in the UK, Apple faces more antitrust reviews, with various agencies and companies accusing Apple of purposefully restricting competitors from being able to provide alternatives to Apple's own services on iOS devices. That has been a major ongoing story for Apple, this battle with regulators, especially in the European Union. Meanwhile, the EU is also moving to hold big tech companies more accountable for creating and enforcing policies, quote, regarding impermissible manipulative behaviors and practices on their services based on the latest evidence on the conducts and tactics, techniques and procedures, TTPs, employed by malicious actors, end quote. So that includes stuff like misinformation campaigns that are pushed by legions of fake accounts, but it also includes more recent tactics, things like deepfake images and deepfake videos. Should companies fail to adhere to this code, they could be liable for fines of up to six percent of global turnover. And in this context, you can think of turnover as more or less being global revenue.
So for the really big big tech companies out there, six percent would be in the billions-of-dollars neighborhood, which is serious stuff. So there are major companies that have already kind of agreed to these kinds of demands in the past, like Meta and Twitter. We'll have to see if that continues, and what, you know, what tactics the sites and services will employ to detect and remove this kind of content in the EU.

Speaker 1: We've been seeing signs in the tech sector for a while now that companies are getting a bit concerned, from hiring freezes to layoffs to rescinded job offers. The trend appears to be a slowdown in the tech industry as companies deal with a slowing economy. Since many analysts treat the tech sector as kind of a canary in a coal mine, this has led to predictions of a larger slowdown and downsizing in general across different industries. And startups might find it harder to get off the ground as investors have a more difficult time borrowing money that they can then pour into moonshot opportunities. So things are probably going to get worse before they get better.
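To put the six-percent-of-turnover fine mentioned a moment ago into numbers, here is a quick back-of-the-envelope sketch. The revenue figure is a made-up placeholder, not any actual company's turnover.

```python
# Maximum fine under a cap of six percent of global turnover.
FINE_RATE = 0.06

def max_fine(global_turnover_usd):
    """Largest possible fine for a given annual global turnover."""
    return FINE_RATE * global_turnover_usd

# Hypothetical company with $100 billion in annual turnover:
print(max_fine(100e9))  # 6000000000.0, i.e. six billion dollars
```

That is why, for the largest tech companies, the exposure really does land in the billions.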
Speaker 1: So I do have a couple of cool items to close out this episode. One is that researchers at MIT have developed a sensor that can tell the difference between bacterial and viral pneumonia. And you might think, well, what's the big deal there? Well, here's the problem. Pneumonia can be caused by either a bacterial infection or a viral infection. There's not just one pathway to pneumonia. And you can treat a bacterial infection with antibiotics, but antibiotics do not have an effect on viruses. So doctors need a way to differentiate the type of infection that caused pneumonia in order to prescribe an effective treatment for it; otherwise, you can only treat the symptoms. Plus, you know, you don't want to overuse antibiotics, because through overuse, bacteria can develop resistance to those kinds of medications. That makes future treatments much more challenging.
Anyway, the researchers created a nanoparticle sensor that looks for something really interesting. Rather than trying to seek out the pathogen itself, you know, that is, rather than looking for the root cause of the pneumonia, the sensors actually detect the body's response to the infection. The research team identified thirty-nine different enzymes that react differently to different kinds of infections. So by detecting what the body is doing in response to the infection, the sensors can help scientists determine the nature of the infection itself. And I think that's a really cool way to tackle the problem. So far, the team has seen successful results while testing on mice. Of course, there's a pretty big leap between a mouse and a person, at least for most people, so a lot more work is going to have to be done to see if this method will translate to human patients. If it does, it could become a standard practice for treating pneumonia, as doctors determine if antibiotic or antiviral medication is best suited for the individual case.

Speaker 1: And finally, SpaceX is just a little bit closer to Mars.
US regulators have sent SpaceX a message: as long as the company can guarantee compliance with seventy-five mitigating actions, the company's enormous Starship rocket should receive clearance from those regulators to blast off toward Mars. So the mitigating actions range from stuff like ensuring local wildlife habitats remain protected, to cleaning up after launches, to even avoiding launches on weekends and holidays, because locals in Texas would like to be able to access the beach that's close to the launch site, and that beach would be shut down on any launch day. So just launch during the work week, you know, when people should be at work. Anyway, there are still more reviews to follow from other agencies, so this is not a done deal. It's not carte blanche for SpaceX just yet. The company is going to have to pass safety tests and lots of other reviews before it can actually go forward with test launches of the spacecraft. But it is one more step toward the company getting permission to head off to the Red Planet. And that's it for the news for Tuesday, June fourteenth, two thousand twenty-two.
If you have anything you would like to send me, you can do so either with the talkback feature on the iHeartRadio app, just go to TechStuff, use the little microphone icon, and you can leave up to a thirty-second voice message for me, or, as always, you can reach out on Twitter. The handle for the show is TechStuffHSW, and I will talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.