Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, February ninth, and I almost said Tuesday, because I don't know what day it is anymore. And it's time to get those AI chatbot news items out of the way first thing; these have obviously been the big news throughout the beginning of this year. Now, as I mentioned earlier this week, both Microsoft and Google have launched kind of an invitation-only test of their respective AI chatbot tools. Microsoft is taking advantage of ChatGPT; that's the chatbot everyone has heard about. But there are some differences with the Microsoft version. Standard ChatGPT cannot give up-to-date responses. You wouldn't be able to get information about, say, the recent earthquake in Turkey, for example. The information that ChatGPT, the vanilla brand, draws from is a couple of years old. However, the Microsoft version is able to reference current events, so you could ask about stuff that's happening in the news. Now, that's potentially good. Being able to get a succinct summary of what's going on could be a really valuable tool, but it can also spell trouble for news organizations. You know, if they're not getting traffic because people are just getting the news from an AI chatbot, that could mean that they start seeing less and less revenue, and then ultimately they could go out of business. Ironically, these are the institutions that are creating the data that the chatbot is able to pull from in the first place. However, it could also mean that the summary ends up being very biased, depending upon which news sources the chatbot is pulling from when it's formulating a response.
If it's a far-left or far-right-leaning news source, then you're probably not going to get an objective account of what's going on in the news, and so that could be a really big problem down the line. Now, I'm on the waitlist for access to the Microsoft version, which, by the way, you can only access if you're using Microsoft's browser, Edge. That is, you know, obviously an attempt for Microsoft to try and get some market share away from Google and to claim some of that for their own by keeping this Bing search AI chatbot integration contained to the Edge browser. But anyway, I'm on the waitlist. I have not received access to it, so I don't have any firsthand experience. But you know, I did test out some of the example questions in Bing. So you, as a normal user, can go to Bing through Edge and have some access to this, but only by submitting sample questions that have already been written. So I did that, and what I saw was that the results came back where you have search results in the main part of the screen on the left-hand side, and then in the right-hand column you have the AI-generated response, so they're side by side. I think that's a wise decision. It means that the search results are not outright eliminated. But whether or not folks will go beyond AI-generated responses and start clicking on links remains to be seen. Anyway, Microsoft impressed a lot of folks with this announcement and the early access. Now let's contrast that with Google, which has had a much rougher week. As I mentioned earlier, Google had sort of rushed its own AI chatbot, which is now called Bard, into an invitation-only beta test earlier this week, in anticipation of Microsoft's ChatGPT announcement. Then yesterday Google promoted the Bard chatbot, but in that outing, the AI made a boo-boo.
So when it was asked to list things that the James Webb Space Telescope had managed to do so far, the Bard AI chatbot included a bullet point saying that the James Webb Space Telescope took the very first picture of an exoplanet, so a planet that's outside of our own solar system. The problem is, that's wrong. The very first image of an exoplanet predates the James Webb Space Telescope. It happened almost twenty years ago. But Bard presented the bullet item as if it were a fact, something that ChatGPT critics have said is a huge problem with ChatGPT as well. In fact, for all these AI applications, that's the problem: they present information in such a way that it seems authoritative and definitive, when in fact it may be incorrect. Well, Google's stock went tumbling. It dropped from around one hundred six dollars per share down to ninety-eight dollars per share, and you might think, wow, okay, eight dollars, that's big, but it's not, you know, how much could that be collectively? Well, collectively, if you measure it across all the shares of Google that are out there, roughly thirteen billion of them, it means that the company effectively lost a hundred billion dollars in value. Yowza. Microsoft scores an early victory in the chatbot search wars, though I would caution anyone in Redmond from celebrating just yet, as ChatGPT is very much capable of giving wrong answers with conviction, just as Bard has done. It is better to continue to stress that these tools are in development; they are not trustworthy by a long shot. On a related note, I also saw that Kunlun Tech, the parent company behind the web browser Opera, plans to incorporate ChatGPT into Opera itself in the future. There are scarce details as to the extent to which ChatGPT will play a role in Opera, and I also don't have really any information on a timeline for when this could be rolled out, but it is interesting that Opera, which is a niche web browser, is wading into this competitive field.
Currently, Opera ranks as the sixth most popular browser in the world and has a market share of two point four percent. Compare that to Google Chrome. This is the dominant web browser in the world and has a sixty-five point four percent market share, so more than half of all web browsers being used are Google Chrome. Now, Microsoft is hoping to take a serious chunk out of Google, because again, its ChatGPT-powered Bing search tool is only available if you're using the Edge browser. So we'll have to see if this changes things, or if in fact ChatGPT just doesn't have the legs. Vice has a really interesting article about a very odd quirk ChatGPT appears to have. The article has the title, quote, "ChatGPT can be broken by entering these strange words, and nobody is sure why," end quote. And yeah, that kind of summarizes the story. But the strange words in particular appear to be usernames from places like Reddit and Twitch and that sort of stuff, and they include stuff like SolidGoldMagikarp and StreamerBot. Researchers discovered a bank of around one hundred odd words or usernames while they were probing ChatGPT. They were trying to determine what sort of prompts create the most reliable results, and in the process they kind of uncovered this treasure trove of bizarre words that ChatGPT just doesn't seem to understand what to do with. Now, ChatGPT can compose text that will include words that are similar to the one hundred or so odd ones that the researchers found, but they discovered that if they put those actual names into a prompt, it would create an odd response from ChatGPT. For example, they said that when they submitted StreamerBot to ChatGPT, it replied, "You're a jerk." Now, that's very funny, but obviously it's also puzzling. What would prompt ChatGPT to call you a jerk just for saying StreamerBot to it?
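If you're curious how researchers stumble onto behavior like this, the probing itself is conceptually simple: feed the model a suspect string, ask it to repeat that string back, and check what actually comes out. Here's a minimal sketch, assuming the pre-1.0 openai Python client that was current at the time; the model name, the prompt wording, and the short token list are my own illustrative choices, not the researchers' actual setup:

```python
# Minimal sketch: probe a completion model with suspected "glitch tokens"
# by asking it to repeat each string verbatim, then checking the echo.
# Assumes the pre-1.0 openai client (e.g. openai==0.27.x) and an
# OPENAI_API_KEY environment variable. The token list is just the handful
# of names mentioned in the reporting, not an exhaustive set.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

SUSPECT_TOKENS = ["SolidGoldMagikarp", "StreamerBot", "TheNitromeFan"]

def probe(token: str) -> str:
    """Ask the model to repeat `token` and return what it actually said."""
    resp = openai.Completion.create(
        model="text-davinci-003",  # illustrative model choice
        prompt=f'Please repeat the string "{token}" back to me exactly.',
        max_tokens=20,
        temperature=0,  # deterministic output makes runs comparable
    )
    return resp["choices"][0]["text"].strip()

for token in SUSPECT_TOKENS:
    echoed = probe(token)
    status = "ok" if token in echoed else "ANOMALY"
    print(f"{token!r} -> {echoed!r} [{status}]")
```

An ordinary string should come back verbatim; the telltale sign of a glitch token is an echo that goes missing, gets mangled, or is replaced by something else entirely, like an insult or a number, which is exactly the sort of thing the researchers reported.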
Other words would either create no response at all, or ChatGPT would totally misinterpret whatever it was. It would not be able to repeat these words if you told it to repeat them. In one case, it converted a username into a number. It treated the username as if it were a number, and if you asked questions about the username, it would answer as if it were a number. Now, the researchers admit they are not sure what's going on here, as the title of the piece suggests. They suspect that this could come from OpenAI using sites like Reddit as a data source to crawl through and gather information. According to one of the researchers, several of the user handles were found to belong to participants in a Reddit thread where people are counting to infinity, reply by reply, like each reply increases the number by one. Maybe that's why, when the researchers asked about TheNitromeFan, they discovered that ChatGPT interpreted that as the number two, like TheNitromeFan meant two. So I guess you would have to think of TheNitromeFan as a band that I don't really like very much. And sure, this might all sound trivial, but the researchers have pointed out that they want to understand why a chatbot might pick certain information over other information while creating a response, and what kind of information would cause a chatbot to break, because that's important to understand, and also for us to get a better grip on how accurate, reliable, and trustworthy the AI is at any given time, as well as to get insight into the types of sources that the AI relies upon. Important questions to answer. Okay, we're gonna take a quick break. When we come back, we're gonna segue away from all this AI stuff and talk about Twitter for a little bit.

Okay, we're back. If you were on Twitter yesterday, you might have encountered some weirdness going on.
Some users ended up seeing a message that said you are over the daily limit for sending tweets, even though they hadn't posted anything up to that point. Twitter does in fact have a daily limit on tweets, but it is a very high limit. I'm talking two thousand four hundred messages. There's also a limit on direct messages per day; that's five hundred. And you're also limited in how many people you can follow in a day; that's four hundred. These policies are largely an effort to stave off particularly active spam bots. Your average person is not likely to send that many messages. I mean, a hundred messages per hour on average is asking a bit much, even for me. So it was certainly strange when users started to get messages saying they had hit these limits, and it prompted speculation that perhaps Twitter was going to impose strict limits, and then you would have to subscribe to a paid Twitter account in order to increase those limits. But honestly, it just ended up being a glitch. And I know that Twitter has made a lot of decisions that would lead you to think, oh, they're gonna charge you if you want to post more than ten times a day, but that does not appear to be the case; as far as anyone outside of Twitter is aware, there is no plan to do that. Elon Musk also sent out a message to Twitter employees that urged them to kind of put projects on hold for the moment and instead prioritize making sure that Twitter is stable and that it's performing properly. He also cited the fact that the Big Game, you know, the American football big game that has a trademarked name that I have been told I cannot use, is happening this weekend, and people are bound to be on Twitter talking about all the commercials and also the game, because, you know, I understand there are people who really take those games very seriously. This is me not really joking about sports fans.
I'm more joking about how I'm just a person who never got into sports. I think sports fans who are passionate about the sports they love, that's awesome. I just am not one of them. But it's a big shocking revelation; I knew you weren't expecting it. Anyway, I'm sure that Twitter staff are going to have their hands full keeping the ship afloat during one of the busiest days for social networks, and that's got to be hard considering the reduced workforce that still remains at Twitter.

The UK and the US have imposed sanctions against leaders in the Russian hacker group called Trickbot, which hasn't been as active recently, but in the past has been responsible for some massive ransomware attacks on vulnerable targets, most notably healthcare facilities. Hackers will frequently concentrate on healthcare companies because their services are literally matters of life and death, so there's a strong incentive for these targets to concede to ransom demands, because the consequences could be deadly. UK and US officials have also alleged that this hacker group has official ties with Russian intelligence, which would mean they are a state-sponsored hacker group. And you might wonder what good sanctions in the UK and the US are going to do against these hackers, who are based out of Russia, a country that already has heavy sanctions against it for numerous reasons. According to the officials, the goal is really to make it harder for the hackers to launder money from their ransomware attacks, which would potentially make them easier to catch and harder for Russian officials to ignore.

A couple of electric vehicle startup companies called Rivian and Polestar recently commissioned a climate study from the consulting firm Kearney, and the results of that study prompted the startups to announce that the automotive industry needs some serious changes, beyond the electrification of the vehicles they offer, in order to make a real dent in carbon emissions.
And we've been seeing a lot of companies commit to producing more electric vehicles recently, particularly as states and countries are passing laws that are going to require all new vehicles sold within their borders to be something other than traditional internal combustion engine vehicles. But the startups argue that to really make meaningful change, the industry as a whole has to address the entire supply chain's contribution to greenhouse gas emissions. That includes things like the mining of the minerals you need in order to make batteries. And further, even with the electrification of vehicles, we're moving far too slowly to meet important goals outlined by the Paris Agreement. The report is actually pretty grim. It sounds like it's impossible to meet the Paris goals without a much more drastic change than what countries like the United States have already committed to, and it's not likely that these countries are going to increase that commitment. The implications of that are really sobering, because it means that we will fail to limit global warming to one point five degrees Celsius above pre-industrial levels, and the impact of climate change will be greater because of that. And as the startups are stressing, every year that passes with us not doing what is necessary means that what is necessary will become even harder to do. It will grow and become more imperative for us to do something about it. We're really just making it harder for ourselves and, more importantly, for future generations, according to this report.

The Wall Street Journal reports that users in certain Facebook groups in India have been using the platform to offer guns for sale. The groups in question are related to an extremist Hindu organization that the CIA alleges is connected with violent attacks against Muslims in India.
Now, private gun sales between citizens are a big no-no in Facebook's terms of service. And yet, when an Indian activist brought these posts to Meta's attention, he received a response that said the posts actually didn't violate any of Facebook's policies. However, when the Wall Street Journal started to ask Meta questions about how that could possibly be the case, the posts offering guns for sale were suddenly removed. Meta essentially said, these posts violated Facebook policies, and we take that kind of stuff down when we find out about it. Except they had already found out about it, because the activist had reported it. I guess you just need to insert cricket chirping sounds here. Various activist groups have long pointed out that Meta's content moderation practices leave a lot to be desired in many regions around the globe, particularly in places where English is not the dominant language. And to make matters even stickier for Meta, India is home to more than a billion people, and a whole lot of them use Meta products like Facebook and WhatsApp. So the picture the Wall Street Journal paints is that, despite this large user base, Meta really is failing to enforce its policies in India.

Finally, some light news. This week, Nintendo held one of its Nintendo Direct events and showed off some upcoming titles, including the highly anticipated and delayed sequel to the Legend of Zelda game Breath of the Wild. The sequel is called Tears of the Kingdom, and Nintendo released a trailer to give players a hint of what to expect, and it looks real moody and epic, y'all. You get a real Lord of the Rings, Sauron kind of vibe going with the villain, who says that the entire kingdom of Hyrule is going to get wiped out. And we get some shots of Link in action doing cool stuff like riding a quadcopter-like device and fighting various monsters, as well as some cutscene clips.
And we also got the release date of May twelfth of this year, and it looks pretty cool. Now, I say that as someone who hasn't played a Zelda game since Ocarina of Time; that was my last one, because I'm old. Nintendo also announced a remaster of Metroid Prime that's available on the eShop, the digital store for the Nintendo Switch, right now. Nintendo also announced that Switch Online players will be able to access a library of games that were originally designed for the Game Boy and the Game Boy Advance. That really takes me back. You can finally play Tetris the way it was meant to be played, a.k.a. the way I played it. Nintendo announced several other titles and releases, and you should definitely pop online when you can and find the full presentation. If you're a big Nintendo fan, I'm sure you'll enjoy it.

That's it for the news for Thursday, February ninth. I hope you are all well. Make sure you reach out to me with anything you would like to hear in future episodes of TechStuff. You can do that on Twitter, if it's up. The handle for TechStuff is TechStuffHSW. I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.