Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a lover of all things tech. And this is the tech news for Thursday, February 18, 2021. Starting with this episode, the news episodes are gonna get a little shorter so I can really focus on a few big headlines, as opposed to trawling the news and trying to pad out an episode. I feel like that's not doing anyone any favors. So we're going to really focus from here on out. So let's get started.

Researchers with MIT, Harvard University, and ETH Zurich are working on using machine learning to help treat patients with COVID-19. Just quickly, machine learning refers to a field of study within computer science where you're using algorithms that can self-improve over time. The goal with this particular study was to find effective treatments for severe COVID-19 symptoms within vulnerable populations, specifically elderly people who have COVID-19.
The researchers identified a potential starting point related to lung tissue, because as we age, our lungs tend to become stiffer, and that condition can make respiratory illnesses more difficult to manage. The researchers wanted to see if there are any existing medications that might be effective for treating those symptoms. While other teams of doctors and researchers are working to develop new medications all the time, for COVID-19 and everything else, that process is very slow, and for good reason: you have to make sure that the treatment has a high efficacy, and you have to learn what side effects the medication might have. Existing medications have already gone through this testing process, so we have a good understanding of what those medications do and their potential side effects, but we don't necessarily know if any of them would be particularly helpful in treating COVID-19 symptoms. So the team decided to look at genes and proteins that are related to aging in general, and to this tendency for lungs to grow more stiff over time in particular.
They're using machine learning algorithms to whittle down a large list of medications that might help to address the expression of those genes, perhaps even going back several steps to the genetic root cause, because you can really think of this as, you know, a series of things that lead to this outcome. The algorithm looked at how various drugs affect the expression of these particular genes and cross-referenced that with another data set about how genetic expression would respond after a COVID-19 infection, and then the algorithm looked for drugs that might have an impact on gene expression that could in turn decrease the severity of COVID-19 symptoms. The team is sharing their information with pharmaceutical companies, which can then begin their own clinical trials to test the results against, you know, reality. And if it works, it could make an enormous difference in the quality of life of elderly COVID-19 patients and potentially decrease mortality rates.

Over at YouTube, the company recently shut down nearly three thousand channels that it says were part of a state-backed influence operation.
Now, the states in question were Russia and China. YouTube says that most of the channels were part of Chinese programs meant to spread propaganda about stuff like US politics and the COVID-19 pandemic, all of it buried deep in tons of videos about far less weighty matters, like, you know, celebrity gossip and pop culture topics. We're seeing a more proactive approach from YouTube after years of criticism about how the company's moderation policies allowed misinformation campaigns to run wild and let videos promoting extremist views get a foothold on the platform.

Meanwhile, over at Waymo, another company in the Google family, engineers are getting ready for a really big test. Waymo is in the autonomous vehicle business, in case you didn't know. It had previously conducted tests of self-driving cars primarily in Phoenix, Arizona, but now the company is preparing to launch an autonomous vehicle testing phase in the San Francisco area. Waymo hasn't built its own vehicles. Instead, it took existing models from other car companies and then changed those models to make them self-driving.
The Jaguar I-Pace, which is an electric SUV, and the Chrysler Pacifica, which is a kind of minivan, make up the models in the fleet of autonomous cars. The goal is to build out an autonomous taxi service in the future, but Waymo reps are really quick to point out that there's still a long way to go before we get there. The tests in the Bay Area will still include a human driver, or quote, "single vehicle operator," as a Waymo rep told VentureBeat. The company said that the test will last several weeks. It begins this week, and Waymo has already conducted, and continues to conduct, tests in other parts of California. This test won't be open to the public, so you're not gonna find yourself getting into a Waymo driverless taxi just by happenstance. If you happen to be a Waymo employee, maybe then you can participate in the study. While Waymo has been working hard to improve its technology, it also has had a small number of cases in which a human operator felt it necessary to take control of the car. It's a pretty rare thing, however.
Engadget reports that Waymo filed a report with the California Department of Motor Vehicles explaining that there were just twenty-one cases in which an operator felt it was necessary to take over control of the vehicle, over a test period that included some six hundred twenty-nine thousand miles driven by autonomous vehicles. Still, when it comes to autonomous cars and safety, the bar is incredibly high, and you could argue it's justifiably so, because we've seen what kind of tragedies can happen when an autonomous or semi-autonomous system fails to prevent an accident.

Over in Australia, the government is preparing to vote on a bill that would require internet companies like Google and Facebook to pay journalistic outlets and media companies for content appearing on those tech companies' platforms. So the argument is that if a company like Facebook makes use of content from an Australian media company, then Facebook should pay that media company for use of that content. The legislation would create regulations forcing tech companies to negotiate with media companies and agree upon a rate, which might be a lump sum or it might be a per-click rate.
It all depends on how the legislation gets hashed out. The Treasurer of Australia developed this idea after a study showed that companies like Facebook and Google receive a really big share of advertising dollars, while much of the content that was appearing in Australia was from Australian media organizations. But the tech companies have objected to the wording of the proposed legislation, and now Facebook says it will not allow Australian users and media companies to share links to news articles on Facebook. The managing director of Facebook Australia and New Zealand wrote, quote, "The proposed law fundamentally misunderstands the relationship between our platform and publishers who use it to share news content. It has left us facing a stark choice: attempt to comply with a law that ignores the realities of this relationship, or stop allowing news content on our services in Australia. With a heavy heart, we are choosing the latter," end quote. The sarcastic tone is purely editorial.
Further, Australian users will not be able to see posts that include links to international news sources, and international users won't be able to view or share Australian news content on Facebook. Effectively, Australian Facebook users are in a media blackout while they use Facebook. Now, this is a really complicated issue, one that actually goes beyond technology, but I do think this move might add more fuel to the fire for criticisms that companies like Facebook and Google are monopolistic or anti-competitive. And it also kind of devalues Facebook's News Feed if no news is allowed to be in it, huh?

Hey, do you remember the Sony Pictures hack back in 2014? Quick refresher: a group of hackers who identified themselves as the Guardians of Peace infiltrated Sony Pictures' systems and stole a crap ton of data, including copies of unreleased films, emails, and personal information about Sony Pictures employees. The group indicated that the hack was a response to Sony Pictures producing a movie called The Interview. The plot of that comedy revolves around an assassination attempt on Kim Jong-un, the leader of North Korea.
Sony ultimately canceled the theatrical release of that movie and switched to a digital method of distribution. And here we are seven years later, and the U.S. Department of Justice has charged three hackers that it says were involved in the Sony Pictures hack, as well as some other cybercriminal activities, like the development and distribution of the WannaCry ransomware and hacks on various cryptocurrency exchanges. The DOJ says that the three hackers all belonged to hacking units that fall under the authority of the North Korean military. In addition, the DOJ revealed that a Canadian-American citizen has pled guilty to charges of money laundering on behalf of North Korean hackers. Now, if you did follow the Sony Pictures case when it happened, you probably remember that North Korean officials denied that there was any connection between the North Korean government or military and these hackers. You probably also remember that a lot of cybersecurity experts were, let us say, skeptical of that claim.
These charges stand in direct opposition to North Korea's denials. And as for the charges themselves, it's very hard for me to imagine a scenario in which North Korea would ever agree to extradite those charged to stand trial for the allegations, so really these charges are more of a name-and-shame approach.

In the world of tech conferences, there are a few standouts that merit special attention. CES is a big one, as is E3, but a third one is Mobile World Congress, which holds events in different parts of the world throughout the year but reserves its main exhibition for Barcelona, Spain. And as the name implies, the trade event focuses on mobile devices and apps for those devices. Now, in a normal year, more than a hundred thousand people attend the event. The GSMA, which organizes the conference, canceled the Barcelona event in 2020 due to the COVID-19 pandemic, but plans are in place for the 2021 event to happen, though it is going to take place later in the year by swapping places with the Mobile World Congress Shanghai event.
In addition, attendees won't be required to receive a COVID-19 vaccination prior to going, probably because there's still a lot of uncertainty about who will be able to receive a vaccine, and when, in many parts of the world. Instead, attendees will have to test negative for COVID-19 within seventy-two hours of the event kicking off in June. The GSMA also says that it will limit attendance and expects no more than fifty thousand people to go. Now, I mean, I sincerely hope that this event and the Shanghai event are safe for everyone concerned, but I have to admit that these kinds of big events still make me a little nervous at a time when we're not really sure what the status is going to be for vaccinations.

And finally, engineers with the University of California San Diego showed off a really cool soft robot. It uses pressurized air to provide the power it needs to move around, and most astonishingly, it has no electronic components on board. Everything relies on tubes and valves to send pressurized air to specific limbs. The robot has four legs.
The team based the robot's motion off of an African sideneck turtle, utilizing quote, "diagonal couplet gait patterns," end quote. Now, that essentially means that the turtle walks by moving diagonally opposite limbs at the same time: if it's taking a step forward with its right front leg, it also moves its left back leg forward at the same time, and vice versa. The robot does this with a valve system controlling which limbs receive air, causing them to extend. The valves have a delay, so the air flows into pairs of limbs at different times, and the coordination of this results in the robot walking. Although walking is a generous term; it's more of a coordinated wobble. But then, most days that's all I can manage too, so who am I to talk? Soft robots could have a lot of really cool potential uses, including interacting in human environments, where a soft robot poses less of a risk to us squishy humans. Think Baymax from Big Hero 6.

And that wraps up this quick-fire news round for Thursday, February 18, 2021.
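[Editor's aside: the diagonal couplet gait described in the soft robot story, where diagonally opposite legs move together and the delayed valves alternate which pair gets air, can be illustrated with a short timing sketch. This is a hypothetical toy model in Python, not the UCSD team's actual pneumatic control logic; the leg names and the "tick" abstraction are assumptions made purely for illustration.]

```python
# Toy model of a diagonal couplet gait: on each tick, one diagonal pair
# of legs is "pressurized" (extends) while the other pair relaxes, then
# the pairs swap. In the real robot this alternation comes from delayed
# pneumatic valves, not software.

DIAGONAL_PAIRS = [
    ("front-right", "back-left"),   # couplet A
    ("front-left", "back-right"),   # couplet B
]

def gait_schedule(ticks):
    """Return which diagonal pair is pressurized on each tick."""
    schedule = []
    for t in range(ticks):
        # Alternate between the two couplets, like the valve delay does.
        schedule.append(DIAGONAL_PAIRS[t % 2])
    return schedule

if __name__ == "__main__":
    for t, (leg_a, leg_b) in enumerate(gait_schedule(4)):
        print(f"tick {t}: pressurize {leg_a} and {leg_b}")
```

Each step of the schedule keeps two diagonally opposite feet planted while the other two swing, which is what makes the wobbly walk statically stable.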
We'll be back next week with more episodes of TechStuff and more news. If you have suggestions for things I should cover on TechStuff, please let me know. The best way to do that is on Twitter, with the handle TechStuffHSW. I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.