Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. Today we have the news for Thursday, February twenty-first, and we're gonna start off with more about SolarWinds. We've covered the SolarWinds hack on TechStuff for a while now. This is the hack that primarily targeted the SolarWinds software product Orion, which is network management software used by thousands of companies, including many of the Fortune 500, as well as government agencies like the Department of Homeland Security in the US. The United States Congress is holding sessions focusing on the hack, its scope, and any next steps that we should be taking. To that end, Microsoft and a security company called FireEye are calling upon Congress to draft legislation that would create a mandatory process for companies to follow after the discovery of a data breach.
The idea here is that the longer a company stays quiet about a breach, the more harm the attackers can do, particularly if it turns out that other companies have also been targeted by that same group and have yet to detect an intrusion. Moreover, the companies have urged that there should be a way for victims to come forward with these issues, even those that could potentially involve national security concerns, without the threat of having legal action leveled against them. So, in other words, if a defense contractor detects a breach, it should be able to report it without worrying about being sued to eternity and back by the government, just as an example. As it stands at the moment, in the United States this sort of thing is handled mostly on a state-by-state basis, with most states having their own rules in place to protect companies in the event that they need to report a security breach, but there is no existing federal legislation on the matter.
On the whole, I think this is a pretty good idea. It encourages companies and agencies to share information with one another, which could drastically improve the response time to problems like security breaches, and that in turn could reduce the potential impact of those breaches. Now, these are still very early talks, and it's also important to remember that any legislation has to be crafted responsibly, or else it runs the risk of making a problem worse, or at the very least more complicated.

Hannah Beech and Paul Mozur wrote a piece for The New York Times titled "A Digital Firewall in Myanmar, Built With Guns and Wire Cutters." If you have access to The New York Times, I recommend reading this piece. The reporters tell the story of how the military in Myanmar, which has forcibly overthrown a democratically elected administration, confronted telecoms and their employees in an effort to shut down communication systems within Myanmar, limiting citizens' access to information and their ability to organize.
It's a primitive but effective way to limit a resource like the Internet, and the military in Myanmar appears to be determined to follow in the footsteps of China, which of course is famous for its own digital barriers to the outside Internet. Often we call it the Great Firewall of China. The situation in Myanmar is distressing, to say the least, and people like Tom Andrews of the United Nations have argued that there has to be an unequivocal global response to this coup, and countries like the UK, the United States, and Canada have already imposed sanctions on the coup leaders. Meanwhile, Myanmar citizens continue to organize protests, amassing in the streets and holding a general strike with a truly enormous display on February twenty-second. This story is important all on its own, but it also stands as a reminder of how crucial the Internet is. If it's something that an oppressive regime has to shut down in order to secure control, then you know it's important.

Something else that's important is understanding how influential four companies are when it comes to the World Wide Web. That's what the Big Tech Detective browser extension aims to demonstrate.
The extension comes from the Economic Security Project, and it does something rather peculiar. Most browser extensions aim to make a browser more useful, but Big Tech Detective kind of does the opposite. What it does is block access to any web page that connects to any IP address that belongs to Microsoft, Amazon, Facebook, or Google. Now, that includes some of the more sneaky, tricksy stuff, you know, like trackers, which are the sort of things that monitor your browsing behavior and report back to another entity, usually so that these companies can market the most relevant ads to you. That's why, if you ever spend a few minutes looking at, I don't know, websites about pet care, you'll start to see more ads for pet care products and services pop up on Facebook. But a web page may contact one of these big four companies for less intrusive reasons, such as to pull fonts from a Google database. Anyway, it doesn't matter if the web page you're trying to look at is tracking you, or if it's just hosted on Amazon Web Services or whatever.
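The check, in other words, happens at the level of network connections rather than page content. As a minimal sketch of that idea (not the extension's actual code, and the IP ranges below are illustrative stand-ins, not its real lists), a blocker can resolve each host a page contacts and test the resulting address against known company ranges:

```python
import ipaddress
import socket

# Illustrative stand-in ranges; a real tool would ship curated lists of
# address blocks registered to Amazon, Google, Facebook, and Microsoft.
BIG_FOUR_RANGES = [
    ipaddress.ip_network("8.8.8.0/24"),    # example: a Google range
    ipaddress.ip_network("52.94.0.0/22"),  # example: an Amazon range
]

def is_big_four(host: str) -> bool:
    """Resolve a hostname (numeric IPs pass through) and check the ranges."""
    try:
        addr = ipaddress.ip_address(socket.gethostbyname(host))
    except (socket.gaierror, ValueError):
        return False
    return any(addr in net for net in BIG_FOUR_RANGES)

def page_blocked(hosts_contacted: list[str]) -> bool:
    """Block the page if any single host it talks to matches, even a font CDN."""
    return any(is_big_four(h) for h in hosts_contacted)
```

Because one matching connection is enough to block the whole page, even a site that merely pulls fonts or analytics from one of the four ends up unreachable under a scheme like this.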
If this extension sees that the web page is at all connected to Amazon, Google, Facebook, or Microsoft, it blocks you from seeing that web page. As Mitchell Clark of The Verge points out, this makes the web practically useless, and that's kind of the point. It's an exercise to show people exactly how influential these four companies are when it comes to the web. Practically every page on the Web has some sort of connection to these four companies. Now, it might be a light touch in some cases, or it might be a tight integration, but it really shows that those four companies wield significant influence on the web. And maybe that's not always a good thing. Oh, and as you might suspect, to get the extension for a browser like Google Chrome, you have to sideload it, meaning you're not going to find it on the actual Google extensions website.

In previous episodes of TechStuff, we've looked at how Facebook, Twitter, and YouTube have removed posts and banned users for spreading misinformation.
Well, now we can add TikTok to that list, as a report this week from the company shows that TikTok removed more than three hundred forty thousand videos from its platform for breaking the rules regarding election misinformation, disinformation, and manipulation. The transparency report covers TikTok's activities over the second half of twenty twenty. In addition to those videos that were taken down, the company also removed more than four hundred forty thousand videos from its recommendation engine. That means you wouldn't see those pop up as a potential "hey, why don't you watch this video next" kind of thing. Again, the reason for the removal was that the videos had been flagged as containing misinformation, presumably not bad enough that TikTok would need to remove the video entirely, I guess, but no longer in recommendations, which really decreases their discoverability. TikTok also eradicated one point seven five million accounts, which the company says appear to have existed only for the purpose of elevating specific misinformation messaging.
Whether we'll continue to see platforms like TikTok and its older social networking siblings stay on top of restricting the speed and spread of misinformation remains to be seen.

Twitter has now added a warning to alert users to tweets that link out to sites that host stuff that was obtained illegally, like pirated content. The message reads, quote, "These materials may have been obtained through hacking," end quote. So it's kind of like saying, okay, but for reals, these stereo systems didn't actually fall off the back of a truck like that guy keeps saying, they done dang stole 'em. In related news, people discovered that Twitter's methodology for identifying a bad link to hacked material is itself a bit shoddy. Tom Warren posted a link on Twitter to The Verge's website, and The Verge is a respectable tech news site. I use them all the time; in fact, I cite them in these episodes. And the warning showed up on that post. But as Warren explained, quote, "So there's a way to trick Twitter into displaying its hacked materials warning. This will be today's Twitter meme until they fix it," end quote.
The trick, by the way, was to type out a legitimate URL first, like www.theverge.com, then a slash, then a hashtag followed by a URL that would trip the warning message, so a URL to an actual site that had pirated material on it. The good URL is what would show up in the tweet as the link, but Twitter would identify the bad URL and use that to generate the warning message. So it's not the end of the world, but it's definitely enough fodder for people to have a bit of fun posting links to legitimate sources and sites but having them appear to be clearinghouses for illegal spoils on the Internet.

Sony announced that the company is working on virtual reality hardware for the PS5 console, which can still be pretty hard to get your hands on these days. The PS4 had proved to be a good platform for VR. Unlike other VR products, it didn't require users to invest a couple of grand in a gaming rig just to run the software and power the hardware.
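Coming back to that Twitter link trick for a second: my guess, and it is only a reconstruction rather than anything Twitter has published, is that a naive scanner pattern-matches every URL-shaped substring in the tweet's raw text, while the rendered link only uses the first one. A toy version of that mismatch:

```python
import re

# A deliberately naive URL matcher, roughly the kind of pattern a simple
# link scanner might use. This is a reconstruction, not Twitter's code.
URL_RE = re.compile(r"www\.[\w.-]+(?:/[\w./-]*)?")

# The trick: a legitimate URL, then a hash, then a URL that would trip
# the hacked-materials warning. The second domain here is made up.
tweet = "www.theverge.com/#www.pirated-loot.example/files"

matches = URL_RE.findall(tweet)
display_link = matches[0]  # what the tweet renders as its link
flagged = matches[1:]      # what a scan of the raw text also turns up

print(display_link)  # www.theverge.com/
print(flagged)       # ['www.pirated-loot.example/files']
```

The hash makes everything after it a fragment of the first URL as far as the browser is concerned, but a scanner that matches substrings still sees a second, warning-worthy URL.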
And this announcement seems to indicate that virtual reality is really establishing its place in gaming and home entertainment. Now, it's certainly not as widespread as we were led to believe it would be back in the nineteen nineties, when virtual reality was first really firing the imaginations of the general public. But I still take this as an encouraging move, because I like the idea of VR. I want to see it continue to grow. I want to see new experiences and games crafted for the virtual reality ecosystem, and I think having something that connects directly to a console, where you don't need to do any upgrades or anything like that in order to make it work, just makes a lot of sense.

In sadder news, the US electronics chain Fry's is going out of business. This in itself isn't a huge surprise; the company was already teetering before the pandemic. Like a lot of brick-and-mortar retail stores, Fry's saw a lot of its business siphoned away by that behemoth Amazon.
But back in the day, Fry's was one of those places where computer geeks could go in order to buy various components. You could stroll the aisles and select all the different pieces you needed to build a PC, and you could build it yourself for less than it would cost to buy a pre-assembled one built to your own specifications. But the process did require a bit of research if you wanted to avoid making embarrassing mistakes like, I don't know, picking a motherboard and a CPU that aren't compatible with each other. But who would do that? Not this guy, let me tell you. Anyway, it looks like it's the end of Fry's, and that is a real shame. Although, from what I understand, in more recent years the experience of shopping at Fry's had taken a bit of a dip. And even in its heyday, when I used to go to Fry's, it was kind of a scavenger hunt to find all the stuff you needed. It wasn't necessarily laid out in a way that was user friendly.
And finally, in California, a federal judge has denied the request made by telecom and cable companies to block the state from enforcing a net neutrality law. The law makes it illegal for telecom companies to favor their own services over those provided by another party. So, in other words, a company like Comcast would not be allowed to throttle a service like Netflix in an effort to persuade customers to use Comcast-owned video-on-demand services instead. A few years ago, this was also the stance of the FCC in the United States, the federal agency that oversees these kinds of matters, but those restrictions were phased out during the Trump administration. In the wake of that event, California passed its own state law regarding net neutrality. A collection of telecommunications and cable companies is suing the state over the matter, but the general consensus seems to be that that effort is destined to fail, and with the Biden administration now overseeing federal operations, the FCC as a whole might reinstitute some of the restrictions that the previous administration had ended.
It might be a good time for net neutrality, and a slightly tougher time for all those gigantic, almost monopolistic telecommunications companies that are dominating all realms of communications and media. I'm not crying over that, honestly.

And that wraps up the news stories for this Thursday, February twenty-first. If you guys have suggestions for future topics I should cover on episodes of TechStuff, let me know. Reach out to me on Twitter. The handle is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.