Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the news for Tuesday, February sixteenth, twenty twenty-one. Let's get to it, and we will begin with an update on the SolarWinds hack, a massive supply chain hack that's had a critically dangerous effect on numerous companies and governmental agencies, with the potential for a lot more damage further down the road. SolarWinds makes IT system management software, so you know, the type of stuff that network administrators might use to monitor and control really complicated computer networks. Hackers, likely backed by Russia, compromised a piece of software called Orion on the SolarWinds side of everything, and they inserted malware into the software code for the Orion product. So when SolarWinds pushed out updates to Orion, SolarWinds customers who accepted those updates, because SolarWinds is a trusted partner, unknowingly infected their own systems with a trojan type of malware.
That attack hit thousands of customers, but the hackers actually only followed up with a few dozen of them to further infiltrate and spy on targets that ranged from Microsoft to the United States Department of Homeland Security. Now, Microsoft says that after analyzing the code found in Orion, researchers reckon that maybe as many as a thousand or more developers had worked on that malicious code, which itself consisted of four thousand thirty-two lines of code. Now, more than a thousand developers, that is a truly huge endeavor, something far beyond independent hacking groups. Microsoft president Brad Smith indicated that this sort of attack was new to the United States, but Russia had previously employed a similar approach in cyberattacks on systems located inside Ukraine. If you want to learn more about the SolarWinds hack, I had an episode with hacker extraordinaire Shannon Morse. She joined TechStuff, and the episode is titled The SolarWinds Hack. It published on February first, and Shannon and I explored the scope of the SolarWinds attack and what it all actually means. Spoiler alert: it's not good.
And now let's switch to a collection of stories that I like to call Eat the Rich. We've got some more updates on that story of WallStreetBets and the GameStop stock, this time looking at how the United States government is getting involved. So, quick refresher: some hedge funds decided to short sell shares of GameStop. That's when you borrow shares of stock, you sell them at whatever the current market value is for that stock, then you buy back those borrowed shares so you can return them to the original owner. And what your hope is is that the price of the shares will drop so that you make money on the deal. So let's say you sell some borrowed stock for twenty bucks a pop, and you buy it back when it's at ten dollars a share. You just made ten dollars a share as you return the stock to the person you borrowed it from. But a group of independent investors, day traders, communicating on forums like Reddit and a subreddit called WallStreetBets, decided to foil this plan by buying up GameStop shares and encouraging others to do so.
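The short-sale arithmetic described above can be sketched in a few lines of code. This is just an illustrative toy, not anything from the episode; the function name is my own, and the thirty-dollar buyback in the second example is a hypothetical number to show the losing side of the trade.

```python
# Toy sketch of the short-sale arithmetic described above.
# Profit or loss per share is simply (price sold at) - (price bought back at).

def short_sale_pnl(sell_price: float, buyback_price: float, shares: int = 1) -> float:
    """Borrow shares, sell them, and buy them back later to return them.
    A positive result is a profit; a negative result is a loss."""
    return (sell_price - buyback_price) * shares

# The hoped-for case from the episode: sell borrowed stock at $20, buy back at $10.
print(short_sale_pnl(20, 10))   # prints 10 (ten dollars of profit per share)

# The case short sellers dread: the price rises instead. A hypothetical
# buyback at $30 means a ten-dollar loss on every shorted share.
print(short_sale_pnl(20, 30))   # prints -10 (a ten-dollar loss per share)
```

Note that the loss has no ceiling: the stock can, in principle, rise without limit before the shares are bought back, which is exactly what makes the squeeze described next so painful.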
Now that pushed the price of the GameStop shares up, not down. Then you had hedge funds freaking out, because they were going to be obligated to buy back those shares, they have to in order to return them, and now they would have to pay more money than they made when they sold off those borrowed shares. And the WallStreetBets folks kept buying up more stock as soon as it would become available, which pushed the price even higher, creating a short squeeze. It hit a high of around four hundred eighty-three dollars per share. This was well above the twenty dollars a share that it was at before all this happened. And that share price has actually settled down quite a bit. Right now, as I record this, it's around fifty-two dollars per share, but that's still more than twice what it was when all this started. And the US House of Representatives wants to talk to a few people about everything that's been going on.
Among those people are Steve Huffman, the CEO of Reddit, and Vlad Tenev, the co-CEO of Robinhood, an app that allows its users to buy stocks through the app. And also they want to talk to Keith Gill, who has the online moniker of Roaring Kitty. He's been invested heavily in GameStop since back in twenty nineteen, and he posted about it frequently. The Senate also wants to get involved. They want to determine if there was any manipulation of the stock market going on here. But on the surface, that doesn't seem to be the case, at least not for the price to go up. Sure, there were independent investors all working together to buy up shares of stock, but that in itself isn't manipulation. In fact, the WallStreetBets crew maintains that hedge funds that short sell companies frequently engage in activities that are specifically meant to devalue those companies' share prices. That seems to fall more under the manipulation umbrella.
Also on the table are talks about Robinhood, which famously put a freeze on how many shares of GameStop users could actually purchase, and in some cases just stopped anyone from being able to buy GameStop shares at all through Robinhood. The House will examine whether Robinhood broke any federal regulations in the process. As numerous media outlets have shared, a major hedge fund owns a significant stake in Robinhood, and the WallStreetBets crew has maintained that the app is kind of part of a system that rewards the wealthy at the expense of everybody else. The US Senate is also looking into the current state of the stock market in general, and the hearing in the House of Representatives should take place on February eighteenth, starting at noon Eastern time. They'll also be streaming it live, just in case, you know, you're interested in tuning in.

Researchers with the Duke University School of Business published a report that an elite group of companies is responsible for more than half of all the revenue generated from the ocean economy.
111 00:06:40,279 --> 00:06:43,400 Speaker 1: So we're talking about around a hundred companies responsible for 112 00:06:43,520 --> 00:06:48,680 Speaker 1: one point one trillion dollars in revenues, or about six 113 00:06:49,360 --> 00:06:53,960 Speaker 1: of all revenue from ocean based economic activities. This is 114 00:06:54,000 --> 00:06:56,360 Speaker 1: based off data that's actually a couple of years old, 115 00:06:56,360 --> 00:07:00,240 Speaker 1: so it might even be more dramatic than that. In now, 116 00:07:00,320 --> 00:07:03,320 Speaker 1: if you're like me, you might have had a pretty 117 00:07:03,520 --> 00:07:07,000 Speaker 1: rough reaction to that news. I mean, that's an enormous 118 00:07:07,120 --> 00:07:11,680 Speaker 1: amount of influence with a very small number of organizations. 119 00:07:11,720 --> 00:07:14,840 Speaker 1: But luckily for me, the researchers provide a voice of 120 00:07:14,880 --> 00:07:18,360 Speaker 1: cautious optimism about all this. One of the authors of 121 00:07:18,360 --> 00:07:21,400 Speaker 1: the study has said, quote, one of our biggest challenges 122 00:07:21,560 --> 00:07:26,040 Speaker 1: is to sustain healthy ocean ecosystems as economic use increases 123 00:07:26,320 --> 00:07:30,840 Speaker 1: and climate impacts accelerate. This study confirms that a relatively 124 00:07:30,880 --> 00:07:33,880 Speaker 1: small number of companies will be central to this challenge 125 00:07:34,160 --> 00:07:37,720 Speaker 1: and have a real opportunity for leadership end quote. So 126 00:07:37,960 --> 00:07:41,040 Speaker 1: with that perspective, you could argue that it's actually easier 127 00:07:41,120 --> 00:07:44,600 Speaker 1: to convince a relatively small number of companies to make 128 00:07:44,640 --> 00:07:48,360 Speaker 1: some big changes that could have enormous benefits for millions. 
It would be a lot easier to do that than to try and convince a ton of smaller companies that each have a smaller amount of influence and thus have a small stake in a very big pool, as it were. Companies in the study included oil and gas companies that conduct, you know, offshore operations, seafood production and processing, shipping, cruise tourism, offshore wind companies, and more. The lion's share of that one point one trillion dollars of revenue actually falls, surprise surprise, into the oil and gas company slice of the pie. These companies had a combined revenue of eight hundred thirty billion dollars, or about seventy-five percent of the whole thing. Yowza. Now, will these companies make the major changes to address sustainability challenges? Well, if shareholders demand it and pressure the companies to do so, or if the leadership of those companies takes issues like ocean sustainability and climate change really seriously, perhaps. I withhold judgment only because I've seen a lot of companies act in short-term self-interest over long-term viability. But I hope the attitude of the researchers reflects reality.
All right, how about instead of wealthy corporations, I talk about wealthy criminals? And yes, there is some overlap in that Venn diagram, but I am focusing on people who are engaged in illegal activities and then turning to stuff like cryptocurrency in an effort to launder money so that, you know, they can actually use the money they stole. That's the whole purpose of money laundering. You take illegally obtained money, whether it was stolen or it came from illegal transactions, you know, like drug sales, that kind of stuff or whatever, and you take that money and mix it with cash from an otherwise legitimate enterprise in order to hide the ill-gotten gains. Now, this gets pretty tricky if you're talking about truly large amounts of money, because regulators notice that kind of stuff. If a humble business, let's say it's literally a laundromat, were to bring in, you know, a hundred times more revenue than it normally would, that might raise some eyebrows.
But anyway, an analysis company called Chainalysis published a report recently saying that fifty-five percent of all the money laundering happening with cryptocurrency can be traced to just two hundred seventy blockchain addresses. Beyond this core group of two hundred seventy blockchain addresses, you've got another fifteen hundred addresses or so that are responsible for seventy-five percent of all money laundering in the cryptocurrency world. Now, this kind of concentration of laundering could pose a big problem for the baddies out there, because with a bottleneck in cryptocurrency processing, law enforcement agencies around the world can narrow their focus on some of the most active addresses to identify and go after groups of criminals. And if cryptocurrency exchanges detect that they could be targeted for investigations, they might actually take more proactive steps to enforce anti-money-laundering policies against users and shut down criminal activity ahead of official legal action. One of the things you learn when you start looking into big-money criminal enterprises is that crime does pay, but it can sometimes be really hard to cash out without getting caught.
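To make that idea of concentration concrete, here's a small hypothetical sketch. It is entirely my own illustration, not Chainalysis's actual methodology or data: given how much illicit money each deposit address received, it counts how few addresses it takes to cover a given share of the total.

```python
# Hypothetical sketch (not the Chainalysis methodology): measure how
# concentrated laundering flows are across deposit addresses.

def addresses_for_share(funds_by_address: dict, target_share: float) -> int:
    """Sort addresses by funds received, largest first, and count how
    many it takes to reach target_share of the total volume."""
    amounts = sorted(funds_by_address.values(), reverse=True)
    total = sum(amounts)
    running, count = 0.0, 0
    for amt in amounts:
        running += amt
        count += 1
        if running >= target_share * total:
            break
    return count

# Toy data: a couple of big launderers and a long tail of small ones.
flows = {"addr_a": 500.0, "addr_b": 300.0, "addr_c": 100.0,
         "addr_d": 50.0, "addr_e": 30.0, "addr_f": 20.0}
print(addresses_for_share(flows, 0.55))  # prints 2: two addresses cover 55% here
```

The report's point is the real-world version of this toy result: when a handful of addresses sit at the top of that sorted list, watching just those addresses covers most of the money.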
Over at the New York Times, journalist Cade Metz has written an incredible piece about a movement that has really taken hold in the tech space, particularly among leaders in Silicon Valley. The piece is titled Silicon Valley's Safe Space, and it's about a blog that was called Slate Star Codex. Metz explains that a psychiatrist named Scott Siskind, who used the pseudonym Scott Alexander, created the blog and used it to write about a lot of big topics. As Metz says, quote, the blog explored everything from science and medicine to philosophy and politics to the rise of artificial intelligence, end quote. Now, I recommend reading the whole piece over on the New York Times. It's really well done, and I just want to give a kind of general overview without diving too much into the piece. Much of it focused on a school of thought that many of the leaders in Silicon Valley have been, you know, following or subscribed to. It's called rationalism, which sounds, you know, reasonable, right? I mean, rational is in the name. Ideally, the philosophy examines issues with a sort of scientific approach.
The goal is to gain an understanding of issues and solutions using a rational line of thinking. But there's a tone that starts to pop up in rationalism that I personally find rather troubling, and it's a tendency to overlook consequences of certain actions or policies. I detect a tendency for privileged people to ignore the effects of their rationalism on those who are not part of that privileged class. Free speech is frequently cited as a critical element of rationalism, but there also seems to be a failure to address a problem I see, which is that those who have the strongest platform from which to speak freely are those who already enjoy a great deal of privilege. In other words, if you aren't one of the lucky few who are privileged, then freedom of speech doesn't mean as much, because you aren't given the same platform that allows you to, you know, get your message to be heard. You aren't elevated, and your volume isn't amplified the way it is for the folks who do belong to that class. Now, you can probably tell I'm not totally on board with rationalism.
I respect a lot of what it's about in theory, but I also have fundamental problems with it in practice. And it's kind of why, when I talk about critical thinking, I tend to pair it with compassion when I advocate it to you guys, my listeners. I find that compassion without critical thinking leads to magical reasoning and at least some bad decisions. If you don't allow critical thinking and you just follow compassion, you're gonna make some bad choices. But I find that critical thinking without compassion leads to decisions that can have really negative consequences for a lot of people. And Metz does a pretty good job of handling this matter with a very objective point of view, particularly considering that Metz himself became the target of harassment as a result of this journalistic endeavor he pursued.
So if you find my own perspective frustrating because of my own point of view, you should really read the original piece, because there's a lot in there, including the way that various leaders have treated the concept of artificial intelligence. And AI really does pose some significant challenges from an ethical and existential point of view, so I highly recommend you check it out. Okay, we have some more stories to get to, but first let's take a quick break.

We're back. The Anti-Defamation League released a study last week. They found that videos on YouTube that promote extremist views, like, you know, white supremacy, are not only still found on YouTube, but the site's algorithm is also still recommending those videos if you happen to have watched other extremist videos. So if you watch one, you're likely to be recommended others. Now, this falls into a general problem of algorithms that potentially could be exacerbating radicalization.
The purpose of the algorithm ultimately is just to keep people on the platform for as long as possible, because more time on the platform means more advertising dollars going to that company, so you want to maximize the amount of time people are spending there. So if you click on a video on YouTube, the algorithm will take into account which video you chose and then pull up videos with related material in them and serve them up to you. Now, many years ago, the comedian Patton Oswalt had a bit about the DVR device TiVo and how he was having problems with it. According to the bit, he decided to watch some Westerns, and TiVo then interpreted that to mean that what Patton really liked were horses, and so suddenly TiVo was automatically recording programs that had anything to do with horses and presenting those to Patton as, hey, you liked that other horse thing, watch this horse thing. Well, that's kind of what YouTube's algorithm does. If you watch videos about people rescuing animals, you're going to get a lot more of those videos recommended to you.
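That "more of what you watched" logic can be sketched in miniature. To be clear, this is a drastically simplified toy of my own, not YouTube's actual system, which is not public; the video names and tags are invented. Each video gets a set of topic tags, and the sketch recommends whichever other videos share the most tags with what you just watched.

```python
# Toy sketch of similarity-based recommendation: rank other videos by
# how many topic tags they share with the one you just watched.
# (Invented catalog; real recommenders use far richer signals.)

CATALOG = {
    "western_classic": {"western", "horses", "film"},
    "horse_rescue":    {"horses", "animals", "rescue"},
    "cat_rescue":      {"cats", "animals", "rescue"},
    "quantum_lecture": {"physics", "lecture"},
}

def recommend(watched: str, catalog: dict, k: int = 2) -> list:
    """Return the k videos with the largest tag overlap with `watched`."""
    watched_tags = catalog[watched]
    others = [v for v in catalog if v != watched]
    others.sort(key=lambda v: len(catalog[v] & watched_tags), reverse=True)
    return others[:k]

# Watch one Western and the horse content floats to the top,
# just like Patton's TiVo.
print(recommend("western_classic", CATALOG))  # prints ['horse_rescue', 'cat_rescue']
```

The toy also shows the feedback loop: each recommended-and-watched video narrows the tags the system optimizes for, which is harmless with horses and much less so with extremist content.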
I know, because that's what dominates my YouTube recommendations. But when it comes to indoctrination into extremist views, this algorithmic approach can be a huge problem. It's like YouTube is doing the recruitment on behalf of these extremist groups, populating recommendations with propaganda that persuades more people toward harmful extremist ideas. And also, terms like extremism and radicalism are a little too vague, I think. You can have an extreme view and it's not necessarily harmful, but we're generally talking about philosophies like white supremacy, which is undeniably harmful. The study found that YouTube's algorithm was likely to serve up more videos designed to appeal to these sorts of worldviews if you had watched a video already. Now, that being said, you are not likely to stumble across one of these videos in your recommendations without doing some of that work on your own. Google has really been working to limit hate speech on the YouTube platform. That is helping out a bit. In fact, according to Google, it's reduced the consumption of those types of videos by as much as eighty percent.
288 00:17:53,200 --> 00:17:56,400 Speaker 1: But if you actually do seek out the videos, then 289 00:17:56,760 --> 00:17:59,960 Speaker 1: you can find them, and you might find that YouTube's 290 00:18:00,000 --> 00:18:02,639 Speaker 1: starts recommending more of them to you, and that remains 291 00:18:02,680 --> 00:18:05,400 Speaker 1: a problem. I hope to do a more thorough episode 292 00:18:05,400 --> 00:18:07,520 Speaker 1: on this general topic in the future, and I have 293 00:18:07,560 --> 00:18:10,160 Speaker 1: a special guest in mind who might help me talk 294 00:18:10,200 --> 00:18:14,520 Speaker 1: about it, So stay tuned for that. Parlor or parlay 295 00:18:14,800 --> 00:18:19,160 Speaker 1: if you prefer, has returned, the social network ostensibly dedicated 296 00:18:19,200 --> 00:18:23,000 Speaker 1: to free speech, but effectively the place of refuge for 297 00:18:23,320 --> 00:18:27,560 Speaker 1: the far right of the political spectrum Online was homeless 298 00:18:27,680 --> 00:18:31,360 Speaker 1: for a while. That was after Amazon Web Services booted 299 00:18:31,480 --> 00:18:34,760 Speaker 1: the company from their servers. But now the site has 300 00:18:34,800 --> 00:18:37,919 Speaker 1: a new host, a company called Epic E p i 301 00:18:38,080 --> 00:18:41,960 Speaker 1: K that's also known as a harbor for other far 302 00:18:42,119 --> 00:18:45,040 Speaker 1: right websites and services that have had a trouble, you know, 303 00:18:45,119 --> 00:18:49,360 Speaker 1: finding a home elsewhere. While those who have existing Parlor 304 00:18:49,440 --> 00:18:53,359 Speaker 1: accounts can access the site, the messages that once populated 305 00:18:53,359 --> 00:18:58,240 Speaker 1: the forums before Amazon evicted them, those are not yet back, 306 00:18:58,640 --> 00:19:01,480 Speaker 1: so those who are already members can go there. 
They can start, you know, posting in forums again, but they can't access older threads. Parler in general has had a lot of changes. Researchers were able to scrape tons of data off of Parler before it was banished, and there is no shortage of critics who will say that the company has proven to be extremely careless with user information, so a lot of people say that Parler is kind of a doxxing tool all by itself. The former CEO, John Matze, has said that Rebekah Mercer, who provided much of the funds to launch Parler, fired him earlier this month, and Mark Meckler, best known for co-founding Tea Party Patriots, is now serving as interim CEO while the company seeks out a permanent one. It's too early to say if users are going to return to the site in droves or if they will have migrated to other services.

And I guess now it's time for me to talk about Clubhouse. I'm sure most of you have heard about it, and a lot of you may have even used it. Me, not so much. Not cool like that. But Clubhouse is an app that launched last March.
It's 327 00:20:09,760 --> 00:20:12,480 Speaker 1: an app that was created by Paul Davison and Rohan 328 00:20:12,560 --> 00:20:15,840 Speaker 1: Seth after they tried to make a podcasting app, and 329 00:20:15,880 --> 00:20:20,119 Speaker 1: it's really blowing up now after lots of early influencers 330 00:20:20,160 --> 00:20:23,440 Speaker 1: and celebrities embraced the platform. Now, I'm still the grumpy 331 00:20:23,600 --> 00:20:26,800 Speaker 1: old man who hasn't joined in. But the app creates 332 00:20:26,920 --> 00:20:29,600 Speaker 1: audio chat rooms, which just makes me think of the 333 00:20:29,600 --> 00:20:32,320 Speaker 1: old party lines you used to see advertised about all 334 00:20:32,359 --> 00:20:35,040 Speaker 1: sorts of stuff back in the nineties. And you use 335 00:20:35,119 --> 00:20:37,760 Speaker 1: the app to log into, you know, various chat rooms. 336 00:20:37,760 --> 00:20:39,640 Speaker 1: You can bounce around if you want, and you're able 337 00:20:39,680 --> 00:20:43,320 Speaker 1: to converse over voice with other users about all sorts 338 00:20:43,359 --> 00:20:46,440 Speaker 1: of things. And some of those users include famous people, 339 00:20:46,720 --> 00:20:50,520 Speaker 1: people like Lindsay Lohan and Elon Musk, and the 340 00:20:50,560 --> 00:20:53,960 Speaker 1: topics of conversation can be really broad, with some rooms 341 00:20:53,960 --> 00:20:58,120 Speaker 1: dedicated to celebrity gossip and some dedicated to quantum physics. 342 00:20:58,119 --> 00:21:02,360 Speaker 1: It's pretty wild stuff. It's an invitation-only platform, and 343 00:21:02,520 --> 00:21:05,200 Speaker 1: I've never received an invite, which is not a complaint. 344 00:21:05,480 --> 00:21:07,760 Speaker 1: It's probably for the best. I mean, I record so 345 00:21:07,800 --> 00:21:11,200 Speaker 1: many podcasts a week.
I suspect most people hear as 346 00:21:11,320 --> 00:21:14,639 Speaker 1: much from me as they would like, and in many cases, 347 00:21:14,680 --> 00:21:17,280 Speaker 1: I'm sure it's more than that. Also, there's no Android 348 00:21:17,400 --> 00:21:19,959 Speaker 1: version of the app yet, but it is on the way, 349 00:21:20,000 --> 00:21:23,399 Speaker 1: according to the company. However, since it is invite-only, 350 00:21:23,600 --> 00:21:25,760 Speaker 1: that means that Clubhouse has a sort of air of 351 00:21:25,800 --> 00:21:29,520 Speaker 1: exclusivity around it, which automatically makes it more attractive for 352 00:21:29,600 --> 00:21:33,399 Speaker 1: some people. Plus, in the middle of a pandemic where 353 00:21:33,760 --> 00:21:36,040 Speaker 1: most of us are stuck at home, it represents a 354 00:21:36,040 --> 00:21:39,080 Speaker 1: way to chat with other people safely about all 355 00:21:39,119 --> 00:21:42,760 Speaker 1: sorts of interesting or mundane topics. Everyone wants to be 356 00:21:42,800 --> 00:21:44,800 Speaker 1: in the club they can't get into, I guess. I 357 00:21:44,840 --> 00:21:47,600 Speaker 1: don't know. It all sounds kind of stressful to me. Honestly, 358 00:21:47,840 --> 00:21:50,760 Speaker 1: I think I'd mostly end up just being quiet there, 359 00:21:51,000 --> 00:21:54,240 Speaker 1: believe it or not, so an invitation is wasted on me. 360 00:21:54,400 --> 00:21:57,080 Speaker 1: If you have an invitation, give it to someone you 361 00:21:57,119 --> 00:21:58,640 Speaker 1: think would get the most out of it. I guess 362 00:21:58,600 --> 00:22:00,480 Speaker 1: that is what I'm saying.
The fact you can just 363 00:22:00,520 --> 00:22:02,680 Speaker 1: go in there and talk with folks has already led 364 00:22:02,760 --> 00:22:06,119 Speaker 1: China to say, hey, we don't want that here, because 365 00:22:06,119 --> 00:22:09,080 Speaker 1: the Chinese government cracks down hard on any form of 366 00:22:09,080 --> 00:22:13,000 Speaker 1: communication that is outside of its own authority. And Clubhouse 367 00:22:13,080 --> 00:22:19,119 Speaker 1: is also already at the center of other conversations about exclusion, harassment, misinformation, 368 00:22:19,359 --> 00:22:21,440 Speaker 1: you know, the sort of stuff that we see on 369 00:22:21,480 --> 00:22:24,639 Speaker 1: all social networks, really. I'm not saying it 370 00:22:24,640 --> 00:22:26,919 Speaker 1: should come as a surprise that it's over at Clubhouse too. 371 00:22:27,040 --> 00:22:30,920 Speaker 1: It's an issue that has been problematic for 372 00:22:31,119 --> 00:22:34,240 Speaker 1: every platform. Now, there have been some pretty nasty stories 373 00:22:34,240 --> 00:22:37,480 Speaker 1: about bullying over at Clubhouse, but I think every social 374 00:22:37,480 --> 00:22:41,320 Speaker 1: platform faces these issues. It's really how they respond to 375 00:22:41,359 --> 00:22:45,000 Speaker 1: those problems that's worth talking about. And Clubhouse is still 376 00:22:45,040 --> 00:22:47,840 Speaker 1: in the process of formulating its strategy for that, like 377 00:22:47,880 --> 00:22:51,600 Speaker 1: getting moderators and that kind of thing. I imagine all 378 00:22:51,680 --> 00:22:54,000 Speaker 1: that is going to have to change soon. It's gonna 379 00:22:54,040 --> 00:22:56,800 Speaker 1: have to be less loosey-goosey. But I'm sure I'll 380 00:22:56,840 --> 00:22:59,600 Speaker 1: also do a full episode about Clubhouse in the future, 381 00:22:59,640 --> 00:23:03,680 Speaker 1: so stay tuned for that as well.
Several financial institutions 382 00:23:03,680 --> 00:23:07,200 Speaker 1: are now joining initiatives in which they will share intellectual 383 00:23:07,240 --> 00:23:09,800 Speaker 1: property in the form of patents, all in an effort 384 00:23:09,880 --> 00:23:13,720 Speaker 1: to avoid patent trolls. This stems from a recent lawsuit 385 00:23:13,800 --> 00:23:16,520 Speaker 1: in which a company called USAA filed 386 00:23:16,560 --> 00:23:20,639 Speaker 1: a patent infringement lawsuit against Wells Fargo and won the case. 387 00:23:21,080 --> 00:23:24,119 Speaker 1: They received an award of more than three hundred million dollars 388 00:23:24,119 --> 00:23:26,560 Speaker 1: for a pair of cases, actually. At the heart of 389 00:23:26,560 --> 00:23:29,320 Speaker 1: the matter was an innovation in which bank customers can 390 00:23:29,359 --> 00:23:32,199 Speaker 1: deposit checks by taking a photo of the check with 391 00:23:32,240 --> 00:23:35,000 Speaker 1: their phone and then using their banking app to transfer 392 00:23:35,080 --> 00:23:37,440 Speaker 1: money to their accounts, something that a lot of banks 393 00:23:37,480 --> 00:23:40,280 Speaker 1: do these days. Now, the whole story behind that is 394 00:23:40,320 --> 00:23:42,840 Speaker 1: actually pretty complicated. There was a company called, in fact, 395 00:23:42,840 --> 00:23:45,480 Speaker 1: there is a company called Mitek, which Wells Fargo 396 00:23:45,640 --> 00:23:50,120 Speaker 1: uses for its mobile deposits, with both Mitek and USAA holding 397 00:23:50,240 --> 00:23:53,800 Speaker 1: patents related to mobile deposits. So you had Mitek with 398 00:23:53,840 --> 00:23:56,320 Speaker 1: some patents, USAA with other patents.
They 399 00:23:56,320 --> 00:23:59,040 Speaker 1: got into a big legal battle, they settled it out 400 00:23:59,040 --> 00:24:01,359 Speaker 1: of court, they retained the rights to their patents, and 401 00:24:01,400 --> 00:24:05,439 Speaker 1: now USAA is going after other financial institutions. So 402 00:24:05,480 --> 00:24:09,359 Speaker 1: the repercussions have encouraged a lot of financial institutions to 403 00:24:09,440 --> 00:24:11,960 Speaker 1: enter into a sort of open source agreement to share 404 00:24:12,000 --> 00:24:15,240 Speaker 1: IP in an effort to protect themselves against patent 405 00:24:15,240 --> 00:24:20,360 Speaker 1: infringement cases. The biggest concern are patent assertion entities. These 406 00:24:20,359 --> 00:24:23,679 Speaker 1: would be patent trolls. These are companies that make money by 407 00:24:23,720 --> 00:24:28,000 Speaker 1: acquiring patents, and then they assert those patents against companies 408 00:24:28,000 --> 00:24:32,160 Speaker 1: that are allegedly infringing them. Just a quick reminder: 409 00:24:32,400 --> 00:24:35,640 Speaker 1: a patent protects an invention of some sort. That invention 410 00:24:35,720 --> 00:24:39,320 Speaker 1: can actually be a process, not necessarily a thing. For 411 00:24:39,400 --> 00:24:42,760 Speaker 1: the duration of the patent, the holder is the sole 412 00:24:42,800 --> 00:24:46,880 Speaker 1: owner of that particular implementation of technology.
If you want 413 00:24:46,880 --> 00:24:50,320 Speaker 1: to use that tech, that approach, you either have to 414 00:24:50,320 --> 00:24:53,439 Speaker 1: figure out how to build something else that achieves the 415 00:24:53,520 --> 00:24:57,360 Speaker 1: same end result but doesn't use the same process as 416 00:24:57,400 --> 00:25:00,760 Speaker 1: the patented technology, or you have to strike up some 417 00:25:00,840 --> 00:25:03,880 Speaker 1: sort of deal with the patent holder and thus license 418 00:25:03,960 --> 00:25:08,439 Speaker 1: the technology. But patent trolls don't tend to offer licenses first. 419 00:25:09,200 --> 00:25:13,800 Speaker 1: They get litigious. They aim for big awards or court settlements, 420 00:25:14,119 --> 00:25:17,040 Speaker 1: and that tends to scare everybody else, and it gives 421 00:25:17,080 --> 00:25:19,479 Speaker 1: the patent holder a lot of leverage when they do, 422 00:25:20,240 --> 00:25:24,720 Speaker 1: you know, try to negotiate licensing agreements. Meanwhile, critics point 423 00:25:24,720 --> 00:25:28,639 Speaker 1: out that these patent holders aren't doing anything useful with 424 00:25:28,720 --> 00:25:31,600 Speaker 1: the patents at all. You know, they're not making anything, 425 00:25:31,680 --> 00:25:34,399 Speaker 1: they're not using the patent. They're just relying on it 426 00:25:34,440 --> 00:25:37,600 Speaker 1: as a cudgel. They're hoarding patents and waiting for the 427 00:25:37,640 --> 00:25:40,080 Speaker 1: opportunity to go after someone who appears to be using 428 00:25:40,119 --> 00:25:43,600 Speaker 1: the technology without authorization. And back in the day, the 429 00:25:43,640 --> 00:25:46,840 Speaker 1: company I worked for, HowStuffWorks, was targeted by 430 00:25:46,840 --> 00:25:49,119 Speaker 1: a patent troll that claimed to hold a patent that covered 431 00:25:49,160 --> 00:25:52,840 Speaker 1: all of podcasting.
Now, that particular case never really went 432 00:25:52,840 --> 00:25:55,600 Speaker 1: anywhere for lots of reasons, but that's a story for 433 00:25:55,640 --> 00:26:00,880 Speaker 1: another day. Hey, when was the last time you answered 434 00:26:00,920 --> 00:26:05,160 Speaker 1: a phone call from an unknown number? For me, that 435 00:26:05,200 --> 00:26:08,399 Speaker 1: would probably be something along the lines of five years ago. 436 00:26:08,840 --> 00:26:11,800 Speaker 1: And as it turns out, it's probably a good thing. 437 00:26:12,040 --> 00:26:14,320 Speaker 1: And I'm not alone in my decision to let my 438 00:26:14,400 --> 00:26:18,240 Speaker 1: phone just ring to voicemail. According to Hiya, a cloud 439 00:26:18,280 --> 00:26:21,000 Speaker 1: services company that caters to customers like AT&T 440 00:26:21,040 --> 00:26:24,760 Speaker 1: and Samsung, robocall scams are kind of at 441 00:26:24,760 --> 00:26:28,200 Speaker 1: a peak. The company commissioned a survey, and that survey 442 00:26:28,240 --> 00:26:31,520 Speaker 1: found that three out of four respondents say they were 443 00:26:31,520 --> 00:26:35,840 Speaker 1: targeted by at least one phone scammer over the last year. Now, 444 00:26:35,840 --> 00:26:38,680 Speaker 1: these are the types of phishing attempts that are looking 445 00:26:38,680 --> 00:26:42,880 Speaker 1: for personally identifiable information like Social Security numbers, bank accounts, 446 00:26:42,960 --> 00:26:46,040 Speaker 1: that kind of thing. Maybe you're one of those people. 447 00:26:46,359 --> 00:26:49,040 Speaker 1: You might get a call from someone claiming to represent 448 00:26:49,119 --> 00:26:51,320 Speaker 1: the IRS and saying that you need to talk 449 00:26:51,400 --> 00:26:55,040 Speaker 1: to them or face possible fines or worse. That's actually 450 00:26:55,040 --> 00:26:57,280 Speaker 1: a pretty common one.
And I know I've received a 451 00:26:57,280 --> 00:27:00,119 Speaker 1: couple of robocall messages in that vein. At least 452 00:27:00,119 --> 00:27:02,760 Speaker 1: in the old days, they couldn't really detect 453 00:27:02,960 --> 00:27:05,359 Speaker 1: when voicemail was picking up as opposed to a human, 454 00:27:05,760 --> 00:27:08,400 Speaker 1: so I would get these prerecorded messages and it would 455 00:27:08,440 --> 00:27:10,439 Speaker 1: always start, you know, like halfway through, because it was 456 00:27:10,840 --> 00:27:14,880 Speaker 1: already going through my outgoing message. It didn't pause and 457 00:27:14,920 --> 00:27:18,200 Speaker 1: wait for the beep. So, according to this survey, those 458 00:27:18,280 --> 00:27:21,119 Speaker 1: who fall for the scam lose on average around a 459 00:27:21,200 --> 00:27:23,480 Speaker 1: hundred eighty dollars, but some can lose a lot more 460 00:27:23,520 --> 00:27:27,480 Speaker 1: than that. Meanwhile, because we're in that whole pandemic thing, 461 00:27:27,800 --> 00:27:31,520 Speaker 1: people have been using voice communication a lot more recently, 462 00:27:31,560 --> 00:27:36,360 Speaker 1: and so personal and business calls have nearly tripled in volume. Meanwhile, 463 00:27:37,400 --> 00:27:40,160 Speaker 1: the survey respondents say that if it's an unknown number, 464 00:27:40,520 --> 00:27:42,800 Speaker 1: they don't answer it, which makes it harder for a 465 00:27:42,840 --> 00:27:46,320 Speaker 1: legitimate communication to go through. So if a company does 466 00:27:46,400 --> 00:27:48,879 Speaker 1: need to make contact with a customer, it can be 467 00:27:48,960 --> 00:27:51,320 Speaker 1: impossible to get that person to pick up. Of course, 468 00:27:51,640 --> 00:27:54,560 Speaker 1: if they just leave a legitimate message, that can help 469 00:27:54,600 --> 00:27:56,880 Speaker 1: a lot.
Now, I don't know about you, but while 470 00:27:56,920 --> 00:28:00,240 Speaker 1: I might get three or four, or sometimes six or 471 00:28:00,280 --> 00:28:03,520 Speaker 1: seven calls in a day, it's very rare that any 472 00:28:03,560 --> 00:28:06,399 Speaker 1: of those will actually leave a message. Now, you might 473 00:28:06,440 --> 00:28:08,879 Speaker 1: wonder if anyone is actually working on this problem on 474 00:28:09,000 --> 00:28:12,199 Speaker 1: behalf of consumers, and the short answer is yes. But 475 00:28:12,280 --> 00:28:15,600 Speaker 1: the longer answer is, it's a complicated problem. It's not 476 00:28:15,760 --> 00:28:19,840 Speaker 1: super easy to solve. In the United States, phone companies 477 00:28:19,880 --> 00:28:22,159 Speaker 1: are supposed to create a new approach to caller 478 00:28:22,280 --> 00:28:25,520 Speaker 1: ID by June this year and cut back on the 479 00:28:25,560 --> 00:28:29,520 Speaker 1: practice of spoofing. That's when a caller can use a 480 00:28:29,600 --> 00:28:32,800 Speaker 1: false caller ID to try and make a connection. 481 00:28:33,119 --> 00:28:35,520 Speaker 1: Like, they're not using their own phone number, they have 482 00:28:35,600 --> 00:28:39,080 Speaker 1: spoofed a different phone number that stands in its place. 483 00:28:39,320 --> 00:28:42,240 Speaker 1: So if you've ever received a bunch of calls from 484 00:28:42,280 --> 00:28:44,680 Speaker 1: a phone number that's similar to your own, you know, 485 00:28:44,800 --> 00:28:48,400 Speaker 1: same area code, maybe the same first three digits of 486 00:28:48,440 --> 00:28:51,880 Speaker 1: the actual number, then you've likely seen spoofing in action. 487 00:28:52,000 --> 00:28:56,320 Speaker 1: That's a common tactic among robocallers.
Now, I'll probably 488 00:28:56,360 --> 00:28:59,280 Speaker 1: do a full episode about spoofing and what steps companies 489 00:28:59,360 --> 00:29:02,520 Speaker 1: might take to cut back on it, but that will be 490 00:29:02,600 --> 00:29:06,200 Speaker 1: for the future, which is where you and I will 491 00:29:06,200 --> 00:29:09,080 Speaker 1: spend the rest of our lives. Before we get to 492 00:29:09,080 --> 00:29:12,040 Speaker 1: the future, though, we need to take another quick break. 493 00:29:19,680 --> 00:29:23,040 Speaker 1: Let's talk about some of the dangers of relying on 494 00:29:23,160 --> 00:29:27,400 Speaker 1: cloud-based services. Now, sometimes the cloud providing those services 495 00:29:27,600 --> 00:29:30,880 Speaker 1: can go down, and then you're at a loss. And 496 00:29:30,960 --> 00:29:33,040 Speaker 1: this is one of those scary things that CTOs 497 00:29:33,240 --> 00:29:35,440 Speaker 1: have to consider. Do you keep all your 498 00:29:35,480 --> 00:29:39,640 Speaker 1: systems on premises, or on-prem as the cool C- 499 00:29:39,840 --> 00:29:43,680 Speaker 1: suite folks say, or do you offload some or maybe 500 00:29:43,720 --> 00:29:47,520 Speaker 1: all of those services to cloud-based platforms, and then 501 00:29:47,560 --> 00:29:50,360 Speaker 1: you rely on other companies to provide a more robust 502 00:29:50,480 --> 00:29:53,920 Speaker 1: and redundant system that you can rely upon? Now, ideally, 503 00:29:54,440 --> 00:29:59,040 Speaker 1: cloud services should be just as reliable as, if not more reliable 504 00:29:59,160 --> 00:30:03,480 Speaker 1: than, on-prem systems, but sometimes stuff goes wrong. 505 00:30:04,080 --> 00:30:08,200 Speaker 1: Last Friday, stuff went wrong for Notion. That's a company 506 00:30:08,240 --> 00:30:12,800 Speaker 1: that provides cloud-based project management services.
The company offers 507 00:30:12,840 --> 00:30:15,000 Speaker 1: up a suite of tools for all sorts of things, 508 00:30:15,280 --> 00:30:20,480 Speaker 1: from managing product development, to product launches, to marketing campaigns 509 00:30:20,520 --> 00:30:25,160 Speaker 1: and beyond. Except on Friday, for several hours, the whole 510 00:30:25,200 --> 00:30:29,640 Speaker 1: thing went down for everybody, which means four million users 511 00:30:29,680 --> 00:30:33,120 Speaker 1: were not able to access it. So what was going on? Well, 512 00:30:33,160 --> 00:30:36,920 Speaker 1: according to the company, there was a quote, very unusual 513 00:30:37,080 --> 00:30:41,200 Speaker 1: DNS issue that occurred at the registry operator level, end 514 00:30:41,280 --> 00:30:45,000 Speaker 1: quote. That might be a little difficult to parse, so 515 00:30:45,080 --> 00:30:48,080 Speaker 1: let's break it down. DNS stands for Domain 516 00:30:48,240 --> 00:30:50,760 Speaker 1: Name System, which you can think of as sort of 517 00:30:50,800 --> 00:30:55,719 Speaker 1: the directory for the Internet. It's the distributed, decentralized record 518 00:30:56,000 --> 00:30:59,720 Speaker 1: that explains what all those different machines connected to the 519 00:30:59,760 --> 00:31:03,320 Speaker 1: Internet are and where they can be found. Notion's site, 520 00:31:03,600 --> 00:31:08,480 Speaker 1: which is notion.so, has Name.com 521 00:31:08,520 --> 00:31:11,440 Speaker 1: as the registrar, so that's the company that registered the 522 00:31:11,520 --> 00:31:16,479 Speaker 1: name to Notion. However, Name.com works with another 523 00:31:16,520 --> 00:31:21,560 Speaker 1: company called Hexonet.
Hexonet manages all 524 00:31:21,680 --> 00:31:25,600 Speaker 1: websites that use the .so domain, and Hexonet 525 00:31:25,680 --> 00:31:30,080 Speaker 1: had received word that some nefarious Notion users were 526 00:31:30,120 --> 00:31:33,840 Speaker 1: creating pages in Notion in an effort to phish 527 00:31:34,000 --> 00:31:37,840 Speaker 1: sensitive information off of unsuspecting targets, so they were essentially 528 00:31:37,960 --> 00:31:41,840 Speaker 1: using Notion as a platform, a delivery system, for 529 00:31:41,880 --> 00:31:46,040 Speaker 1: phishing attacks. Hexonet contacted Name.com about this, 530 00:31:46,480 --> 00:31:49,680 Speaker 1: but Name.com was unable to confirm the reports 531 00:31:49,720 --> 00:31:54,120 Speaker 1: with direct evidence. Hexonet then placed a temporary freeze 532 00:31:54,160 --> 00:31:57,120 Speaker 1: on Notion's domain in order to sort out the mess, 533 00:31:57,320 --> 00:32:00,480 Speaker 1: which meant that all of Notion went offline for everybody.
534 00:32:01,000 --> 00:32:04,600 Speaker 1: Hexonet lifted that freeze later on Friday. But 535 00:32:04,680 --> 00:32:08,120 Speaker 1: the problems with phishing scams on Notion aren't new, and 536 00:32:08,160 --> 00:32:11,240 Speaker 1: as of this recording, the company hasn't really laid out 537 00:32:11,320 --> 00:32:14,360 Speaker 1: plans, externally anyway, on how they're going to push back 538 00:32:14,400 --> 00:32:18,120 Speaker 1: against those phishing attacks, which could mean that this event 539 00:32:18,120 --> 00:32:20,480 Speaker 1: could repeat itself in the future, which is not a 540 00:32:20,520 --> 00:32:23,880 Speaker 1: strong way to sell a project management platform, if you're 541 00:32:23,920 --> 00:32:28,880 Speaker 1: saying it might be offline occasionally as these sorts of 542 00:32:28,920 --> 00:32:33,720 Speaker 1: things happen. And now, ooh la la, Google has a lot 543 00:32:33,760 --> 00:32:38,160 Speaker 1: to answer for in France. Okay, I'm sorry. I can't 544 00:32:38,200 --> 00:32:40,000 Speaker 1: promise that's going to be the last of the accent, 545 00:32:40,440 --> 00:32:43,320 Speaker 1: and I know it's terrible, but a French court has 546 00:32:43,440 --> 00:32:46,240 Speaker 1: ordered Google to pay a fine of one point three 547 00:32:46,360 --> 00:32:50,360 Speaker 1: million dollars, or one point one million euros, because the 548 00:32:50,360 --> 00:32:55,360 Speaker 1: company's search engine gave misleading rankings for French hotels. And 549 00:32:55,400 --> 00:32:57,680 Speaker 1: that's about the most French thing I think I've ever heard. 550 00:32:58,200 --> 00:33:01,840 Speaker 1: You mess with French hospitality, you get the taureau by 551 00:33:01,880 --> 00:33:06,000 Speaker 1: the corne. Okay, now I'm really done. I know 552 00:33:06,080 --> 00:33:09,040 Speaker 1: you've all just sort of cringed yourselves out of existence. 553 00:33:09,120 --> 00:33:12,840 Speaker 1: I apologize.
So that was a mixed metaphor I was giving. 554 00:33:12,840 --> 00:33:15,480 Speaker 1: It was also a terrible pronunciation of French vocabulary. But 555 00:33:15,520 --> 00:33:19,920 Speaker 1: apparently Google was using one source for ranking hotels in 556 00:33:19,960 --> 00:33:23,240 Speaker 1: addition to a couple of secondary sources, but they didn't 557 00:33:23,240 --> 00:33:26,160 Speaker 1: measure up to the high standards of French sensibilities, and 558 00:33:26,280 --> 00:33:29,760 Speaker 1: hotel owners complained to the government, which then investigated the 559 00:33:29,760 --> 00:33:32,400 Speaker 1: matter and found Google to be deficient in how it 560 00:33:32,440 --> 00:33:36,840 Speaker 1: was ranking results. Quel dommage. So now Google has to 561 00:33:36,840 --> 00:33:40,840 Speaker 1: pay this fine and presumably go about ranking these hotels 562 00:33:40,920 --> 00:33:44,480 Speaker 1: in a totally different way. It's such a weird thing 563 00:33:44,600 --> 00:33:47,080 Speaker 1: to see, and it's something that would only really 564 00:33:47,120 --> 00:33:50,560 Speaker 1: happen in the European Union. On Monday of this week, 565 00:33:50,640 --> 00:33:55,120 Speaker 1: the CEO of Jaguar, the British car company, announced a 566 00:33:55,200 --> 00:33:59,120 Speaker 1: new plan for Jaguar. "Jaguars," that's the way we Americans say 567 00:33:59,160 --> 00:34:03,240 Speaker 1: it. Starting in twenty twenty-five, the company's cars will ditch 568 00:34:03,400 --> 00:34:07,560 Speaker 1: the internal combustion engine. Yep, all the Jaguars of the 569 00:34:07,600 --> 00:34:12,000 Speaker 1: future will be electric, either with batteries or with 570 00:34:12,080 --> 00:34:15,640 Speaker 1: hydrogen fuel cell technology.
And a fuel cell is a 571 00:34:15,680 --> 00:34:18,919 Speaker 1: lot like a battery in that it generates electricity through 572 00:34:18,960 --> 00:34:22,840 Speaker 1: an electrochemical process. But unlike a battery, you have to 573 00:34:22,880 --> 00:34:25,839 Speaker 1: actually refuel a fuel cell. You have to fill it up. 574 00:34:25,840 --> 00:34:28,400 Speaker 1: Eventually it runs out of hydrogen and you've got to 575 00:34:28,400 --> 00:34:31,399 Speaker 1: top it off. On the bright side, the emissions from 576 00:34:31,400 --> 00:34:35,040 Speaker 1: a fuel cell vehicle are primarily heat and water, which 577 00:34:35,040 --> 00:34:37,960 Speaker 1: is a nice change of pace from carbon-spewing internal 578 00:34:37,960 --> 00:34:41,720 Speaker 1: combustion engines. Jaguar Land Rover is to have six 579 00:34:41,840 --> 00:34:46,239 Speaker 1: new battery electric vehicles by twenty twenty-six. Gone will be 580 00:34:46,320 --> 00:34:50,080 Speaker 1: engines that require either gasoline or diesel, and the company 581 00:34:50,160 --> 00:34:53,879 Speaker 1: itself plans to be carbon neutral by twenty thirty-nine. 582 00:34:54,200 --> 00:34:56,960 Speaker 1: It will also mean making some big changes at the 583 00:34:57,000 --> 00:35:01,360 Speaker 1: company's various manufacturing facilities. Now, Jaguar says it's not gonna 584 00:35:01,400 --> 00:35:05,160 Speaker 1: shut down their factory in Birmingham, England, but things are 585 00:35:05,239 --> 00:35:09,400 Speaker 1: going to change substantially there after that facility finishes building 586 00:35:09,440 --> 00:35:13,440 Speaker 1: out the models that they're making for this year, because 587 00:35:13,440 --> 00:35:15,560 Speaker 1: they are not going to do that next year, but 588 00:35:16,120 --> 00:35:20,040 Speaker 1: presumably that facility will be doing something else that supports 589 00:35:20,120 --> 00:35:24,719 Speaker 1: Jaguar's new vision.
Sticking with electric vehicles, Tesla is in 590 00:35:24,800 --> 00:35:26,920 Speaker 1: the news again. You might remember last week when we 591 00:35:26,960 --> 00:35:30,080 Speaker 1: talked about how China is ordering Tesla to shape up 592 00:35:30,120 --> 00:35:32,879 Speaker 1: after quality control issues were found with the American-built 593 00:35:32,920 --> 00:35:36,520 Speaker 1: Tesla vehicles that were imported into China, whereas the Chinese- 594 00:35:36,600 --> 00:35:40,200 Speaker 1: built ones seem to be okay. Well, now Germany is 595 00:35:40,280 --> 00:35:43,960 Speaker 1: ordering Tesla to recall more than twelve thousand Model X 596 00:35:44,080 --> 00:35:48,280 Speaker 1: cars because of a problem with loose trim. The country's 597 00:35:48,360 --> 00:35:52,600 Speaker 1: motor vehicle regulatory agency, called the KBA, says 598 00:35:52,680 --> 00:35:55,480 Speaker 1: that the molding on the trim can become loose. That 599 00:35:55,520 --> 00:35:58,920 Speaker 1: means cars might have trim break off while they're in motion. 600 00:35:59,000 --> 00:36:01,480 Speaker 1: That represents a hazard on the road, and the 601 00:36:01,600 --> 00:36:05,759 Speaker 1: recall is for cars that were manufactured between twenty fifteen and twenty sixteen. 602 00:36:06,480 --> 00:36:10,280 Speaker 1: Tesla had already recalled some nine thousand Model X vehicles 603 00:36:10,320 --> 00:36:13,759 Speaker 1: in the US for a roof trim issue. No word 604 00:36:13,800 --> 00:36:17,640 Speaker 1: if it's the exact same thing. And in January, the 605 00:36:17,800 --> 00:36:22,960 Speaker 1: NHTSA urged Tesla to recall more than a hundred fifty thousand vehicles 606 00:36:23,360 --> 00:36:26,640 Speaker 1: because of a problem with the touchscreen interface that 607 00:36:26,760 --> 00:36:29,479 Speaker 1: accesses the screen for the rear-facing camera.
It's also 608 00:36:29,920 --> 00:36:32,239 Speaker 1: the way you control important stuff, like if you want 609 00:36:32,280 --> 00:36:35,640 Speaker 1: to defog the windshield, so it could pose a 610 00:36:35,680 --> 00:36:40,000 Speaker 1: safety hazard if that fails as well. According to The Verge, 611 00:36:40,280 --> 00:36:44,080 Speaker 1: Microsoft is quietly testing its xCloud service through a 612 00:36:44,239 --> 00:36:48,000 Speaker 1: web browser. With xCloud, people who have an Xbox 613 00:36:48,120 --> 00:36:51,320 Speaker 1: Game Pass are able to access their games in that 614 00:36:51,520 --> 00:36:55,280 Speaker 1: pass through a browser. Now, playing a cloud-based version 615 00:36:55,719 --> 00:36:58,680 Speaker 1: of the game through the browser itself, that's sort of cool. 616 00:36:59,239 --> 00:37:02,400 Speaker 1: You would need an Xbox controller connected to whatever device 617 00:37:02,560 --> 00:37:04,960 Speaker 1: was running the web browser, and then you're off to 618 00:37:05,040 --> 00:37:09,359 Speaker 1: the races, particularly if you're playing Forza. That's a race 619 00:37:09,400 --> 00:37:12,560 Speaker 1: car joke. The Verge reports that, as it stands, it 620 00:37:12,640 --> 00:37:14,480 Speaker 1: looks like the service is going to be limited to 621 00:37:14,520 --> 00:37:17,840 Speaker 1: browsers that are built off of Chromium, so that includes 622 00:37:18,000 --> 00:37:22,400 Speaker 1: Microsoft Edge and, you know, Google Chrome. There are apps 623 00:37:22,920 --> 00:37:26,560 Speaker 1: on Android that already allow Android users to access the 624 00:37:27,480 --> 00:37:30,040 Speaker 1: xCloud service, but this would allow you to do 625 00:37:30,120 --> 00:37:33,480 Speaker 1: it straight through a browser, not with a specific OS.
626 00:37:34,760 --> 00:37:38,400 Speaker 1: The Independent reports that Facebook is working on a smart 627 00:37:38,440 --> 00:37:42,120 Speaker 1: watch with the goal of launching it in twenty twenty-two. Now, 628 00:37:42,200 --> 00:37:45,000 Speaker 1: as you might suspect, such a watch would lean heavily 629 00:37:45,160 --> 00:37:49,720 Speaker 1: on integrated services from Facebook properties like Instagram, WhatsApp, 630 00:37:50,480 --> 00:37:53,560 Speaker 1: you know, Facebook. There's also evidence that it will have 631 00:37:53,719 --> 00:37:57,920 Speaker 1: some fitness tracking capabilities and integration with some big fitness 632 00:37:58,000 --> 00:38:03,080 Speaker 1: companies like Peloton. Now, the design incorporates a cellular transceiver, 633 00:38:03,440 --> 00:38:06,200 Speaker 1: so, assuming that's in the smart watch, you would be 634 00:38:06,200 --> 00:38:08,880 Speaker 1: able to use the watch to send and receive messages 635 00:38:08,920 --> 00:38:12,960 Speaker 1: without necessarily having to pair it with a phone or 636 00:38:13,760 --> 00:38:18,680 Speaker 1: anything like that. Now, considering Facebook's reputation with user privacy 637 00:38:18,800 --> 00:38:22,360 Speaker 1: and leveraging user information to serve ads to those people, 638 00:38:22,840 --> 00:38:25,920 Speaker 1: I would personally be a little bit reluctant to strap 639 00:38:26,040 --> 00:38:29,319 Speaker 1: on a smart watch that's monitoring stuff like my heart 640 00:38:29,440 --> 00:38:32,080 Speaker 1: rate and my sleeping patterns, because next thing you know, 641 00:38:32,160 --> 00:38:35,600 Speaker 1: I'd be getting ads for exercise equipment and lavender-scented 642 00:38:35,680 --> 00:38:38,239 Speaker 1: masks and stuff. I mean, at least I would if 643 00:38:38,239 --> 00:38:40,279 Speaker 1: I were still using Facebook. But you get what I mean. 644 00:38:41,239 --> 00:38:44,440 Speaker 1: Can Facebook create a must-have piece of technology?
Well, 645 00:38:44,480 --> 00:38:46,960 Speaker 1: the company has tried in the past, and things have not 646 00:38:47,400 --> 00:38:50,120 Speaker 1: really worked out so well. The Facebook phone is one 647 00:38:50,200 --> 00:38:53,200 Speaker 1: of the legendary flops in the tech world, and smart 648 00:38:53,200 --> 00:38:56,560 Speaker 1: watches have frequently fallen short of expectations. So we'll have 649 00:38:56,719 --> 00:38:59,840 Speaker 1: to wait and see. And that wraps up the news 650 00:39:00,160 --> 00:39:05,799 Speaker 1: for Tuesday, February twenty one. I hope you guys are doing well. 651 00:39:06,360 --> 00:39:10,680 Speaker 1: You'll get another typical episode of TechStuff tomorrow. I 652 00:39:10,680 --> 00:39:14,640 Speaker 1: hope you guys are looking forward to that. It should be interesting. Uh, 653 00:39:14,760 --> 00:39:17,239 Speaker 1: it will definitely keep you awake, because it's gonna be 654 00:39:17,400 --> 00:39:21,680 Speaker 1: about coffee makers. That's a little teaser for 655 00:39:21,800 --> 00:39:24,960 Speaker 1: what to expect. Well, if you have any suggestions for 656 00:39:25,040 --> 00:39:27,200 Speaker 1: future topics for TechStuff, you can let me know 657 00:39:27,440 --> 00:39:30,640 Speaker 1: by contacting me on Twitter. The handle is TechStuff 658 00:39:30,960 --> 00:39:35,040 Speaker 1: HSW, and I'll talk to you again really soon. 659 00:39:40,000 --> 00:39:42,960 Speaker 1: TechStuff is an iHeartRadio production. For more 660 00:39:43,080 --> 00:39:46,480 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 661 00:39:46,600 --> 00:39:49,760 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.