Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the tech news for Tuesday, July sixth, twenty twenty-one. Let's get to it. An infosec researcher who goes by the handle Dr.Web Anti-Virus (I love that handle) posted a list of ten apps that were found in the Google Play Store that are host to a Trojan horse type of malware. This is a kind of malware that acts as a carrier for some sort of malicious payload, so you can think of it as kind of the gel cap that contains something really nasty inside it and just slips into a system. What specifically goes into a Trojan can vary, but in this case, the malware's purpose is to scrape Facebook login credentials off of phones. Lots of apps have options that allow users to link that app to a Facebook account, often with dubious benefits to the user for doing so. It's a pretty common practice, and some users don't even think twice about it.
Speaker 1: They just click right through. In this case, doing that opens up the chance for someone to gain your login and password to Facebook, which is also why it's always a good idea to enable two-factor authentication on any services that offer it. It's a bit of a hassle, but it can save you some real heartache in the long run. The ten apps include PIP Photo, Processing Photo, Rubbish Cleaner, Horoscope Daily, Inwell Fitness, App Lock Keep, Lockit Master, Horoscope Pi, and App Lock Manager. Google has removed all ten apps from the app store, but people have downloaded these apps at least six million times collectively. Across all the apps, the apps gave no indication that they were malicious. All of them actually did what it was they claimed to do. They just went above and beyond and harvested Facebook data on top of that. So if you use Android devices, it's a good idea to check to make sure you didn't download and install any of those apps. It's also a good idea to change your Facebook password and to activate two-factor authentication. It's time for another cryptocurrency story.
Speaker 1: This time, the King of Bitcoin is in legal trouble. I should add this is a self-proclaimed King of Bitcoin; Bitcoin is not now, nor has it ever been, an actual monarchy. This guy's name is Claudio Oliveira, and allegedly he embezzled a lot of money from investors who were looking to get into cryptocurrency and make a lot of money. I'm talking somewhere in the neighborhood of three hundred million dollars. Yowza. Oliveira is the president of a company called the Bitcoin Banco Group, and a few years ago, in twenty nineteen, the group was in financial trouble. The company reported that after balancing its books, it came up with a shortfall of seven thousand bitcoins or so. In other words, seven thousand bitcoins just seemed to up and vanish. Now that could actually happen, of course. I mean, if the company were to store the bitcoins in a digital wallet that in turn was on, you know, some specific device, and then the company lost access to that device, like maybe they sold it or threw it out or something, well, that money would also be lost. It would be on that physical hardware.
Speaker 1: At least the record of it that allows you to use that money would be on that physical hardware. This is equivalent to stuffing a physical, like, combination-lock safe full of cash and then throwing that safe into the ocean. At any rate, the company then applied for permission from the Brazilian government to go through a process that would allow the company to reorganize its finances in order to, you know, pay off creditors and not go bankrupt, and the government agreed to this. But then the Brazilian government noted that a year into this process, the company didn't seem to actually be following the court-prescribed process of reorganization. It wasn't doing what it was supposed to do by court mandate. And that's when the Brazilian government formed an investigation into Oliveira, which ultimately concluded that he had been skimming money from investors for himself, and furthermore that he appeared to have had a history of doing this in other places, including the United States. Oliveira is now in police custody in Brazil.
Speaker 1: Last Friday, we learned that a networking company called Kaseya, which makes products designed to help IT professionals maintain and administer network systems and IT infrastructure, got hit with a ransomware attack. Lots of companies rely on Kaseya products, and according to the company, anywhere from eight hundred to fifteen hundred customers may be affected by this. And it's actually kind of a cascading effect. So Kaseya gets targeted, right? They are the company that provides the services overall. But in that process of targeting the company, the hackers were able to breach data on about fifty of Kaseya's direct customers. But those fifty customers in turn provide services to smaller companies, so we start to see the effects ripple outward. That's where we get to that range of eight hundred to fifteen hundred affected companies. The affected companies are essentially locked out of their cloud-based systems. A hacker group called REvil, that's R-E-V-I-L, has claimed responsibility and has demanded a ransom of seventy million dollars in Bitcoin in order to release the hold they have on that system, to decrypt it.
Speaker 1: Essentially, that's the most that any hacker group has ever demanded as a ransom so far. The previous record holder was, um, let's see here... oh, you know what, still REvil. They demanded fifty million dollars from Acer earlier this year. A spokesperson for Kaseya has declined to comment on paying ransoms to terrorists. And just as a reminder, you know, paying ransomware is not a great idea. It does not guarantee that you're actually going to regain control of your systems, and paying off the ransom sends the message that this is an effective and profitable type of crime, and that guarantees that you'll have more attacks in the future. Some companies likely make the choice to pay off the ransom because they happen to have some form of cyber insurance against that kind of thing, and I guess that makes sense from the company's perspective. I mean, if you can pay off a ransom and get control of your systems back, and you get reimbursed for the money you spent because you're insured, you kind of made it all someone else's problem, right?
Speaker 1: Insurance companies are starting to see that problem manifest now, which I imagine means we're going to see some big changes in that kind of insurance moving forward. Anyway, infosec researchers believe that REvil is based out of Russia or possibly Eastern Europe, and we're seeing this a lot, with a lot of criminal groups in Russia in particular picking up the pace. There's a general belief that the Russian government allows these criminal groups to operate. Maybe they even subcontract with these criminal organizations in order to target specific political enemies. That's an allegation that the Russian government itself denies. But generally speaking, if you're developing software and you're in Russia, you're pretty much confined to the Russian market, and hacking becomes a more viable means of making money. So, um, yeah, I can't say for certain that the Russian government condones or actively supports these hacking activities, but the government does seem to be reticent to pursue legal action against these hacker organizations. Let's put it that way. At best, it appears they're looking the other way, and at worst they're in cahoots.
Speaker 1: In addition to that, I also want to mention that the company Kaseya has said it is working to get the services restored today, July sixth. By the time you listen to this, that may have already happened. Moving on: in India, Twitter is facing a very tough situation. The Indian government has ruled that Twitter has not been compliant with content regulation within the country, and as a result, it will not enjoy liability protection against stuff that people post on Twitter within India. The law states that Twitter is to remove posts quickly in response to legal requests from, you know, law enforcement, and that the company is also supposed to share details about the owners of accounts that post offending messages. So, in other words, if someone posts something that is in violation of these rules in India, then Twitter is supposed to give up information about that person to Indian law enforcement. Twitter was also to hire a compliance officer to ensure that the company would follow India's national policies.
Speaker 1: It was also supposed to hire a grievance officer in charge of reviewing allegations, and a contact person who would respond to law enforcement messages as soon as possible. And according to the government, law enforcement received a complaint from an unknown or unnamed Twitter user who said that they were the victim of malicious messages that were posted on Twitter that violated these rules that I've mentioned, and that the company had not yet hired the officers that I mentioned earlier. This follows on the heels of some really big stories in India involving social media, including the government's attempts to suppress a farmers' protest within India. Also, Twitter had labeled some messages from Indian officials as being manipulated media, which caused a big mess in India. Here in the United States, social networking sites enjoy legal protection thanks to Section 230, which says that platforms are not accountable for the stuff that their users post to those platforms.
Speaker 1: But officials in India point out that it is unreasonable for a company to expect American laws and American legal protection while they're operating in a totally different country. It is too early to say how all this is going to play out, and whether or not Twitter executives will eventually have to stand charges for content posted on Twitter in India, content posted by, you know, other people. But I expect we're going to see some pretty big stories come out of all this, and not just with Twitter, but with any social networking site that is operating within India, an enormous market. It's not like these companies want to just pull out of India. However, there is a place where they are threatening to do just that, just not in India. So, speaking of American social networks facing big decisions in overseas markets, let's talk about Hong Kong. The local government there plans to enact some new rules that would change data protection laws within the city of Hong Kong, and the goal is to fight doxing.
Speaker 1: Now, that's the practice of a person sharing some other person's personal information without their consent, typically for malicious purposes, like to either encourage harassment or worse. So, for example, if I were to share Ben Bowlin's home address on this program, that would be doxing, and I suspect that's why Ben always forces me to wear a blindfold and sit in a car that then randomly drives around the city for two hours before I can ever visit him. Anyway, Hong Kong's rules would set fines per incident plus up to five years' imprisonment. And apparently, as the rules are written, they could apply to platforms, and that includes companies like Facebook, Twitter, and Google. So not just the people posting to them, but the platforms that carry the messages as well. The companies sent a letter back in June that just became public this week, and that letter warns Hong Kong officials that if the rules are going to go into practice, these companies will have no choice but to stop providing services within Hong Kong. So it's kind of an ultimatum. Doxing is a truly abhorrent practice.
Speaker 1: But just to be clear, and to get the full context of this: these rules seem to stem not from the way that private citizens have been affected in Hong Kong, but rather from the fact that, during the anti-government protests in Hong Kong back in twenty nineteen, some people were sharing private information about several police officers, and that information ended up making the rounds online. Now, again, this is a terrible practice. Like, you know, it's not good no matter which side is using it, and many of those police officers, and, you know, their families, ended up facing targeted harassment and worse. But I just wanted to give context that the rules don't seem to be in response to, you know, citizen concerns. It seems to be more of a response to try and clamp down on citizens, and a response to civilian challenges to the government's activities. So, in other words, I guess the short way of saying all that is: people are way more complicated than technology ever will be. We have a couple more stories to get through, but before I do that, let's take a quick break.
Speaker 1: Okay, we're back. So over in the United Kingdom, there is a semiconductor company called Newport Wafer Fab, or NWF, and it has a new owner. That owner is Nexperia, a company that is based in the Netherlands. However, that company has its own parent company, because things always get more complicated, and that parent company is a Chinese firm called Wingtech Group. NWF is the UK's largest silicon chip manufacturing company. The UK government might actually look into these things a little more closely, because the UK passed a law called the National Security and Investment Act specifically to review all acquisitions that could affect national security. In general, the UK government views the practice of a foreign company coming in to purchase big UK companies as being a bit risky. But so far there's been no official resistance to this acquisition. Resistance! That's an electricity pun about silicon chips, and I didn't even mean to make it. So we'll have to see if the UK government allows this to move forward with no further comment. It is a little weird.
Speaker 1: I mean, I find the way the UK government handles things to be odd anyway. I look at things like Brexit, which seems to indicate that England, or Britain, rather, the United Kingdom, wants to stand on its own and not be part of a larger collective, and yet it isn't necessarily blinking an eye when large foreign companies come in and acquire major companies within the UK. There just seems to be a disconnect, is all I'm saying. Just trying to understand it. Also in acquisition news, we've got some about Bugatti, which is a company that's known for its outrageously expensive supercars that very wealthy folks drive at very fast speeds. The company of Bugatti started more than a century ago, and I covered a bit about it in my episodes about the history of Volkswagen. The original run of Bugatti ended around the nineteen fifties or nineteen sixties. It followed the death of its founder, Ettore Bugatti. His son had passed away a few years earlier, and the company essentially kind of faded from memory.
Speaker 1: And in nineteen eighty-seven, an investor, a man named Romano Artioli, purchased the rights to the brand name Bugatti, so he was able to bring Bugatti back from, you know, like, the junkyard of history. Volkswagen would then later purchase those rights and launch Bugatti Automobiles in nineteen ninety-eight. So while the name Bugatti has been in the automotive world since, you know, nineteen oh nine, you could argue that the Bugatti automobile line has had sort of three distinct phases: the original run with the founder, the run under Artioli when he bought the branding, and then the Volkswagen run. Now an electric vehicle startup called Rimac has acquired a majority stake in Bugatti from Volkswagen. Mate Rimac, and I honestly don't know quite how to say his name, anyway, the founder of Rimac will lead the new organization, which will fittingly be called Bugatti Rimac. The deal was an all-stock deal, and according to Rimac, the Bugatti and Rimac departments will still operate separately as distinct brands. They're not going to be merged together. The Bugatti Rimac company will guide both brands.
Speaker 1: Essentially, it will make the decisions for each brand, but each brand will remain its own thing. The plan is to introduce an electric vehicle version of a Bugatti by the end of the decade, along with some hybrid models. Users of Audacity have noticed some changes since the company Muse Group acquired the open source app earlier this year. Audacity is a free-to-download and free-to-use audio recording and editing app. And in fact, it's what I use to record TechStuff when I'm working from home. Like today, I'm using Audacity. At work, we use Adobe Audition, in case you're curious. Audacity is popular because, one, it's free, and two, it has a fairly extensive set of features. Features that I hardly use, because there are a lot of them, and understanding what each one does takes a little bit of digging, so I've only kind of dipped my toe in, like, maybe two percent of the features that are available in Audacity.
Speaker 1: The latest version of Audacity, however, includes changes to the app's privacy policy, and those changes mention that the app will collect, quote, "data necessary for law enforcement, litigation and authorities' requests, if any," end quote. Which, huh, yikes. This is a piece of software that you use to record audio and then edit that audio. You probably wouldn't think of it as also passing information on to some other entity, right? But apparently that's the case. These changes prompted some folks to refer to Audacity as spyware, and there's some indication that we might see a fork in Audacity, as open source developers go back to earlier versions, pre-Muse Group acquisition of Audacity, and continue developing the software through the open source approach. And in case you're not familiar with open source, the idea is that you make code readily available so that people can see exactly how something is coded, and they can then take that code and, depending upon the licensing agreement you set up, build it into something of their own.
Maybe they make their own version of 314 00:20:03,400 --> 00:20:08,760 Speaker 1: the thing that code stands for, and they can then, 315 00:20:09,000 --> 00:20:12,840 Speaker 1: depending on the license agreement, market that themselves. This all 316 00:20:12,840 --> 00:20:17,399 Speaker 1: depends on how the open code licensing is approached. There 317 00:20:17,400 --> 00:20:18,880 Speaker 1: are a lot of different ways of doing that. I've 318 00:20:18,920 --> 00:20:22,240 Speaker 1: covered that on previous episodes. But the big benefit of 319 00:20:22,320 --> 00:20:25,919 Speaker 1: open source is that you have it open to the 320 00:20:26,080 --> 00:20:30,480 Speaker 1: entire world of developers, as opposed to, say, an internal 321 00:20:30,560 --> 00:20:33,440 Speaker 1: group working on a project. And when you open it 322 00:20:33,520 --> 00:20:37,639 Speaker 1: up to everyone, well, by nature, you end up getting 323 00:20:37,880 --> 00:20:40,800 Speaker 1: the best ideas. It may take some time for them 324 00:20:40,800 --> 00:20:43,560 Speaker 1: to all kind of rise to the top. But you're 325 00:20:43,560 --> 00:20:47,640 Speaker 1: going to benefit from the experience of the collective. So 326 00:20:47,960 --> 00:20:53,199 Speaker 1: it's a very effective way of creating powerful applications, but 327 00:20:53,480 --> 00:20:57,280 Speaker 1: it does come with this sort of limitation on how 328 00:20:57,359 --> 00:21:01,320 Speaker 1: you monetize it, or at least there are other factors 329 00:21:01,320 --> 00:21:05,840 Speaker 1: you have to take into consideration. Moving on, Atari, a 330 00:21:05,920 --> 00:21:09,399 Speaker 1: company that lays claim to an historic name in video 331 00:21:09,440 --> 00:21:12,640 Speaker 1: game culture, is making a change.
Now, before I get 332 00:21:12,680 --> 00:21:15,479 Speaker 1: into this, I should mention that the Atari of today 333 00:21:15,880 --> 00:21:19,000 Speaker 1: is not the same company as the Atari that was 334 00:21:19,080 --> 00:21:21,680 Speaker 1: part of the initial video game console boom of the 335 00:21:21,760 --> 00:21:25,560 Speaker 1: late seventies and early eighties. That version of Atari pretty 336 00:21:25,640 --> 00:21:30,280 Speaker 1: much doesn't exist anymore. Some of its intellectual property exists, 337 00:21:30,800 --> 00:21:34,760 Speaker 1: but you could think of the current Atari as being, 338 00:21:35,400 --> 00:21:40,240 Speaker 1: you know, a reincarnation, maybe. So the Atari of today 339 00:21:40,320 --> 00:21:43,160 Speaker 1: is trying to really establish itself in the modern gaming 340 00:21:43,280 --> 00:21:46,280 Speaker 1: sphere and announced that it's going to transition away from 341 00:21:46,280 --> 00:21:49,919 Speaker 1: developing free to play mobile games, which is an incredibly 342 00:21:49,960 --> 00:21:53,919 Speaker 1: competitive field, to focus more on making games for consoles 343 00:21:54,040 --> 00:21:57,760 Speaker 1: and the PC. These plans would potentially include developing new 344 00:21:57,840 --> 00:22:03,199 Speaker 1: games for the Atari VCS console, which was finally released 345 00:22:03,240 --> 00:22:07,000 Speaker 1: after multiple delays. This is a console that has a 346 00:22:07,320 --> 00:22:11,119 Speaker 1: throwback retro look to it. It plays classic Atari twenty 347 00:22:11,119 --> 00:22:14,560 Speaker 1: six hundred games, among other things; like, it technically can play 348 00:22:14,640 --> 00:22:17,160 Speaker 1: lots of other stuff, but it was designed to kind 349 00:22:17,160 --> 00:22:21,760 Speaker 1: of dip into that nostalgia for gamers who are, you know, 350 00:22:21,880 --> 00:22:25,119 Speaker 1: my age or older.
No word yet on how the 351 00:22:25,119 --> 00:22:28,560 Speaker 1: company is handling the planned opening of its string of 352 00:22:28,560 --> 00:22:31,000 Speaker 1: Atari hotels that we heard about a couple of years ago. 353 00:22:31,880 --> 00:22:34,200 Speaker 1: I actually wonder if that project is still on track 354 00:22:34,320 --> 00:22:37,560 Speaker 1: or not. The first of those hotels is supposed to 355 00:22:37,600 --> 00:22:40,520 Speaker 1: open in Las Vegas next year, and I know that 356 00:22:40,560 --> 00:22:42,840 Speaker 1: if I ever do return to CES, I'm 357 00:22:42,840 --> 00:22:44,520 Speaker 1: going to have to try and book a room in 358 00:22:44,560 --> 00:22:49,800 Speaker 1: that hotel, assuming, you know, that it exists. And finally, 359 00:22:50,160 --> 00:22:54,000 Speaker 1: has this ever happened to you? You find yourself in 360 00:22:54,040 --> 00:22:57,479 Speaker 1: a tedious situation, so you reach over and pick up 361 00:22:57,480 --> 00:23:01,040 Speaker 1: your phone for a distraction. Maybe you're standing in a 362 00:23:01,080 --> 00:23:03,000 Speaker 1: line and you just want something to take your mind 363 00:23:03,040 --> 00:23:06,240 Speaker 1: off the fact that you're five people back from being 364 00:23:06,240 --> 00:23:09,640 Speaker 1: able to order your coffee. Maybe you're watching a TV 365 00:23:09,760 --> 00:23:13,160 Speaker 1: show and it's just gone to commercial. Maybe you're listening 366 00:23:13,200 --> 00:23:15,679 Speaker 1: to a podcast and the bald guy is just droning 367 00:23:15,720 --> 00:23:19,520 Speaker 1: on about stuff, or maybe you're part of a government 368 00:23:19,600 --> 00:23:23,080 Speaker 1: body hosting a debate about matters that could potentially affect 369 00:23:23,480 --> 00:23:28,320 Speaker 1: thousands of people's lives.
That last one is a kicker, right? Like, 370 00:23:28,400 --> 00:23:31,480 Speaker 1: I mean, there's this general feeling that if you're at 371 00:23:31,520 --> 00:23:36,040 Speaker 1: an important debate, for example, let's say you are a 372 00:23:36,160 --> 00:23:40,160 Speaker 1: US politician in Congress, maybe you should be paying attention 373 00:23:40,240 --> 00:23:44,080 Speaker 1: to the proceedings, as, at least theoretically, the outcome of 374 00:23:44,160 --> 00:23:47,560 Speaker 1: them stands to impact countless people, many of whom you 375 00:23:47,640 --> 00:23:52,680 Speaker 1: might actually represent. And yet politicians, like everyone else, 376 00:23:52,920 --> 00:23:56,160 Speaker 1: can get a bit bored, and occasionally they will look 377 00:23:56,200 --> 00:24:01,120 Speaker 1: to their phones for distractions. But the question is how occasionally? 378 00:24:01,200 --> 00:24:05,480 Speaker 1: Like, how much or how little attention are the politicians 379 00:24:05,560 --> 00:24:09,920 Speaker 1: actually paying when it comes to, you know, running a government? 380 00:24:10,440 --> 00:24:15,440 Speaker 1: So an AI built by developer Dries Depoorter is scouring 381 00:24:15,480 --> 00:24:19,880 Speaker 1: through video of televised political debates in Belgium. It's doing 382 00:24:19,920 --> 00:24:23,439 Speaker 1: that to see how frequently people are looking at their phones. 383 00:24:24,400 --> 00:24:27,439 Speaker 1: So the AI is looking for signs that politicians are 384 00:24:27,440 --> 00:24:30,119 Speaker 1: looking down at their phones, or actively typing on their 385 00:24:30,160 --> 00:24:33,320 Speaker 1: phones, rather than paying attention to the proceedings at hand. 386 00:24:33,760 --> 00:24:38,399 Speaker 1: Then the AI takes screenshots of moments that it determines count 387 00:24:38,480 --> 00:24:41,800 Speaker 1: as, you know, politicians being distracted.
And it even 388 00:24:41,920 --> 00:24:45,360 Speaker 1: highlights the region showing the phone in the hands of 389 00:24:45,400 --> 00:24:49,480 Speaker 1: that politician, then posts those images to Twitter. So it's 390 00:24:49,800 --> 00:24:54,199 Speaker 1: essentially public shaming. It's saying, hey, you were 391 00:24:54,240 --> 00:24:57,119 Speaker 1: elected to do a job. That job might be boring, 392 00:24:57,440 --> 00:25:01,320 Speaker 1: but it's still important and you are entrusted to do 393 00:25:01,359 --> 00:25:04,840 Speaker 1: that job, so get off your gosh darn phone. Now, 394 00:25:05,400 --> 00:25:08,639 Speaker 1: I'm the first to say this does not sound entirely 395 00:25:08,720 --> 00:25:11,639 Speaker 1: fair to me, because we don't actually know why the 396 00:25:11,680 --> 00:25:15,000 Speaker 1: politicians are looking at their phones at any given time. 397 00:25:15,359 --> 00:25:18,520 Speaker 1: There could be cases where there is a legitimate reason 398 00:25:18,720 --> 00:25:21,720 Speaker 1: that isn't related to just, you know, checking out from 399 00:25:21,760 --> 00:25:25,560 Speaker 1: whatever political discussion is going on at the moment. Still, 400 00:25:25,880 --> 00:25:29,000 Speaker 1: I thought it was a fairly funny story to end 401 00:25:29,160 --> 00:25:33,199 Speaker 1: our episode on today. And that is it. That's the 402 00:25:33,200 --> 00:25:37,680 Speaker 1: news for Tuesday, July six, twenty one. We'll have more coming 403 00:25:37,760 --> 00:25:41,080 Speaker 1: up on Thursday. If you have any suggestions for topics 404 00:25:41,119 --> 00:25:43,680 Speaker 1: I should cover in future episodes of Tech Stuff, reach 405 00:25:43,720 --> 00:25:45,640 Speaker 1: out to me and let me know what those are.
406 00:25:45,680 --> 00:25:47,880 Speaker 1: The best way to do that is over on Twitter; 407 00:25:48,359 --> 00:25:51,880 Speaker 1: the handle we use on Twitter is at TechStuff 408 00:25:52,080 --> 00:25:56,800 Speaker 1: HSW. Now I will talk to you again really soon. 409 00:26:02,080 --> 00:26:05,120 Speaker 1: Tech Stuff is an iHeartRadio production. For more 410 00:26:05,200 --> 00:26:08,560 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 411 00:26:08,720 --> 00:26:11,879 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.