Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. This is the tech news for Thursday, August twenty-sixth, twenty twenty-one. Should be a pretty short and sweet episode. Let's get to it now.

Speaker 1: Last week, I reported that OnlyFans, which is a content platform most famous for hosting adult content and sexually explicit content, was promoting a streaming video app that would stick strictly to safe-for-work content, because that was the only way the company was going to get the app accepted into the Google and Apple app stores. But following that was a roller coaster of a story. First, the company said it was planning to change its content policies. Namely, OnlyFans initially announced that it would no longer allow sexually explicit content on the site, and it sounded as though nudity would not be forbidden, but anything expressly sexual would be. That precipitated a protest, which included the content creators who essentially made the site what it is, the argument being this site wouldn't exist without that content. Sex workers were among those protesting, along with other supporters criticizing the company's decision. Well, now OnlyFans has since reversed that decision and said this planned change in policy will not happen after all. It was supposed to happen October first, and now it's not going to happen. Moreover, the company said the whole reason for this proposed change in the first place wasn't necessarily because the company had shifted towards a more conservative philosophy, but rather it comes down to money, or maybe more precisely, it comes down to payment processing. So, according to this argument, at issue are companies like Mastercard and Visa. These are payment processing companies that facilitate financial transactions for thousands of companies, obviously, not just OnlyFans, but all around the world.
Speaker 1: Now, without these transaction services, it becomes increasingly difficult to handle financial transactions, particularly at scale, and so companies depend upon these very large entities to make the flow of money possible, because obviously without money there's nothing: creators want to be paid, and the site needs to make money. So these processing services hold a lot of power. And holding a lot of power over the payment processing services are various conservative groups that object to sexually explicit content. This is not the first time we've seen platforms either bow or attempt to bow to pressure from payment services; Patreon went through something very similar. There were also reports that OnlyFans was struggling to secure investment money, and that the issue of sexually explicit material on the site was the reason for that challenge. But at any rate, the protests seem to have swayed OnlyFans to change course yet again and reject the requirement for content creators to follow a new policy. Now, the reason I wanted to cover the story was not for its salacious nature. It wasn't because of controversy or to, you know, throw sex into the mix, but rather to highlight how politics and influence groups and money play a huge role in everything, including in the tech sector and on online platforms, and that we can't really look at any issue without taking the larger context into account, or else we miss important details. So while your personal stance on the appropriate or inappropriate nature of sexually explicit content is valid, like, that's part of your philosophy, and I do not wish to put my own thoughts on there, I would say that, you know, ultimately we're looking at a lot of people using a lot of influence to try and push things towards a specific agenda, and that's where the conflict comes out of. A lot of people reduced this to essentially saying that OnlyFans was turning its back on the people who had made the site what it was.
Speaker 1: I think there's an element of that that is true, but I think it is far more complicated than that summary.

Speaker 1: Moving on. Earlier this week, I talked about the ongoing semiconductor shortage and how we might see that issue affect stuff like, you know, the supply of various technologies and products. Well, The Wall Street Journal reports that the Taiwan Semiconductor Manufacturing Company, which is the largest semiconductor chip manufacturer in the world, is going to increase its prices, potentially by as much as twenty percent. Now, if you're gonna bump up your prices by a fifth, that's pretty significant. And most of these chips are not going to people like you and me. These are being purchased by other companies, which are then using those chips as components in other products, which can include anything from advanced technology to simple gadgets. But if the semiconductor companies are raising their prices and these other secondary companies are going to have to pay more money to buy a basic component, I bet you can guess where that bump in cost is gonna go next. Yeah, this very likely means that for a fairly wide range of products, we're likely to see prices start to go up. Otherwise, the companies that are selling this stuff will have to do so at a loss, and that's, you know, not great in a business plan. You don't want to sell things for less than what you paid to make them. And we've definitely seen this already happen in the automotive world, although that's a more complicated issue. You've got a limited supply of cars and you've got steady demand, and by the laws of supply and demand, if the supply is low and the demand is high, then prices tend to go up. That's just kind of how it works in that free market sort of approach. However, it would not surprise me to see this trend kind of affect a wider variety of technology in the months to come, which is not great news, especially as we move towards the holiday season. We will have to see.
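To give a rough sense of how a component price increase ripples into a retail price, here is a small back-of-the-envelope sketch in Python. Every number in it is made up purely for illustration; these are not TSMC's or any manufacturer's actual costs.

    # Back-of-the-envelope sketch: how a ~20% chip price increase can
    # ripple into a retail price. All numbers are hypothetical.
    chip_cost = 10.00    # hypothetical per-unit chip cost
    other_costs = 40.00  # hypothetical remaining build cost per unit
    margin = 0.30        # hypothetical 30% gross margin target

    def retail_price(chip):
        return (chip + other_costs) * (1 + margin)

    before = retail_price(chip_cost)
    after = retail_price(chip_cost * 1.20)  # the reported ~20% hike
    print(f"before: ${before:.2f}, after: ${after:.2f}")
    print(f"retail increase: {(after / before - 1) * 100:.1f}%")
    # before: $65.00, after: $67.60 -- about a 4% retail bump when the
    # chip is one-fifth of the build cost.

The point of the sketch is just that the pass-through scales with how much of the build cost is silicon, which is why chip-heavy gear would feel this more than simple gadgets.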
Speaker 1: One group of policies that has had an enormous impact on business in general, but the tech world in particular, is the GDPR, or General Data Protection Regulation, in the European Union. These rules restrict how companies can collect and use the personal information of EU citizens, and they include numerous protections for citizens of the EU in an era where data collection is pretty much the name of the game. This is why a lot of sites include those little pop-up notifications about cookies and tracking and whatnot, and allow you to opt out of that or to leave the site before, you know, the data collection begins. Well, the UK famously said peace out to the EU, and now Oliver Dowden, the Culture Secretary for the UK, says that the nation is going to take a slightly different path from the EU and leave at least parts of the GDPR regulations behind. However, this is all much easier said than done, because while the UK might not be part of the EU anymore, it definitely has lots and lots and lots of transactions with the EU, frequently. I mean, as much as a small subsection of Brits might like to think that they are a world unto themselves, something that a lot of Americans can identify with, the fact is there are actually a lot more people outside of the UK than there are in it. Anyway, whatever new rules take the place of GDPR in the UK will have to meet EU approval, or else the UK might see the various data channels connecting it to the continent kind of shut down. The UK has named John Edwards, who currently is the Privacy Commissioner of New Zealand, as a potential candidate to take over the job of Information Commissioner, which would start in November of this year, and he would then oversee the drafting of new rules.
Speaker 1: It will likely be a pretty complicated process to make certain that the new rules are compatible enough with GDPR, protect British citizens adequately, and, you know, kind of remove some of the stuff that the UK sees as being unnecessary or counterproductive. And I wish them luck in those endeavors. I have a few more news stories, but before we get to that, let's take a quick break.

Speaker 1: BuzzFeed News published an article that reveals that twenty-four countries have made use of Clearview AI's facial recognition technology. And you may have heard me talk about Clearview AI before. That company has actually run into some trouble with various online platforms after using technology to scrape those platforms for publicly posted images in order to build out an enormous facial recognition database without the consent of the people whose, you know, images they collected. And you might say, well, if someone posts a photo publicly on their Facebook profile, like if their profile is public and they post it to the public setting, then it's fair game because it's all public, right? But what if that photo includes other folks in it? It's not like just a selfie. What if a friend of yours takes a photo that has you in it, and then they post that photo on their public Facebook profile, and then Clearview AI scoops up that photo, and now your face is also in that database, even though you never gave consent? You don't publicly post pictures, at least in this hypothetical situation, and so you never had an option to opt in or opt out of this. It just happened because a friend of yours decided that that picture just needed to go up on that Facebook profile. All of that just stinks, right? I mean, this affects people who aren't even on Facebook, right? You've got people who just know people who are on Facebook, and they happen to show up in pictures, and they had no say in any of this.
Speaker 1: This is just one of the many problems that privacy rights advocates have with Clearview AI. Another big one is that facial recognition technologies are notoriously prone to bias, and this frequently means the tech is just plain bad at recognizing images, specifically of non-white people, and this can lead to false positives. That has actually happened in numerous cases around the world, where police have acted upon incorrect facial recognition hits. So this disproportionately affects people of color. It is, it is bad. BuzzFeed's report shows how widely used this tech is within various institutions around the world, including agencies that are funded by taxpayers. So, in other words, we're all paying to be spied upon and potentially misidentified. There's been a growing movement around the world to restrict facial recognition technology use, especially in investigations, and some communities outright ban police use of that technology. Honestly, I'm all for banning facial recognition tech for those kinds of investigations. And I've said this before: relying on technology to solve a social problem when the tech just isn't there yet, that's just a recipe to make a bad social problem even worse, or to create new social problems that interact with existing ones. Not a good thing. Tech is not a solve-all approach to all of our problems, and relying on tech to be that is a recipe for disaster. I see the same thing playing out with climate change, where people are saying, "I'm not worried, they're gonna come up with tech to solve this problem." There's no guarantee we'll do that. There's no guarantee it will be on a timeline that will actually make a difference. You cannot use, you know, tech as some sort of literal deus ex machina to come in and rescue us.
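To put rough numbers on the false-positive point above: even a matcher that sounds very accurate will flood investigators with wrong hits when it searches a database of billions of faces. Here is a toy base-rate sketch in Python; the database size is the order of magnitude Clearview has reportedly claimed, and the error rate is entirely hypothetical, chosen only to show the scale effect.

    # Toy base-rate sketch: why searches against a huge face database
    # generate lots of false positives. Rates here are hypothetical.
    database_size = 3_000_000_000  # order of Clearview's reported scrape
    false_match_rate = 1e-6       # hypothetical: 1 false match per million comparisons

    expected_false_hits = database_size * false_match_rate
    print(f"expected false hits per search: {expected_false_hits:,.0f}")
    # ~3,000 wrong faces per lookup -- and if the error rate is worse
    # for non-white faces, those false hits skew accordingly.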
Speaker 1: All right, soapbox set aside. But sticking with facial recognition tech, The Register reports that a South Korean government agency called the Personal Information Protection Commission, or PIPC, found that Facebook created facial recognition templates for two hundred thousand South Koreans without first securing proper consent, and that this happened between April two thousand eighteen and April two thousand nineteen. The government has since fined Facebook six point four six billion won. Now, that's about five and a half million dollars, and let's be honest, that is a lot of money. Five and a half million dollars, that's a ton of money. I would be gobsmacked to get five and a half million dollars. But for Facebook, that probably doesn't even amount to a catered dinner. Anyway, the organization also told Facebook that it would need to destroy the facial recognition info the company had developed for South Koreans, and then if it wanted to rebuild it, it would first have to obtain consent from each person before starting back into that process. It also issued a warning to Google and told the company to make its privacy policies and settings more transparent. And Netflix got hit with a two hundred twenty million won fine, or just under two hundred thousand dollars, for collecting the data of users without first gaining their consent. So we're seeing more pushback in this field. But again, those amounts, I mean, they might make sense within the context of South Korea, but they are so small in the grand scheme of these companies that, while, you know, no one ever wants to pay a fine, it's barely an inconvenience.
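For a quick sanity check on those conversions, here is a short Python sketch. The exchange rate is an assumed 2021 ballpark of roughly one thousand one hundred seventy won per dollar, so treat the dollar figures as approximations, not official amounts.

    # Rough won-to-dollar conversion for the fines above.
    KRW_PER_USD = 1_170  # assumed approximate 2021 exchange rate

    facebook_fine_krw = 6_460_000_000  # 6.46 billion won
    netflix_fine_krw = 220_000_000     # 220 million won

    print(f"Facebook: ${facebook_fine_krw / KRW_PER_USD:,.0f}")  # ~ $5.5 million
    print(f"Netflix:  ${netflix_fine_krw / KRW_PER_USD:,.0f}")   # ~ $188,000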
Speaker 1: The EU is looking to chip away at some of the features that cryptocurrency enthusiasts happen to like a lot, namely that whole bit about privacy, you know, being able to make transactions without everyone knowing your business. The EU wants companies like Coinbase and Kraken, so essentially companies that handle, like, digital wallets and stuff, to collect information on people who are conducting cryptocurrency transactions. And this gets really interesting to me, because it actually shows how there are some conflicting philosophies at play in the EU when it comes to digital information. Because on the one hand, you have some fairly extensive data protection laws on the books that are meant to help citizens protect their privacy and their security, and to know who has their data and how they are using it. But on the other hand, well, you know, they want to keep track of these transactions, and thus that means giving up some privacy in order for that information to be traceable. And there's a couple of reasons for that. One is that they're clearly thinking about taxation, right? You want to tax transactions that otherwise are going hidden. Another is that a lot of criminals use cryptocurrency in an effort to launder money from illicitly gained sources. That's something the authorities would clearly want to know about. They'd want to be able to trace that. So there are some legit reasons for countries to want to do this. This rule would require crypto companies to make these transactions traceable, and it would bring cryptocurrency kind of in line with the rules that guide other banking and financial institutions, including the fact that you would not be allowed to have an anonymous digital wallet. Like, digital wallets would have to be tied to specific individuals, and that would need to be traceable.

Speaker 1: And finally, the search business is seriously big business, like, like, wicked big. And one way to try and wrap your head around how big search is, in the sense of, like, how much money it makes, is to look at how much Google will pay Apple to allow Google Search to be the default search engine for Safari, and thus the default search engine for Mac and iOS devices.
Speaker 1: Now, back in twenty twenty, Google reportedly coughed up ten billion dollars for that privilege, and this year, according to an analyst firm called Bernstein, the amount might be closer to fifteen billion dollars, and it would just increase from there, so next year would probably be closer to eighteen to twenty billion dollars. That is a princely sum indeed. Now, the analysts say this isn't set in stone, and Google might ultimately decide that the expense it pays is not worth the benefit it gets, or you might have a regulatory agency step in and say, hey, this doesn't seem like it's a good practice. It seems anticompetitive, and that would mean that you're falling into some, you know, trends of monopolies, and you start getting into illegal territory. But what it really shows you is that Google is definitely an ad company, right? Like, that is Google's business. Search just happens to be the prime way that Google gets eyeballs on advertisements, and that's what generates revenue for the company. So, yeah, Google provides a search service, but its business is in advertising. So I guess it's true that you've got to spend money to make money, and that applies even when you're looking at multibillion-dollar corporations. Now, if, like me, you're wondering how much Google pulls in from its search business, like, what is justifying this incredible expense, well, in twenty twenty the company reported revenues of a hundred four billion dollars in the "search and other" category, and that in turn was more than half of all of Google's revenue for that year. So yeah, these are the big leagues we're talking about. Still blows my mind that it's that big, though.
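As a quick scale check on those figures, here is a short Python sketch. The search revenue and deal estimate are the rounded ballpark figures mentioned above, and Alphabet's roughly one-hundred-eighty-two-billion-dollar total for twenty twenty is pulled in as an assumed comparison point from its public filings.

    # Quick scale check on the search-deal numbers. Revenue figures are
    # rounded public ballpark values; treat them as approximations.
    search_revenue_2020 = 104e9      # reported "Search and other" revenue, 2020
    alphabet_revenue_2020 = 182.5e9  # assumed Alphabet total revenue, 2020
    apple_payment_2021 = 15e9        # Bernstein's estimate for this year's deal

    print(f"search share of total: {search_revenue_2020 / alphabet_revenue_2020:.0%}")  # ~57%
    print(f"deal vs. search revenue: {apple_payment_2021 / search_revenue_2020:.0%}")   # ~14%
    # Search is more than half the business, and the Apple deal eats a
    # double-digit share of that revenue just to stay the default.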
Speaker 1: And that's it for the news that I have for you guys on Thursday, August twenty-sixth. We'll be back next week with more tech news and episodes. If you have suggestions for topics I should cover on TechStuff, reach out to me on Twitter. The handle is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.