Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and today it's the tech news for Tuesday, June twenty twenty-one. My voice is still blown out from a birthday weekend, so my apologies for that, but let's get to actual tech news.

Speaker 1: South Korea is the latest battleground that sees enormous companies from different tech sectors locking horns with each other. Recently, a South Korean court passed the buck on a decision about whether or not streaming services like Netflix should pay bandwidth usage fees to Internet service providers. Alright, so here's the skinny on what's really going down. On the one side, you've got the ISPs, the Internet service providers, and they're saying that services like, you know, Netflix, YouTube, Hulu, these streaming services place a particularly heavy demand on bandwidth. That traffic takes up a lot of bandwidth on the ISP networks, and thus these companies should pay a fee to the ISPs to offset the cost of providing that bandwidth to them. But Netflix and other streaming services have argued that ISP customers, you know, the people on the actual end, the people who are choosing to go to Netflix or YouTube or whatever, they already pay a subscription fee for their services, and that any ISP that would charge content providers for delivering it, and the end user on the other side for ordering it, that amounts to double dipping. The company is charging twice for the same service; it's just that they're charging two different entities for the same thing. And I want to be clear here: both the ISPs and Netflix and other major streaming services, they're all huge companies, right? These entities are really mostly concerned about making money and then keeping as much of that money as they possibly can.
Speaker 1: So there are no noble good guys in this story, in other words. Now, here in the United States, advocates have long protested the idea of an ISP charging content companies for bandwidth usage. They argue that this is unfair and that net neutrality, which among other things promotes the idea that all content is equal on the Internet, demands that ISPs not prioritize services or levy bandwidth fees like that, because it would discourage those companies from offering those services. In addition, at least here in the US, we also have the issue that ISPs typically are also content providers. If the ISPs are charging third-party companies like Netflix or YouTube or whatever, but they're not having to deal with that themselves as they go forward and offer services, that creates an unfair advantage. It becomes anti-competitive; it gets really messy. Well, the South Korean court essentially said, y'all made this mess, go figure it out yourselves, and that gives the ISPs in South Korea the ability to negotiate with Netflix and other streaming services to demand these sorts of bandwidth fees. That likely means that these companies are going to have to pay some sort of bandwidth fee if they want their traffic carried by the ISPs in South Korea, which they do. South Korea is a big market. But again, the streaming companies are huge. Some of them are part of the largest media companies in the world. You look at companies like NBCUniversal, owned by Comcast; like, it's huge. And it would shock me if this all just ends with negotiations and that's that. I'm certain we're going to see more legal challenges follow as big, powerful companies throw their weight around in an effort to eliminate bandwidth fees.
Speaker 1: Honestly, watching this stuff sort of feels like I'm tuning into Godzilla versus Kong, and in more ways than one, because the end consumer tends to be the biggest victim of these kinds of struggles. No matter what happens, someone typically is going to raise the cost of their subscription service in order to cover whatever the additional cost is of providing it. Right? So if Netflix has to pay a bandwidth fee in order to be carried across South Korean ISP networks, you can bet that Netflix will also increase its subscription fee for South Korean customers, because they love to pass the costs on to you. That's just kind of how capitalism works.

Speaker 1: Moving on. Over at YouTube, there was a recent kerfuffle regarding a channel ban that raised some eyebrows. The platform placed a quote unquote permanent ban on a channel called Right Wing Watch. Now, this channel posts videos that examine and document and critique extremist political conservatives. YouTube claimed that the ban was just a mistake, and that the channel, quote, "upon further review, has now been reinstated," end quote. But this has prompted some questions from multiple angles. YouTube automates much of its review process. I mean, it has to, because there's something like five hundred hours of YouTube content uploaded every single minute. There's just no way that human beings could review every single piece of content uploaded to the platform and determine whether or not it meets the platform's policies. So that means that algorithms are in charge of making the early call as to whether a channel or a specific video is in violation of YouTube's policies, and those policies appear to be the ones regarding the spread of stuff like misinformation and hate speech. But Right Wing Watch's purpose is to document these sorts of things, not to generate hate speech or promote hate speech, but rather to say, this is someone who said this thing, and we are making a document to prove that this is what has happened. So it's more about holding people accountable for their words and actions; it's not about promoting those as being, you know, valid points of view. And so it appears that YouTube's algorithms have a hard time distinguishing between channels that violate the platform's rules and channels that just try to hold those other channels accountable. And I think this illustrates how people are really complicated and we just don't have algorithms sophisticated enough to keep up with that.
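To make that failure mode concrete, here's a toy sketch in Python. It's entirely hypothetical and not YouTube's actual system: a naive flagger that matches banned phrases in surface text has no way to tell a channel that quotes extremist language in order to document it from a channel that promotes it.

```python
# Toy illustration of why naive automated moderation misfires.
# Hypothetical sketch; YouTube's real systems are far more complex.

FLAGGED_PHRASES = {"example extremist slogan", "example slur"}  # stand-in terms

def naive_flagger(transcript: str) -> bool:
    """Flag a video if its transcript contains any banned phrase.

    This rule has no notion of context, so quoting a phrase in order
    to document or criticize it triggers the same flag as using it.
    """
    text = transcript.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

# A channel promoting hate speech:
promo = "We believe example extremist slogan and you should too."
# A watchdog channel documenting that same speech:
watchdog = 'On Tuesday, the speaker said, quote, "example extremist slogan."'

print(naive_flagger(promo))     # True  -- correctly flagged
print(naive_flagger(watchdog))  # True  -- false positive: documentation, not promotion
```

Any real moderation pipeline is far more sophisticated than this, but the underlying problem, that intent and context live outside the text being matched, is the same one Right Wing Watch apparently ran into.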
Speaker 1: Speaking of algorithms using a sort of brute-force approach to human interactions, let's talk about Amazon. According to a report in Bloomberg, the company has been relying on algorithms to determine whether or not Flex drivers are worth keeping around, which is kind of a big yikes. Alright. So, in order to meet the demand of its customers, Amazon uses subcontracted drivers to supplement Amazon's normal delivery channels. In other words, Amazon has way more stuff to ship out than its normal channels would support, so they end up hiring on additional subcontracted drivers. These drivers can actually sign up to participate in the service, and then they have the ability to set shifts, saying, you know, I can drive on Mondays, Wednesdays, and Fridays, for example. I can do all that through interacting with an app. But these drivers are also held accountable by this automated system, and it tracks their performance in various ways, including customer feedback. And apparently a single outlying issue might cause enough of a blip in a person's standing to prompt the algorithm to terminate that person's employment. So let's say you're a driver and you're doing everything correctly. You know, you're handling the packages with care, you're going places, but you encounter something that prevents you from being able to make a delivery on time. In some instances, it was that the person who was supposed to receive the package was living in an apartment complex that was behind a gate. The gate was locked, and the driver was unable to contact anyone to open the gate so that they could make the delivery, and then they ended up getting a black mark against their own record, the driver did. And despite their efforts to improve their rating by, you know, doing their job, they ultimately got fired by computer code, as it were. An algorithm itself terminated that person's employment. Something outside of a driver's control can lead to enough of an issue to trigger the algorithm, even if that driver did subsequently work hard to improve their rating. And when this happens, drivers really only have two options. They can accept the algorithm's decision, essentially take their walking papers and that's that, or they can pony up a two-hundred-dollar fee if they want to try and dispute the termination. So yeah, these Flex drivers have to pay two hundred bucks to have a termination reviewed, and that does not guarantee that the decision will get overturned. Amazon claims that the complaints about this system represent a very small percentage of drivers and that it's not representative of how most Flex drivers interact with the system. But yeah, seems kind of suss in my opinion.
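To see how brittle that kind of automated scoring can be, here's a minimal hypothetical sketch. The weights, threshold, and names are all invented; nothing here comes from Amazon. The point is just that if one bad event is weighted heavily enough, a single incident outside the driver's control can push an otherwise spotless record below a termination cutoff.

```python
# Hypothetical sketch of a brittle automated-rating rule.
# The weights and threshold are invented for illustration; Amazon's
# actual Flex scoring system is not public.

from dataclasses import dataclass

TERMINATION_THRESHOLD = 0.70  # assumed cutoff, below which the "algorithm" fires a driver

@dataclass
class DriverRecord:
    deliveries: int = 0
    failures: int = 0   # e.g., a package undeliverable behind a locked gate

    def log_delivery(self, success: bool) -> None:
        self.deliveries += 1
        if not success:
            self.failures += 1

    @property
    def standing(self) -> float:
        # Each failure is weighted 10x a success -- the invented "heavy weight"
        # that lets one outlier sink an otherwise clean record.
        if self.deliveries == 0:
            return 1.0
        return max(0.0, 1.0 - 10 * self.failures / self.deliveries)

driver = DriverRecord()
for _ in range(30):
    driver.log_delivery(success=True)   # a month of flawless work
driver.log_delivery(success=False)      # one locked gate, not the driver's fault

print(f"standing = {driver.standing:.2f}")
print("terminated" if driver.standing < TERMINATION_THRESHOLD else "retained")
```

Under those assumed numbers, thirty flawless deliveries plus one locked gate lands at about 0.68, just under the cutoff, and the only recourse left is that two-hundred-dollar dispute fee.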
Speaker 1: Last week, Canada's government proposed a law that would make online hate speech a crime that could be punished with fines of up to twenty thousand dollars Canadian, which is, you know, a bit more than sixteen thousand US dollars, and that's for the first offense. For a second offense, that fine goes up to fifty thousand dollars Canadian, or around forty thousand US.
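For reference, those US figures are just the Canadian amounts converted at roughly the mid-2021 exchange rate, about 0.80 US dollars per Canadian dollar (an assumed round rate for illustration):

```python
# Back-of-the-envelope conversion behind the figures above.
# 0.80 USD per CAD is an assumed mid-2021 rate, not an official one.
USD_PER_CAD = 0.80

fines_cad = {"first offense": 20_000, "second offense": 50_000}

for offense, cad in fines_cad.items():
    print(f"{offense}: CAD ${cad:,} ~= USD ${cad * USD_PER_CAD:,.0f}")
# first offense: CAD $20,000 ~= USD $16,000
# second offense: CAD $50,000 ~= USD $40,000
```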
Speaker 1: The law would also extend safe harbor protections to platforms. So, in other words, let's say I were to post a video containing hate speech directed at some group on YouTube. I never would do this, but for a hypothetical situation, let's say that I did. I would be the one held accountable; I could face punishment. However, YouTube would not be on trial. In the United States, this is similar to our Section 230 law, which says platforms cannot be held legally responsible for the content posted by their users. The Canadian government has said that this law is not going to target just anyone who has something negative to say, but rather genuine expressions of hate speech. So, generally speaking, speech meant to intimidate, discriminate against, and otherwise harm some group of people, or, as the proposal states, speech quote "motivated by bias, prejudice, or hate based on race, national or ethnic origin, language, color, religion, sex, age, mental or physical disability, sexual orientation, gender identity or expression, or any other similar factor," end quote. The government pointed out that victims of such speech quote "lose their freedom to participate in civil society online," end quote. Now, the timing of the proposal has been seen to be a bit odd, in the sense that in Canada the House of Commons, a major part of the Canadian government, has adjourned for the summer. The government says that the bill is going to be reintroduced later this fall, though that's probably only going to happen if the Liberal Party remains in power in Canada. Some conservatives have said that this timing is a bit suspect, and that perhaps the Liberals never really intended to pursue it as an actual law, but rather introduced it mainly for the purposes of scoring political points ahead of an election. Some of those conservatives have also argued that the proposals are a threat to the freedom of speech, though it seems pretty clear to me that the Liberals who proposed this legislation already considered that and said hate speech itself is a threat to freedom of speech, because it intimidates the targets of that hate speech and it prevents those targets from expressing themselves. So it is in itself a threat to the freedom of speech. And as I said earlier, people are complicated.

Speaker 1: Hey, do you remember the SolarWinds hack?
Speaker 1: That's the one that affected thousands of computers and stole data from some pretty darn big companies. Well, the hacker group behind it, known as Nobelium, has attacked Microsoft again. This time, the hackers targeted a computer that belonged to a Microsoft customer service agent. The hackers got hold of sensitive data relating to Microsoft customers and then launched targeted attacks against those customers. Now, apparently the hackers were able to access information through the second half of May, according to Microsoft. Included in the hacked information were billing contact info for customers as well as which products and services those customers were paying for, which is pretty useful information if you wanted to create a targeted attack on those customers. You could create a really convincing phishing attack, because you could be specific about which products the customers were using. So you could say, like, you know, we have an update to your Microsoft server software, and have an attachment that contains malware on it. Microsoft had to send out emails to customers urging them to be cautious with any incoming communications purporting to be from Microsoft. It's always good to double-check that any email message you get is legit. Looking at the return email address is a good way to start. It's not the only thing you should do, but it's a good start. You should also just not click on links or open attachments or anything like that from an email that doesn't appear to be legit.
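As a tiny illustration of that first check, here's a hypothetical Python sketch that parses a message's From header and compares the sender's domain against domains you actually expect. From headers can be spoofed, so this is a first-pass filter that supplements, not replaces, the other precautions.

```python
# Minimal first-pass check on a sender's address.
# Illustrative only: From headers can be spoofed, so passing this
# check does not prove a message is legitimate.

from email.utils import parseaddr

EXPECTED_DOMAINS = {"microsoft.com", "email.microsoft.com"}  # assumed example list

def sender_domain_looks_expected(from_header: str) -> bool:
    """Return True if the From header's domain is one we expect."""
    _display_name, address = parseaddr(from_header)
    if "@" not in address:
        return False  # malformed or empty address
    domain = address.rsplit("@", 1)[1].lower()
    return domain in EXPECTED_DOMAINS

print(sender_domain_looks_expected("Microsoft <account-security@microsoft.com>"))  # True
print(sender_domain_looks_expected("Microsoft <support@rnicrosoft-update.biz>"))   # False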
Speaker 1: The company says that so far it has no evidence that any customers have actually been compromised through a follow-up attack. We will hope that that continues. And from Microsoft, let's head over to Apple, a company that has had a few issues with information leaks getting out into the public before the company was able to wow an audience with a "one more thing" reveal. So Apple's reputation for innovation includes how the company unveils new products, but that can all be ruined with a leak to the public. And so now, according to Front Page Tech, Apple is requiring some employees who belong to certain product teams to wear body cameras similar to, or perhaps even identical to, those worn by law enforcement. Apple has been cracking down on known sources of leaks, as well as driving home the idea that Apple employees are not to share information with people known to have published Apple leaks. The body camera thing seems more than a bit extreme to me. It sounds incredibly invasive, and, uh, I guess I'm not surprised, but I am disappointed.

Speaker 1: And that's the news for Tuesday, June twenty twenty-one. I will be back tomorrow with some more acronyms and initialisms. Should be closing that out, I hope, with one last episode, and more news later this week, and then who knows what I'll talk about. Well, maybe you know. If you have a suggestion for a topic I should cover on a future TechStuff, let me know on Twitter. The handle for the show is TechStuffHSW. Some of you have been reaching out. It's been great. Keep up the good work, because that's really helpful for me. And I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.