Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech, except for tech news. Today is a tech news day, for Thursday, October 1. And y'all, the tech news can really, well, it's just like reading the news in general: it can really beat you down after a while. We're going to focus on Amazon and Facebook in this news episode, because a lot of stuff has come out about both companies, and for the most part it's not great. So let's start with Amazon.

Speaker 1: Reuters reports that Amazon has been found to game the system in a really big way, leaning heavily on the company's ability to gather and analyze data at a huge scale. In this particular case, we're talking about Amazon's operations in India. Reuters came into possession of various internal documents and emails showing that Amazon employees were analyzing customer behaviors on the platform. Okay, no problem so far. You would expect Amazon to analyze customer behaviors, because the company wants to streamline the experience and make it as useful as possible so that people keep coming back to Amazon. So some analysis makes sense. Amazon was also looking at which products were particularly popular. Still no problem. If you identify which products are popular, you can feature those and thus sell more units, so that's not necessarily a bad thing either.

Speaker 1: However, company employees designated these as reference products and then set about making Amazon knockoff versions of those things. Now, they wouldn't go so far as to actually counterfeit something, but they did make products that were virtually identical to existing ones on the market. So this wasn't Amazon saying, hey, you know, recharging cables are really popular, let's make our own recharging cable. This was Amazon saying, hey, this very specific recharging cable is really popular.
Speaker 1: Let's make a version that is indistinguishable from this one, sell it at a lower cost, and then promote it, and thus essentially undercut the existing vendor who is dependent upon Amazon as an online shopfront. On top of all this, Amazon used its algorithm to place the Amazon-made versions of these products higher in search. So now someone searching for a specific product, whether it's a piece of clothing, an electronics gadget, or a household item, is more likely to come across the Amazon-branded version before whichever one Amazon employees were using as a reference product. You can find something that is almost identical to a very popular product, except it's branded by Amazon, and because of the algorithm you're seeing the Amazon one first, and there Amazon is essentially undercutting these other merchants. Now, I'm pretty sure you would be hard pressed to come up with a way to defend Amazon against a charge of being anticompetitive in this case. The internal documents showed that company employees wanted to displace existing products on the market.

Speaker 1: According to Reuters, the documents also showed that employees even sought out partnerships with the companies making the reference products, because some of those products had final touches or unique processes that set them apart and made it harder to produce a reasonable copy. So employees needed to find out what those unique processes were in order to replicate those products for Amazon. That's pretty much industrial espionage when you get down to it. Again, this is all according to Reuters, and it also contradicts claims that Jeff Bezos made when he appeared before the United States Congress in 2020.
Speaker 1: During that testimony, Bezos argued that Amazon doesn't use the data it collects to give its own branded products a leg up on other sellers, which would be anticompetitive, nor does the company use its algorithm to boost Amazon brands in search results at the expense of other brands. But hey, maybe he just meant to add "except in India" and forgot to say that part. Even if that's the case, the Reuters report is already stirring up activity in US politics. Senator Elizabeth Warren sent a tweet saying, quote, these documents show what we feared about Amazon's monopoly power: that the company is willing and able to rig its platform to benefit its bottom line while stifling small businesses and entrepreneurs. This is one of the many reasons we need to break it up, end quote. Similarly, the American Economic Liberties Project criticized Amazon, pointing out that there have been reports from various sellers alleging that the company has copied product designs, indicating that Amazon had likely used its powerful data collection and analysis tools to identify products that could be big sellers and then followed a similar path to the one we saw in India. That organization cited an article the Wall Street Journal published in 2020 alleging that Amazon employees had done pretty much the same thing over here that was going on in India.

Speaker 1: Oh, but wait, we are not done reporting on Amazon just yet. See, Amazon also recently scrapped plans to build a distribution center in San Diego County, California. Now, the company has not cited a specific reason for backing out of this plan; it was not a done deal, but it was heading in that direction. Critics, however, point to how San Diego recently passed legislation called the Working Families Ordinance.
Speaker 1: Now, that law would require companies that operate within San Diego County to pay the prevailing wage, and the prevailing wage takes union wages as the benchmark. It also requires that any company that employs people within San Diego County, any company that operates there, has to provide a baseline level of benefits and worker safety guarantees. In other words, if you don't meet these basic criteria, you cannot operate within San Diego County. So these critics conclude that it looks like Amazon said, huh, if we build our distribution center in San Diego County, we would have to pay people more money, we would have to spend more money to make sure that employees are safe, and we'd have to give them a minimum of fifty-six hours of annual sick leave. No thanks.

Speaker 1: Now again, I should stress, Amazon has not referenced the law when addressing the fact that the company is no longer building a distribution center in San Diego County. But based on other recent activities we've seen, in which Amazon has taken a pretty tough stance against efforts at worker organization and unionization, I feel like the critics who are pointing at this might be onto something here. The company behind the development of the warehouse, essentially the real estate company, Chestnut Properties, sent out a message that also gets me a bit angry. That message says, quote, just the threat/mention of this ordinance has already cost over four hundred great jobs for the Weld property that I have been working on for over five years, end quote. Now, that seems to argue that the ordinance is actively harming the community because it's discouraging businesses from moving in and creating jobs, and that this therefore is a bad thing.
Speaker 1: Now, keep in mind, the ordinance demands a baseline level of compensation and benefits for employees, and without that legislation, companies like Amazon could be paying out much lower wages with fewer benefits and fewer guarantees of worker safety. I would argue that that kind of job does not qualify as a quote unquote great job, as Chestnut Properties has argued. I mean, it's a job, so it is a job. But if it's a job that keeps workers living below the poverty line, and workers are on the hook even if they get sick or injured, that's not a great job. That's more like indentured servitude. It might have been great for the developer. It might have been great for Chestnut Properties, because they stood to make a lot of money by essentially selling off this space for a distribution center. But that fell through, so no, it's not great for them; I get that.

Speaker 1: I've said it a few times on this podcast: I sense that there is a general labor movement growing in the United States, and it's something that has been overdue, because there have been a lot of big companies that have grown exponentially at the expense of the folks who work for those companies. One of the frequent things you'll hear is that it's hard to look at a company that pays its employees at such a low level that the employees can't afford whatever it is the company makes. That's not necessarily the case with Amazon, since Amazon makes a lot of stuff and a lot of it is super cheap, but you get the idea. Take the argument about paying out, say, fifteen dollars an hour. Well, that's pretty much the minimum wage in California, and California is a state with a very high cost of living. So while that might sound great if you're in a place with a very low cost of living, in California, by comparison, it does not mean you're making a ton of money.
Speaker 1: And if you're in an environment that could potentially lead to you getting sick or hurt, and you have no real benefits to help you recover from that, that's not great. So, a trio of Amazon stories that are pretty rough. When we come back, we're gonna do the same thing all over again, except this time we'll talk about Facebook.

Speaker 1: Okay, so, yeah, Facebook, like I said on Tuesday's episode, had a bad week last week. Well, that bad week has stretched into a couple of bad weeks. You probably know that the company has been dealing with consequences after some employees leaked internal documents to reporters, law enforcement agencies, and political bodies like the United States Congress, and next week it will be regulatory bodies in the European Union. Well, when that happens to a company, there are a few things the company can do, a few different courses of action it can pursue. One thing a company could try to do is address systemic problems within the organization. It can try to operate in a more ethical, transparent, and responsible way. It can try to improve, in other words. Or it can do what Facebook has done and try to just clamp down on leaks by instituting policies that limit access to certain internal group assets on the company's internal communications platform.

Speaker 1: Facebook uses Workplace. I've also used Workplace; it's a platform for all sorts of internal company processes. I'm actually amazed that folks have time to access the darned thing to, you know, have conversations. I find navigating it somewhat tedious, but that's my own experience. At Facebook, it is a way for employees to access and share information internally within the company. Now, in the past, employees were pretty much able to communicate across different topics within Workplace without issue. They could jump into different channels, essentially, dedicated to specific issues or concerns or corporate areas of interest.
Speaker 1: But now, certain groups that focus on specific topics, ones that Facebook has labeled as sensitive, have instituted a members-only approach. In other words, it's a need-to-know sort of thing: if you are part of that department, you have access; otherwise, you do not. Multiple outlets, including The New York Times, have reported on this, meaning that the effort to stop leaks was itself immediately leaked to the media. So what does that mean for Facebook? Well, I would say it probably means that, for one thing, some employees at the company might not be so thrilled about what Facebook has been up to, that their discontent is enough to be an ongoing issue, and that perhaps, if Facebook wants to fix this, it needs to look at the root causes as opposed to the leaks.

Speaker 1: The Verge has another Facebook story that is interesting. The company has been working on various AI projects that ultimately would relate to Facebook's augmented reality initiatives. The idea is to create a system that has something almost like a memory connected to it, where the system itself would be able to go back over data over time and glean important things from it. You may have seen videos of Google showing off a similar approach, with how the Assistant program, the virtual assistant from Google, can continue to answer questions about a subject. You ask one question, then after you get the answer you follow up with another question, and without you specifically referencing that you're still talking about the same topic, Assistant can draw the conclusion that the second question is about the same subject as the first. Let me give you an example. Using this version of Assistant, I could say something like, what's the weather like at Disneyland tomorrow, and I get the answer, and then I might follow that up with, what time does it open? Well, I didn't say Disneyland; I said, what time does it open?
Speaker 1: Using this kind of approach to AI, the assistant might be able to glean that when I say "it," I'm referring to the subject of my previous question, that is, Disneyland, and thus give me the correct answer. Now, this is trivial for human beings, but it's actually a big deal for artificial intelligence. AI does not natively have the ability to understand context. So Facebook is working on several similar projects, some of which are really ambitious.

Speaker 1: Now, the downside of this is that it could lead to some pretty nasty privacy and security issues. For example, imagine that someone is wearing a pair of those AR Ray-Ban glasses, except these are a more advanced version, ones that have more augmented reality features built into them; they're not just a camera built into glasses. And let's say this person wearing the glasses goes by a secure facility of some sort, maybe a bank, and they look into the bank, and the camera in the glasses captures images of a person who's accessing a secure part of that facility, someone who's actually authorized to go into an area that you would otherwise not have access to. If you have this capability and you pair it with something like facial recognition, you could compromise that person's security. It could say, all right, this person, so-and-so, has access to this secure part of the bank, and here's some more information about so-and-so that we're able to pull from the Internet, because of online social networking platforms and Twitter profiles and all that kind of stuff. You could easily see where this can quickly become a threat to privacy and security. Now, that is a pretty dramatic example, and it's obviously not going to apply to everyone. But the reality is that this technology, unless it's implemented very carefully, could become a huge threat to privacy for people in general.
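To make the context carry-over idea from the Disneyland example concrete, here is a minimal, hypothetical sketch of a dialogue agent that remembers the last-named subject so a follow-up question using "it" can be resolved. This is not Google Assistant's or Facebook's actual implementation; the ToyAssistant class, its pronoun list, and its naive capitalized-word entity detection are all invented for illustration, and real systems rely on trained entity-recognition and coreference models.

```python
# Toy illustration only (not any shipping assistant's real system): a tiny
# dialogue "memory" that carries the last-mentioned place forward so that a
# follow-up like "What time does it open?" can resolve "it" to Disneyland.
import re

PRONOUNS = {"it", "there"}  # words treated as references to the prior subject


class ToyAssistant:
    def __init__(self):
        self.last_entity = None  # most recently named subject, e.g. "Disneyland"

    def ask(self, question: str) -> str:
        entity = self._find_entity(question)
        if entity:
            self.last_entity = entity            # remember the new subject
        elif self._uses_pronoun(question) and self.last_entity:
            entity = self.last_entity            # resolve "it" from context
        else:
            return "Sorry, which place do you mean?"
        return f"(answering about {entity}) {question}"

    @staticmethod
    def _find_entity(question: str):
        # Extremely naive: any capitalized word after the first token counts
        # as a named entity. Real assistants use trained NER/coreference models.
        words = question.rstrip("?!.").split()
        for word in words[1:]:
            if word[:1].isupper():
                return word
        return None

    @staticmethod
    def _uses_pronoun(question: str) -> bool:
        tokens = set(re.findall(r"[a-z']+", question.lower()))
        return bool(tokens & PRONOUNS)


if __name__ == "__main__":
    assistant = ToyAssistant()
    print(assistant.ask("What's the weather like at Disneyland tomorrow?"))
    print(assistant.ask("What time does it open?"))  # "it" resolves to Disneyland
```

The whole trick in this sketch is keeping one piece of state between turns; the hard part in practice is deciding reliably which words refer back to what, which is why the episode frames this as a big, ambitious research problem rather than a solved one.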
Speaker 1: You know, it could even be a question like, who did I see yesterday? A question like that might be answered with, oh, you saw your buddy Jim, and you also saw these other five people you don't know, but I know them, because I have facial recognition technology and I was able to cross-reference it with the profiles that are available on Facebook. So now you know all the people you encountered, whether they wanted you to know them or not; they were just going about their daily lives. So that is a potential threat. Now, Facebook is looking into some pretty incredible AI applications, such as hand and object manipulation. In that case, you might be trying to learn a specific skill, and you could use an augmented reality system to teach you the steps you need to follow in order to build your skills. You could also potentially use it to analyze your own performance and then give you tips on how you can do that skill better and improve upon it, which is really cool. So this sort of hyper-focused AI is really compelling, but you also start to see where the need for things like privacy protection has to step in. This, by the way, is all part of an initiative that Facebook calls Ego4D. The Verge has a great piece that goes into much further detail if you want to learn more about this. It is titled "Facebook is researching AI systems that see, hear, and remember everything you do," and it's by James Vincent. I highly recommend you check it out.

Speaker 1: More than forty organizations have banded together to launch a campaign called How to Stop Facebook. So yeah, that hard week just keeps extending.
Speaker 1: The group advocates for regulations that would restrict how Facebook collects and uses data, citing the concerns we've seen pop up due to Facebook's reliance on engagement-driving algorithms, algorithms that frequently favor content that can be harmful in various ways because it drives engagement. And as the campaign strategies director of MediaJustice, Masha Hayes, has said, quote, Facebook's surveillance capitalist business model is fundamentally incompatible with basic human rights, end quote. That's a heck of a statement, and it's hard for me to find fault in it; judging by the way Facebook pursues its business model, the company appears to be at least indifferent to basic human rights. The group is calling upon leaders, political leaders in particular, to intervene and essentially force Facebook to fundamentally change how it operates. There are a lot of different groups that are part of this, many of them human rights groups and organizations along those lines, and they are very much concerned that Facebook's performance over the last few years has shown it to be an organization with the capacity to do a disproportionate amount of harm to vulnerable populations.

Speaker 1: Finally, the cherry on top for bad Facebook news comes out of Washington State. The Attorney General of Washington has filed a lawsuit against Facebook, saying that company representatives provided false testimony during a previous case about whether or not the platform violated Washington's campaign finance laws. At the heart of the matter is how Facebook sold political ads in the state of Washington. Washington state law requires that any platform that sells ad space to political campaigns has to provide information about the ads: who bought them, the buyer's address, who the ads were meant to target, and how many views the ads got. All of that information is required by state law to be publicly available upon request. Facebook has not complied with the full extent of that law.
Speaker 1: You can request information about Facebook ads, but it doesn't include all the points of data that Washington State law calls for. The company's lawyers, Facebook's lawyers, argue that Washington's law is unconstitutional. But the heart of this matter is that the Attorney General says that in that earlier case, Facebook representatives gave false statements in response to questions asked of them, and that therefore they are guilty of perjury. So this is another layer on top of everything else. This legal battle is ongoing, and we will have to check back in on it in the future.

Speaker 1: Okay, there were other news stories today, but honestly, all the Amazon and Facebook stuff kind of took the wind out of my sails, just because there was so much rough reporting on different things. So we're gonna wrap it up here. Plus, I can hear that the lawn maintenance folks are outside of my house now, so it's gonna get progressively louder, my dog is gonna join in too, and it's just gonna be a free-for-all. So, while the noise level continually goes up in this episode: if you have suggestions for topics I should cover in the future, maybe like how weed whackers work, for example, don't know why that jumped to mind, let me know. Send me a message on Twitter. The handle for the show is TechStuffHSW, and I'll garden, again, I'm sorry, I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.