Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for Thursday, January twenty seventh, twenty twenty two. The Journal of Public Health recently posted a study that shows anti-vaccination groups on Facebook were sowing the seeds of mistrust regarding COVID nineteen and COVID nineteen vaccines before there was even an organized vaccine development program from the United States government. According to the study, the groups were circulating misinformation about COVID nineteen and vaccines for COVID nineteen as early as February twenty twenty. And heck, that's before we started seeing lockdowns in various cities around the world and in the United States. I was still on vacation in February at the point where these stories were starting to circulate, and in fact, I was starting to wonder if I would ever get back home. But um, but I did. Anyway.
Speaker 1: The researchers identified the anti-vax groups that were most active on Facebook at that time, and they have some pretty remarkable names. You know, names like National Vaccine Information Center, which makes it sound like it's some sort of official agency that focuses on vaccines, but in fact, it was a disinformation campaign designed to promote conspiracy theories and anti-vax messaging and undermine confidence in vaccinations in general. Two of the other four anti-vax groups that the research report named have similar names to that. One is Vaccination Information Network and the other is Vaccine Machine. And the four groups that the researchers identified had posted around two thousand sixty times on Facebook from February twenty twenty to May, and about half of those posts mentioned COVID nineteen. Uh, presumably the other half were just kind of more generally anti-vax. The study really pulls back the curtain on some important things. One is that misinformation campaigns really got the jump on US public health officials.
Speaker 1: Those deceptive posts came out well before we started getting, you know, reliable information from organizations like the CDC, for example. Another is that this is yet more proof of how Facebook facilitates the spread of harmful information. I'm not saying that the platform creates harmful information, but it definitely facilitates the spread of it. And I know I'm beating a dead horse here, I know I've talked about this countless times, but Facebook's entire business model depends upon people spreading and engaging with content on Facebook. I mean, that engagement is a commodity that Facebook sells to advertisers, so it financially benefits Facebook when these things happen, and it was really only after massive pressure that the company indicated it would do anything about it. In fact, that's what a lot of those internal papers that Frances Haugen leaked to the authorities and press indicated: that when there were voices inside Facebook that were warning about stuff like misinformation campaigns, they were often ignored or silenced. Anyway.
Speaker 1: The study is titled Faster than Warp Speed: Early attention to COVID nineteen by anti-vaccine groups on Facebook. Warp Speed is a reference not to Star Trek, although, I mean, indirectly yes, but more directly to the United States government's, uh, project to fast track vaccine development for COVID nineteen. The big news at Activision Blizzard this month has mostly been about how Microsoft announced it was going to acquire the company by the end of fiscal year twenty twenty three, assuming regulatory bodies around the world don't throw the deal off track. But there's more going on at that company than a big acquisition. Employees at Raven Software, which is a division within Activision Blizzard, have formed a union. Specifically, the employees I'm talking about are thirty four quality assurance, or QA, testers within Raven Software. They are the ones who have unionized. Now, part of the unionization process, which I've actually seen firsthand now, involves the union potentially reaching out to the company and asking that the company formally recognize the union, like essentially saying, hey, we have a threshold of employees who want to unionize.
Speaker 1: And here's, maybe, some union cards that people have signed, and we're hoping that you will recognize this as a union. And this can go a couple of ways. The company can choose to voluntarily recognize the union, and that process isn't necessarily straightforward. There's usually some back and forth between the company and the union organizers to determine who's actually covered by the union, etcetera. Or a company can choose to not voluntarily recognize the union, and that's the way Activision Blizzard chose. So now those QA testers at Raven are going to have to file with the National Labor Relations Board, or NLRB, in order to get permission to hold a union election. So the election is a more formal process. And ultimately what it does is, if the election passes, if a majority of employees vote in favor of a union, that serves as evidence that they do in fact want to unionize.
Speaker 1: And should the election pass, then the NLRB will certify the union, and that means that companies are going to have to deal with union representatives for the purposes of collective bargaining for things like compensation and benefits and that kind of stuff. Raven employees released a statement expressing disappointment that Activision Blizzard did not voluntarily recognize their union, but expressed confidence that the outcome will still be the same, that in fact, the election will show that the employees wish to unionize. As for what this means when Microsoft takes control of Activision Blizzard, assuming that deal goes through, well, Microsoft is also not known for being super enthusiastic about unionization. That's a kind way of putting it. But anyway, I just want to say I stand with the QA testers at Raven, you know, solidarity. Speaking of unions, organizers at Amazon's JFK8 fulfillment center in Staten Island, New York have reached the number of signatures they need in order to hold a union election vote.
Speaker 1: The group had previously attempted this last year, but they did not get enough signatures to merit a vote on whether or not to form a union. Organizers need to secure signatures from thirty percent of the overall workforce expressing interest in doing so. Unsurprisingly, Amazon representatives have protested this whole move, and they question if the signatures that have been gathered are even legitimate. They argue that the previous attempt from last year shows that workers at the fulfillment center are not interested in organizing, and that besides, you know, quote, our employees have always had a choice of whether or not to join a union, end quote. Now, pardon my skepticism here, but Amazon is also the company that got a slap on the wrist in the wake of a different union vote, one that took place at a fulfillment center in Alabama. In that case, employees voted against forming a union, and Amazon was, you know, really happy about that. The organizers alleged that Amazon reps had interfered with the election process.
Speaker 1: That allegation was sustained by the National Labor Relations Board, and the NLRB authorized a new vote, and that new vote has not yet happened. But essentially, you know, Amazon has a history that seems to indicate that the company is very keen on discouraging unionization. And on a related note, Amazon made headlines last year when journalists reported that the company had launched an influence campaign to make Amazon seem like a super awesome place to work, and perhaps, maybe as part of that too, you know, kind of discourage dangerous ideas like unionization. Essentially, the story was that Amazon was paying employees to post on social media about how much they loved working at Amazon, and specifically, folks at warehouses and fulfillment centers were encouraged monetarily, that is, paid, to post positive stories about their work experience. This program began in two thousand eighteen, during a time when folks were kind of scrutinizing Amazon's working conditions, which were reported to range from not so great to degrading and inhumane. Like, some of the stories were pretty awful.
Speaker 1: I mean, if my employer set performance targets for me that I could only meet if I were to pee in a bottle rather than take a bathroom break whenever I needed to go, I would call that dehumanizing. And I work from home, not in a warehouse. And yeah, that's the kind of stuff that was being reported out of Amazon fulfillment centers. Now, the program of promoting Amazon as a great place to work consisted of more than fifty online accounts on Twitter, all using the appended title of Amazon FC Ambassador, and many of them using similar or sometimes identical content inside their posts, all of which positioned Amazon as a hunky dory kind of place to work, almost like Willy Wonka's chocolate factory. The strategy did not work. Folks called it out right away, and it now appears that Amazon completely scrapped the program by the end of last year and then got to work essentially wiping away any trace that it was ever a thing.
Speaker 1: On a related note, and a peek behind the scenes, I recently went through some FCC and FTC training here at work at iHeart, and that includes learning about how the FTC and the FCC set out rules regarding things like sponsored posts and ads. Like, content creators are obligated to indicate when something is an ad or a sponsored piece of content. We are not supposed to pass it off as if it's our genuine thoughts if we wouldn't have said it otherwise. And you've probably seen this in online places like YouTube and Instagram and that kind of thing, where you're supposed to make it explicitly clear if the content you're presenting is sponsored or is an outright advertisement. Now, I would not be surprised if eventually we see similar rules applied to companies that are not directly associated with content creation, because what Amazon did would be against the rules if it were to happen on, say, a podcast or a radio show or a television show or online videos.
Speaker 1: So, in other words, if I had done something similar to what was going on at Amazon with this promotional program, and I did not explicitly make it clear that this was a sponsored message I was sending out, I could be liable. I could be fined an enormous amount of money, or my company could be fined an enormous amount of money. And I'm guessing that we will probably see a point where companies in general will be held to that, where they are not allowed to run these kinds of campaigns without it potentially resulting in fines. I wouldn't be surprised to see that. And our last Amazon story for this episode is about the company's Sold by Amazon program. In that program, Amazon partnered with third party vendors and then would promote and sell products from those vendors under the Sold by Amazon label. So, uh, the vendors would enter into an agreement with Amazon, and that agreement would include a minimum payment rate for stuff sold on Amazon, and then if sales went above that minimum, then Amazon would start taking a cut of the revenue.
Speaker 1: But the Attorney General for the state of Washington launched an investigation into Amazon and concluded that this practice was anti-competitive and included illegal price fixing, and that the program was meant to push Amazon sales numbers up at the expense of independent third party vendors on the platform. Amazon subsequently discontinued the program and will pay a fine of more than two million dollars to the office of the Attorney General, and reportedly that fine will go to fund more antitrust enforcement efforts. We've got a lot more stories to cover, but before we get to that, let's take a quick break. Okay, we just came back from ads, and LG announced to advertising companies this week that it would offer guaranteed outcome ad services through its connected televisions. So once you start cutting through the marketing speak, it appears that this really indicates that LG is going to offer up stuff like targeted advertising capabilities through its smart TVs. So if you buy an LG television, then you should expect more targeted ads to come through. Ads on smart televisions are not a new thing.
Speaker 1: If you have a smart TV, you're probably familiar with this. A lot of smart TVs will display an ad on, say, the home screen for the television or within certain menus. But LG has been particularly aggressive with ads and now is looking to step up into that targeted ad game. And according to Digiday, LG will, quote, promise brands that their CTV (that stands for connected television) video ads running on LG smart TVs meet campaign goals across multiple KPIs (that's key performance indicators), such as video completion rates (buyers only pay if videos are played in their entirety), demographic targets, reach and frequency goals, etcetera. Conversion metrics for mobile are also offered, but elements including tune-in, web visits, physical location visits, etcetera, won't be available for another few months, end quote. And that starts to sound a bit scary to me. I mean, from that, I infer that eventually LG connected televisions are going to share information with your smartphone, including any location tracking that your phone uses, and share that information with advertisers.
Speaker 1: So let's say that you've got a company that advertises mattresses, and that company then sees that the targeted customers are actually visiting brick and mortar stores that the company operates, because the LG smart TV is sharing that location data between apps on the smartphone and the television. The whole advertising campaign starts to get a little invasive. Also, if we're talking targeted advertising that takes into account stuff like your browsing habits, you know, that's connected to an account living on, say, your computer or your smartphone, a lot of other red flags pop up for me. It could be a really big privacy issue. Just imagine for a second that you're living in an apartment with a whole bunch of roommates, and you might happen to own an LG smart TV and you offer to have that be the communal television in the living room space. So your living room TV is your smart LG TV.
Speaker 1: If that television is connected to some sort of account on your smartphone or laptop, it's possible that the ads that end up being displayed are revealing information that you really didn't want to share with your roommates. Maybe it's like medical information or something. And that's just one hypothetical situation I can imagine where this is not a good thing, but we'll have to see how it plays out. Today is Holocaust Remembrance Day. It's the anniversary of the liberation of Auschwitz, and today is also when the TikTok platform will begin promoting a website called about Holocaust dot org. The World Jewish Congress and UNESCO created that site to educate people about the Holocaust and to fight back against misinformation and denial campaigns, which unfortunately proliferate across the internet. Advocacy groups have long complained that TikTok was not taking enough action to curtail anti-Semitic messaging on the platform, and Holocaust denial in videos as well. In addition, several Jewish TikTok creators have reported being the target of abuse. Sometimes they've had their content mistakenly flagged or removed with no justification.
Speaker 1: And according to CNET, there have been some trends on TikTok that I'm not even going to describe here, because I find them deeply upsetting and very sad. But I will say they are trends that minimize or even celebrate the Holocaust, which, well, it turns my stomach. Now, I think TikTok is taking the right actions to try and push back against these trends on its platform and to make sure that people visiting TikTok have the opportunity to learn from reputable sources about the Holocaust and not allow misinformation to run unchecked. Of course, it remains to be seen whether or not these efforts are effective. Do you remember way back in November two thousand nineteen, when Tesla unveiled its design for the Cybertruck? The reaction to that unveiling was, I'll be kind, I'll say it was mixed. Maybe part of that is because during a demonstration to show how unbreakable the truck's windows were, the windows broke. Whoopsie. It also has a pretty funky design too, like, some people just absolutely hated it. A few people thought it was weird but kind of interesting.
Speaker 1: I think it's interesting, but it doesn't look practical to me. But what do I know? I don't drive. Anyway, um, we got reports late last year that an updated design of the Cybertruck was spotted driving around test tracks in Fremont. Well, the Cybertruck was supposed to come out by the end of this year. Actually, I think they were supposed to start shipping late last year. That didn't happen, and now it looks like it's not gonna happen this year either, because in an earnings call yesterday, Elon Musk revealed that the production of the truck has been delayed until twenty twenty three. He also said that the company would not be introducing any new models this year, and he also dismissed the idea that Tesla would pursue producing a low cost vehicle in the near future. He said there were no plans to design a twenty five thousand dollar car, and he essentially said that the company doesn't really need to make a low cost car, at least not yet, because they'll end up selling every car they're able to produce.
Speaker 1: Now, you could interpret that as being boastful, right? Like, oh, we don't need to have a budget car in our lineup. It doesn't matter what we charge, we're gonna sell every single car off the line. Which is kind of how Musk said it, I guess, so it wouldn't be unfair to frame it that way. However, I don't really think of it as boasting. I think of it more as an indication that Tesla doesn't produce nearly as many vehicles as some of its competitors do. For example, in twenty twenty one, Tesla said that it delivered nine hundred thirty six thousand vehicles, so below one million. Toyota, meanwhile, produced seven point six million vehicles by the end of their last fiscal year, which was at the end of last March. So if you're looking at that level of scale, to me, it makes sense that Tesla is not focusing on low cost vehicles yet, because it has not reached a scale of production where it would merit the move to making low cost vehicles. In fact, it might not even be financially feasible to make low cost vehicles right now because of those issues of the scale of production. Now, as Tesla grows, that can very much change. But, well, that was my interpretation of what Elon Musk was getting at. I could be totally off base. Now, here's a question: if a self driving vehicle gets into an accident, who do you find at fault? Is it the car's owner? Is it the person who was in the vehicle? Is it the car maker? Well, the Law Commission of England and Wales, as well as the Scottish Law Commission, released a joint report that suggests people who are, you know, in an autonomous vehicle should not be held responsible for road safety issues. Instead, the car maker should be held accountable for any road safety issues that occur because of that car. And this has been one of those areas of debate as engineers get closer to producing what we might actually call a truly autonomous vehicle.
But the commissions also point out quite 339 00:21:37,920 --> 00:21:42,120 Speaker 1: correctly that there is massive confusion among the general public 340 00:21:42,480 --> 00:21:45,840 Speaker 1: as to what is and what is not a truly 341 00:21:45,920 --> 00:21:50,000 Speaker 1: self driving car. And we were just talking about Tesla, 342 00:21:50,240 --> 00:21:52,320 Speaker 1: so I'm going to use them as an example of 343 00:21:52,359 --> 00:21:55,520 Speaker 1: how the public can get confused. Tesla refers to its 344 00:21:56,119 --> 00:22:00,280 Speaker 1: you know, basic driver assist suite of features as Autopilot, 345 00:22:00,920 --> 00:22:04,600 Speaker 1: and I would argue that sets an expectation that isn't 346 00:22:04,800 --> 00:22:08,800 Speaker 1: very realistic because the word autopilot seems to suggest, at 347 00:22:08,840 --> 00:22:12,160 Speaker 1: least to me, that the driver doesn't have to worry 348 00:22:12,200 --> 00:22:16,080 Speaker 1: about anything because the car is on autopilot. But arguably 349 00:22:16,119 --> 00:22:19,879 Speaker 1: worse than that, the company has the full self 350 00:22:20,000 --> 00:22:23,320 Speaker 1: driving mode, which is in beta. It's not a full 351 00:22:23,320 --> 00:22:27,359 Speaker 1: release yet, but I would say that's not really a 352 00:22:27,359 --> 00:22:30,840 Speaker 1: full self driving mode. Even though it's called full self driving, 353 00:22:31,080 --> 00:22:33,240 Speaker 1: I would say it's not full self driving. You still 354 00:22:33,280 --> 00:22:37,159 Speaker 1: have to have an owner prepared to intervene, and 355 00:22:37,400 --> 00:22:39,240 Speaker 1: you're not supposed to take your hands off the wheel. 356 00:22:39,480 --> 00:22:42,600 Speaker 1: You're supposed to maintain your attention. That to me does 357 00:22:42,640 --> 00:22:47,639 Speaker 1: not mean full self driving.
So it's no wonder that 358 00:22:47,680 --> 00:22:50,320 Speaker 1: the average person might have an unrealistic expectation as to 359 00:22:50,320 --> 00:22:54,680 Speaker 1: what a self driving car can and cannot do. Anyway, 360 00:22:54,880 --> 00:22:59,800 Speaker 1: should the UK adopt the recommendations these commissions have suggested 361 00:23:00,600 --> 00:23:04,920 Speaker 1: and hold car makers accountable for any accidents that happen 362 00:23:04,920 --> 00:23:08,320 Speaker 1: with their vehicles, we could see a precedent set 363 00:23:08,359 --> 00:23:11,040 Speaker 1: in which governments around the world agree that car makers 364 00:23:11,320 --> 00:23:17,040 Speaker 1: are the responsible ones if self driving vehicles get into accidents. Now, personally, 365 00:23:17,040 --> 00:23:20,320 Speaker 1: I think that's the most reasonable approach. Um. You know, 366 00:23:20,400 --> 00:23:22,320 Speaker 1: you can even get more granular than that, if you 367 00:23:22,320 --> 00:23:27,879 Speaker 1: want to argue that the department in charge of whatever 368 00:23:28,520 --> 00:23:32,600 Speaker 1: part of the self driving system was at fault 369 00:23:32,840 --> 00:23:35,120 Speaker 1: really takes the blame. But I think that the car 370 00:23:35,200 --> 00:23:36,919 Speaker 1: maker is the one that makes the most sense. And 371 00:23:36,920 --> 00:23:40,200 Speaker 1: I think when we start talking about stuff like insurance, 372 00:23:41,040 --> 00:23:43,119 Speaker 1: that's going to have to be a factor as well.
373 00:23:43,560 --> 00:23:45,879 Speaker 1: Assuming we do get to a world where you know, 374 00:23:46,000 --> 00:23:48,800 Speaker 1: you might own a self driving car, keeping in mind 375 00:23:48,840 --> 00:23:51,920 Speaker 1: that most models I see suggest that self driving cars 376 00:23:51,920 --> 00:23:54,800 Speaker 1: will actually be owned by companies that are essentially ride 377 00:23:54,880 --> 00:24:01,320 Speaker 1: hailing companies, not necessarily sold to individual customers. Um, but yeah, 378 00:24:01,359 --> 00:24:03,360 Speaker 1: this could really set a legal precedent that we could 379 00:24:03,359 --> 00:24:07,320 Speaker 1: see spread around the world. And unless we're talking about 380 00:24:07,320 --> 00:24:11,399 Speaker 1: a case where a human is interfering with the autonomous 381 00:24:11,400 --> 00:24:14,040 Speaker 1: operations of a car, I think that just makes 382 00:24:14,040 --> 00:24:16,439 Speaker 1: sense. If you are talking about something where somebody, I 383 00:24:16,440 --> 00:24:21,040 Speaker 1: don't know, wrests control of the car from the system, 384 00:24:21,080 --> 00:24:23,399 Speaker 1: then obviously you would be in a different case, and 385 00:24:23,520 --> 00:24:25,879 Speaker 1: I don't think the car maker should be held accountable 386 00:24:25,920 --> 00:24:29,119 Speaker 1: in those instances. That's why I think a lot of 387 00:24:29,119 --> 00:24:31,199 Speaker 1: car companies that have been kind of playing with the 388 00:24:31,240 --> 00:24:34,520 Speaker 1: idea of autonomous cars have also floated the concept of 389 00:24:35,400 --> 00:24:38,280 Speaker 1: vehicles that don't have controls in them, right, because then 390 00:24:38,320 --> 00:24:41,600 Speaker 1: you take away at least human accessible controls.
Then you 391 00:24:41,640 --> 00:24:43,880 Speaker 1: take away the ability for a human to, like, turn 392 00:24:43,960 --> 00:24:46,640 Speaker 1: the wheel suddenly when it should be under the control 393 00:24:46,800 --> 00:24:49,560 Speaker 1: of the car itself. Okay, we've got a few more 394 00:24:49,640 --> 00:24:52,080 Speaker 1: stories to cover, but before we get to those, let's 395 00:24:52,119 --> 00:25:04,800 Speaker 1: take another quick break. Block Incorporated CEO Jack Dorsey is 396 00:25:05,160 --> 00:25:09,880 Speaker 1: probably sweating a little bit right now. Block Inc. used 397 00:25:09,880 --> 00:25:13,360 Speaker 1: to be known as Square, which is its famous product 398 00:25:13,400 --> 00:25:17,440 Speaker 1: and service. You probably have encountered a Square dongle at 399 00:25:17,480 --> 00:25:20,840 Speaker 1: some point. It's typically a device that you 400 00:25:20,920 --> 00:25:24,280 Speaker 1: plug into an iOS device like an iPhone 401 00:25:24,359 --> 00:25:29,119 Speaker 1: or an iPad, and it allows you to process credit 402 00:25:29,160 --> 00:25:34,280 Speaker 1: card payments, and the tablet or phone acts as the 403 00:25:34,280 --> 00:25:39,000 Speaker 1: communications node that then works with the back end of 404 00:25:39,040 --> 00:25:41,960 Speaker 1: the credit card companies, and you can use the iOS 405 00:25:42,000 --> 00:25:44,520 Speaker 1: device as a point of sale, and it's very useful 406 00:25:44,560 --> 00:25:48,119 Speaker 1: for small business owners. Well, Apple has recently announced that 407 00:25:48,200 --> 00:25:50,480 Speaker 1: it is working on a service that will allow those 408 00:25:50,520 --> 00:25:54,080 Speaker 1: small business owners to accept payments directly through their iOS 409 00:25:54,160 --> 00:25:58,160 Speaker 1: devices without the need for an additional piece of hardware 410 00:25:58,480 --> 00:26:02,440 Speaker 1: like a Square dongle.
And so it's just saying, well, 411 00:26:02,440 --> 00:26:06,320 Speaker 1: we're going to offer that as a native capability in 412 00:26:06,359 --> 00:26:09,600 Speaker 1: our devices, so if you want, you can use our 413 00:26:09,640 --> 00:26:12,600 Speaker 1: payment system and you don't even have to have like 414 00:26:12,800 --> 00:26:16,560 Speaker 1: a Square account or anything like that. And uh, there 415 00:26:16,560 --> 00:26:19,400 Speaker 1: aren't a whole lot of details here, but Bloomberg suggests 416 00:26:19,480 --> 00:26:22,560 Speaker 1: that Apple is going to use NFC technology as the foundation. 417 00:26:22,720 --> 00:26:26,600 Speaker 1: NFC stands for near field communication, which allows for the 418 00:26:26,680 --> 00:26:30,960 Speaker 1: wireless transfer of very small packets of information across 419 00:26:31,640 --> 00:26:34,800 Speaker 1: short physical distances. Like you know, you hold two phones 420 00:26:34,920 --> 00:26:38,240 Speaker 1: up next to each other and they exchange contact information, 421 00:26:38,320 --> 00:26:41,600 Speaker 1: that kind of thing typically uses NFC, and 422 00:26:41,720 --> 00:26:44,840 Speaker 1: modern credit cards typically have a chip that uses NFC 423 00:26:44,920 --> 00:26:47,760 Speaker 1: for contactless payments. So for those cards where you can 424 00:26:47,960 --> 00:26:50,960 Speaker 1: tap the card against a point of sale, those kinds 425 00:26:51,000 --> 00:26:53,960 Speaker 1: of things, it'll work with that, at least according to Bloomberg. 426 00:26:54,480 --> 00:26:56,639 Speaker 1: And it will be interesting to see how Block Inc. 427 00:26:56,720 --> 00:26:59,320 Speaker 1: responds to this.
I imagine there's going to be a 428 00:26:59,320 --> 00:27:01,800 Speaker 1: battle of fees to try and stay competitive, 429 00:27:02,600 --> 00:27:05,800 Speaker 1: because usually on the back end of this, if you're 430 00:27:05,800 --> 00:27:08,639 Speaker 1: a small business owner, you're having to pay a certain 431 00:27:08,640 --> 00:27:12,240 Speaker 1: amount of money or surrender a certain amount per sale 432 00:27:12,760 --> 00:27:17,879 Speaker 1: to fund these services. Right? So, if Block Inc. is 433 00:27:17,960 --> 00:27:22,200 Speaker 1: able to offer a more competitive fee versus Apple, then 434 00:27:22,240 --> 00:27:24,919 Speaker 1: it might be able to go toe to toe, but 435 00:27:25,000 --> 00:27:27,639 Speaker 1: it's too early to say. Here in the United States, 436 00:27:27,640 --> 00:27:30,560 Speaker 1: the Internal Revenue Service, or IRS, which is 437 00:27:30,560 --> 00:27:34,320 Speaker 1: already a super popular agency in this country, is poised 438 00:27:34,359 --> 00:27:37,200 Speaker 1: to require taxpayers who want to use certain 439 00:27:37,400 --> 00:27:41,320 Speaker 1: IRS online services to first use a third party company 440 00:27:41,359 --> 00:27:46,240 Speaker 1: called ID.me to verify their identity. And 441 00:27:46,720 --> 00:27:48,439 Speaker 1: you know, it makes sense that the IRS 442 00:27:48,480 --> 00:27:51,560 Speaker 1: wants a way to authenticate that a person is who 443 00:27:51,840 --> 00:27:55,040 Speaker 1: they claim to be, particularly when you're talking about stuff 444 00:27:55,080 --> 00:27:59,080 Speaker 1: that relates to taxes or accessing sensitive documents that relate 445 00:27:59,119 --> 00:28:01,560 Speaker 1: to a person's income. Um, you know, it makes sense 446 00:28:01,600 --> 00:28:03,880 Speaker 1: you want to make sure you protect all that. Well,
447 00:28:03,920 --> 00:28:06,080 Speaker 1: ID.me is going to require users to 448 00:28:06,119 --> 00:28:09,400 Speaker 1: submit some documents to prove they are who they say 449 00:28:09,400 --> 00:28:12,840 Speaker 1: they are, as well as to submit a video selfie 450 00:28:12,840 --> 00:28:16,040 Speaker 1: as part of that verification process. So we're talking about 451 00:28:16,080 --> 00:28:20,240 Speaker 1: facial recognition technology at play here, and for several reasons, 452 00:28:20,640 --> 00:28:26,320 Speaker 1: many security and privacy advocates have criticized this announcement. For 453 00:28:26,400 --> 00:28:31,639 Speaker 1: one thing, it brings a private company into the citizen's 454 00:28:31,680 --> 00:28:34,920 Speaker 1: interaction with the IRS, and there are worries 455 00:28:35,280 --> 00:28:39,400 Speaker 1: that security was compromised as soon as the 456 00:28:39,480 --> 00:28:42,280 Speaker 1: IRS chose to offload verification to a private, third 457 00:28:42,320 --> 00:28:46,360 Speaker 1: party company. Then there are many of the same general 458 00:28:46,400 --> 00:28:50,080 Speaker 1: issues we've seen with facial recognition software that we should consider. 459 00:28:50,200 --> 00:28:54,360 Speaker 1: For example, we've seen time and again through various different 460 00:28:54,400 --> 00:28:58,400 Speaker 1: facial recognition systems that many of them have a bias 461 00:28:58,520 --> 00:29:02,160 Speaker 1: to them, and that this bias often makes them unreliable 462 00:29:02,680 --> 00:29:05,840 Speaker 1: when used with people of color, or with women, or 463 00:29:05,840 --> 00:29:09,840 Speaker 1: with people who are gender nonconforming.
The critics also 464 00:29:09,880 --> 00:29:14,520 Speaker 1: point out that this can create a deeper digital divide, 465 00:29:15,400 --> 00:29:18,400 Speaker 1: because it means that users who want to access those 466 00:29:18,440 --> 00:29:22,160 Speaker 1: online services will have to have a web camera or 467 00:29:22,200 --> 00:29:25,760 Speaker 1: a smartphone with a camera on it in order to 468 00:29:25,800 --> 00:29:28,440 Speaker 1: be able to do the video selfie thing and to 469 00:29:28,720 --> 00:29:32,440 Speaker 1: use facial recognition to prove they are who they say 470 00:29:32,440 --> 00:29:35,400 Speaker 1: they are, and not everyone can afford that. Not everyone 471 00:29:35,520 --> 00:29:38,840 Speaker 1: has that at their disposal, but everyone has to deal 472 00:29:38,880 --> 00:29:41,600 Speaker 1: with the IRS. Now, all that being said, 473 00:29:41,960 --> 00:29:44,280 Speaker 1: ID.me has claimed that its technology has 474 00:29:44,280 --> 00:29:48,560 Speaker 1: shown no bias or inherent unreliability based on skin color. 475 00:29:48,680 --> 00:29:52,520 Speaker 1: I don't know that there's any independent research into that, 476 00:29:52,600 --> 00:29:56,120 Speaker 1: but ID.me says it's not the case. 477 00:29:56,160 --> 00:29:59,120 Speaker 1: And they also point out that the online services where 478 00:29:59,160 --> 00:30:02,160 Speaker 1: this will apply are limited to just a couple of features. 479 00:30:03,120 --> 00:30:05,959 Speaker 1: But um, you know, one of those features is checking 480 00:30:06,000 --> 00:30:09,040 Speaker 1: your account online. That sounds like a really basic action 481 00:30:09,080 --> 00:30:13,160 Speaker 1: to me. Maybe I'm missing something here. Now, the IRS 482 00:30:13,280 --> 00:30:16,640 Speaker 1: has said you can still file and pay taxes without 483 00:30:16,760 --> 00:30:20,280 Speaker 1: going through the online route at all.
And if you're 484 00:30:20,320 --> 00:30:21,840 Speaker 1: doing that, like if you're doing it in the old 485 00:30:21,840 --> 00:30:25,200 Speaker 1: fashioned paper and pencil way or paper and pen way, 486 00:30:25,240 --> 00:30:27,960 Speaker 1: I guess you don't even have to worry about any 487 00:30:28,000 --> 00:30:29,800 Speaker 1: of this, right? ID.me is not at 488 00:30:29,840 --> 00:30:32,040 Speaker 1: all involved with any of that side of things, and 489 00:30:32,040 --> 00:30:37,480 Speaker 1: it's not a gatekeeper. Still, privacy advocates are concerned about 490 00:30:37,480 --> 00:30:40,560 Speaker 1: where this is headed. And finally, let's talk about secret 491 00:30:40,600 --> 00:30:44,280 Speaker 1: government agencies and how to uncover them. A researcher in 492 00:30:44,320 --> 00:30:47,479 Speaker 1: Germany named Lilith Wittmann says that she used an Apple 493 00:30:47,560 --> 00:30:52,040 Speaker 1: air tag to prove that a German government agency called 494 00:30:52,040 --> 00:30:57,160 Speaker 1: the Federal Telecommunications Service is actually a cover organization for 495 00:30:57,360 --> 00:31:02,160 Speaker 1: a secret branch of the German interior intelligence agency. 496 00:31:02,480 --> 00:31:05,800 Speaker 1: Wittmann was working on a computer program to evaluate this 497 00:31:05,920 --> 00:31:11,400 Speaker 1: telecommunication service's work, and presumably that work is to help 498 00:31:11,520 --> 00:31:16,200 Speaker 1: telecommunications companies in Germany, but that department turns out to 499 00:31:16,320 --> 00:31:20,960 Speaker 1: have no official budget, and that kind of sounds sus, right? Well, 500 00:31:20,960 --> 00:31:23,560 Speaker 1: Wittmann looked into it further, trying to get to the 501 00:31:23,560 --> 00:31:27,080 Speaker 1: real story of what the supposed department really was all about.
502 00:31:27,800 --> 00:31:31,320 Speaker 1: She did online research, she made phone calls, she drove 503 00:31:31,800 --> 00:31:34,520 Speaker 1: to offices, or at least to what was claimed to 504 00:31:34,560 --> 00:31:37,479 Speaker 1: be an office, to check it out in person, and 505 00:31:37,520 --> 00:31:40,240 Speaker 1: took lots of other actions, some of which just led 506 00:31:40,280 --> 00:31:44,600 Speaker 1: to dead ends. And ultimately she deduced that the Federal 507 00:31:44,640 --> 00:31:49,240 Speaker 1: Telecommunications Service is really part of Germany's Ministry of the Interior, 508 00:31:49,560 --> 00:31:53,800 Speaker 1: specifically the Federal Office for the Protection of the Constitution, 509 00:31:54,800 --> 00:31:58,400 Speaker 1: which is kind of similar to the FBI here in 510 00:31:58,400 --> 00:32:01,800 Speaker 1: the United States, with some pretty major differences. But you know, 511 00:32:02,000 --> 00:32:07,040 Speaker 1: that's kind of the realm this agency works in, and 512 00:32:07,080 --> 00:32:10,440 Speaker 1: Wittmann was told repeatedly that she was on the wrong track, 513 00:32:10,520 --> 00:32:13,000 Speaker 1: which obviously she would be, right? If she was on 514 00:32:13,040 --> 00:32:15,800 Speaker 1: the right track, she would still get that message. And 515 00:32:15,840 --> 00:32:17,440 Speaker 1: if she was on the wrong track, then she would 516 00:32:17,480 --> 00:32:20,200 Speaker 1: also get that message. So no matter what, she would 517 00:32:20,280 --> 00:32:22,760 Speaker 1: be told, you're not on the right track. So she 518 00:32:22,840 --> 00:32:26,560 Speaker 1: decided to try an experiment. She mailed a package containing 519 00:32:26,560 --> 00:32:32,400 Speaker 1: an Apple air tag to the supposed telecommunication agency's postal address.
Now, 520 00:32:32,400 --> 00:32:35,280 Speaker 1: air tags let you track something, right? Like, you can 521 00:32:35,360 --> 00:32:37,960 Speaker 1: connect one to pretty much anything. You can attach it 522 00:32:38,000 --> 00:32:41,080 Speaker 1: to a bag. And let's say that you're going through 523 00:32:41,120 --> 00:32:43,080 Speaker 1: an airport and you're gonna check your luggage. You might 524 00:32:43,120 --> 00:32:45,120 Speaker 1: have an air tag in your luggage so that you 525 00:32:45,160 --> 00:32:47,600 Speaker 1: can make sure you know where your luggage is when 526 00:32:47,600 --> 00:32:49,760 Speaker 1: you get to your destination. Maybe it doesn't pop up 527 00:32:49,760 --> 00:32:52,640 Speaker 1: at baggage retrieval and you want to find out 528 00:32:52,680 --> 00:32:55,040 Speaker 1: where it is. An air tag can help you track it down. 529 00:32:55,160 --> 00:32:57,000 Speaker 1: So she puts one of these inside a package, and 530 00:32:57,120 --> 00:32:59,440 Speaker 1: she tracks it as it goes through the postal system, 531 00:32:59,720 --> 00:33:03,840 Speaker 1: and she says that the tag went not to the 532 00:33:03,920 --> 00:33:09,640 Speaker 1: supposed address of the Federal Telecommunication Service, but rather ended 533 00:33:09,720 --> 00:33:12,440 Speaker 1: up being delivered to the Office for the Protection of 534 00:33:12,480 --> 00:33:16,840 Speaker 1: the Constitution in Cologne, Germany, and that seems to support 535 00:33:16,880 --> 00:33:20,760 Speaker 1: her assertion that the Federal Telecommunication Service is just a 536 00:33:20,880 --> 00:33:25,280 Speaker 1: front for the intelligence gathering agency. Now you might wonder, 537 00:33:25,320 --> 00:33:28,400 Speaker 1: why would she do this? Like, why would she 538 00:33:29,160 --> 00:33:33,520 Speaker 1: uncover the secrets? Why would she share that information?
Well, 539 00:33:33,640 --> 00:33:37,320 Speaker 1: you can bet that if a security researcher has figured 540 00:33:37,520 --> 00:33:41,760 Speaker 1: this out, the quote unquote bad guys also know about it. 541 00:33:42,120 --> 00:33:45,400 Speaker 1: And if the bad guys know your cover story, then 542 00:33:45,440 --> 00:33:49,160 Speaker 1: you don't have a cover story, right? It's like 543 00:33:49,200 --> 00:33:53,520 Speaker 1: you're wearing Groucho Marx glasses and calling it 544 00:33:53,560 --> 00:33:56,440 Speaker 1: a disguise. So this is one of those functions that 545 00:33:56,520 --> 00:34:00,520 Speaker 1: hackers fill that can easily be misinterpreted. You might say, well, 546 00:34:00,560 --> 00:34:03,320 Speaker 1: why is this hacker pointing out this big gap in security? 547 00:34:03,440 --> 00:34:06,560 Speaker 1: Isn't that dangerous? If the hacker says, hey, here's how 548 00:34:06,560 --> 00:34:09,600 Speaker 1: I hacked into the system, isn't that dangerous? Well, for 549 00:34:09,680 --> 00:34:12,520 Speaker 1: one thing, when a hacker does this, it sends the 550 00:34:12,600 --> 00:34:16,719 Speaker 1: message that there is a gap in security, which really means 551 00:34:16,760 --> 00:34:20,680 Speaker 1: there's no security at all, because you can sure as 552 00:34:20,760 --> 00:34:23,359 Speaker 1: heck bet the bad guys are not going to let 553 00:34:23,400 --> 00:34:26,239 Speaker 1: on that the system has flaws in it until it's 554 00:34:26,280 --> 00:34:28,719 Speaker 1: too late to do anything about it. And for another thing, 555 00:34:28,760 --> 00:34:32,239 Speaker 1: it really is a message to whatever the agency or 556 00:34:32,360 --> 00:34:35,240 Speaker 1: organization or company or whatever it is.
It's a message 557 00:34:35,280 --> 00:34:38,680 Speaker 1: to them to say, hey, you're doing a bad job 558 00:34:38,719 --> 00:34:41,799 Speaker 1: with your security, and if you want to do what 559 00:34:41,920 --> 00:34:46,160 Speaker 1: you're doing, you've gotta do it better. So you could 560 00:34:46,160 --> 00:34:48,520 Speaker 1: say that this is a way of doing quality control 561 00:34:48,600 --> 00:34:51,880 Speaker 1: when it comes to gathering intelligence. Anyway, I thought it 562 00:34:51,920 --> 00:34:54,759 Speaker 1: was an interesting story. I wanted to conclude with that one. 563 00:34:55,120 --> 00:34:58,359 Speaker 1: Hope you are all doing well. If you have suggestions 564 00:34:58,400 --> 00:35:01,799 Speaker 1: for topics that I should cover on tech stuff, whether it's 565 00:35:01,800 --> 00:35:06,359 Speaker 1: a company, a technology trend in general, a specific kind 566 00:35:06,400 --> 00:35:08,719 Speaker 1: of technology and how it works, maybe it's one of 567 00:35:08,760 --> 00:35:10,520 Speaker 1: the tech stuff tidbits you would like me to cover, 568 00:35:11,000 --> 00:35:12,680 Speaker 1: let me know. The best way to do that is 569 00:35:12,719 --> 00:35:14,839 Speaker 1: to reach out on Twitter. The handle for the show 570 00:35:14,960 --> 00:35:18,279 Speaker 1: is tech stuff h s w, and I'll talk to you 571 00:35:18,320 --> 00:35:27,000 Speaker 1: again really soon. Tech Stuff is an I Heart 572 00:35:27,080 --> 00:35:30,839 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 573 00:35:30,880 --> 00:35:33,920 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 574 00:35:34,000 --> 00:35:35,360 Speaker 1: listen to your favorite shows.