Speaker 1: Welcome to the Tudor Dixon Podcast. Today I have a question for you, and the question is: do you use Uber? I use Uber. My mom complains about it all the time because she says to me, "This is not safe, you can't use Uber," and I legitimately have completely ignored her every time, because I'm like, what are you talking about? It's a huge company. It's in every big city. I was just in Europe; I used Uber there. Literally, it's international. It must be safe, because a big company like that, I assume, is safe, which, I get, is naive. But when you think about everything being connected, and your phones are connected and you can file a complaint through your phone, it's like there are cameras everywhere. They must not only be doing these incredible background checks, but they have to also be watching their Uber drivers. And then suddenly I hear that they get a report of sexual assault or sexual misconduct reported to Uber like every eight minutes, and they're not really reporting that back. So I was like, wait a minute, what? I want a little clarity about this, because I am a user of Uber.
I don't want to stop using Uber. It's convenient. I mean, it's hard to find a taxi. I felt like, actually, an Uber was safer than a taxi. But now I have to ask someone who's done a lot of research on this. So I have Will Hild here. He's the executive director of Consumers' Research, the nation's oldest consumer protection organization. Will, what is happening? Why is this happening with Uber?

Speaker 2: Well, that's a great question, and obviously each company is a very specific circumstance, but I can tell you in general, this is something that Consumers' Research has sort of become famous for pointing out, and I think that's why you brought me on: often a company's move into wokeness is to cover up either misdeeds in the market or mistreatment of their customers, and I think with Uber, obviously, you have both here. What you're referring to specifically is that there was recently a judgment found against Uber for eight and a half million dollars because someone was sexually assaulted.
Unfortunately, a woman was sexually assaulted during one of the rides, and it was proven that Uber basically knew that this driver was a high risk for this. They have an algorithm that tracks and is able to somewhat predict which drivers; I don't know if that's based on complaints or just the drivers' behaviors. I mean, sometimes, as we know from AI now, there are all kinds of statistics and behaviors that wouldn't seem relevant to predicting other behaviors, but they are. And Uber had a system in place that was supposed to flag that. But instead of notifying the, you know, person, or removing them off of the platform, which is what you'd assume you'd want to do if someone was flagged, right, they allowed this driver and a whole host of others to continue to drive on that platform. And so because of this, a jury found, again, an eight-and-a-half-million-dollar judgment against Uber. But what's more alarming is that this claim is only one of thousands against the company over the last few years, and so these are all moving through the courts.
Obviously, this sets a very high precedent. If you multiply it out, if each person who's suing Uber right now over an alleged sexual assault were to get eight and a half million dollars, it would be about twenty-four billion dollars of judgments that the company would have to pay, about a quarter or a fifth of...

Speaker 3: Their market cap.

Speaker 2: So this is clearly a serious, serious problem that Uber has been dealing with. But as somebody who's watched the company now for a number of years, it's been pretty stunning to watch the ham-fisted and poor way in which they have handled this particular issue. And this goes back to why we launched a campaign against them many years ago.

Speaker 1: So there's a ballot initiative in California, and I have to say, generally, when I say those words, it's something insane. A California ballot initiative is generally something that I'm not going to agree with.
But there's this site, it's called Every Eight Minutes, and you go there and they put up these statistics and say almost every eight minutes someone is sexually assaulted or attacked in some way sexually in an Uber vehicle. And they have all of these facts that they put up. I mean, they're saying they're facts; I haven't done the research. I know you've done more research than I have. They're saying that these drivers oftentimes have multiple complaints against them, and yet they allow them to continue driving. So do we have... It reminds me, on a much more dangerous scale, of how you go to a hotel and you think you have control over the temperature in the room, but it's just kind of fake and you don't actually have control. It's like, I think I have control over saying whether my Uber driver was good or not by going into my phone and putting in a review, but it seems like it's meaningless.

Speaker 3: Yeah, it's pretty terrifying.

Speaker 2: And there are a lot of different theories as to what's driving Uber's lack of action.
So the first and foremost is obviously to look for a profit incentive, and the issue here may be that they are always hard up to get more people onto the platform. You know, there was a period of time in which they were subsidizing basically every ride from the driver's side, meaning they were paying the driver more than Uber was even taking in, because they were trying to soak up market share. They were in a market share battle with Lyft and the other main companies that compete in that market. And so, you know, this may have been a situation where they just wanted as many drivers as they could have, and so they were loath to push anyone off the platform regardless of how many objections. The other issue, and this is, you know, you mentioned California ballot initiatives, which kind of brings in California politics, or woke politics as I like to call it: there's another issue that these reports and these flags may not have been evenly distributed amongst different populations.
And there's been an issue with that, especially after Obama's second term, that a lot of companies were sued if there was any kind of differential, what they call disparate impact, of any kind of policy. So, in other words, speaking in concretes instead of abstractions here: if you had a policy that pushed people off of a platform if they got too many complaints of sexual assault or something like that, and it disproportionately pushed off people of a certain racial or sex-based demographic (obviously in this case it would almost exclusively be men)...

Speaker 3: But let's just use race for a second.

Speaker 2: If it pushed off a disproportionate number of people of a certain race, it is conceivable you could get sued as a corporation for that policy, that it was racist or that you were doing something wrong. And so a lot of companies we saw during the height of the DEI era, which is something that we did a lot of work on, were doing this kind of stuff. Sometimes it was internally driven: they had a DEI director that was woke and crazy and pushed all kinds of these sorts of policies.
But sometimes it was because they were afraid of either the Obama or then the Biden DOJ, and that this was being pushed top-down from your government, from government policy. So that might also be the reason why they were loath to push people off: they were worried about either the bad press from that or even legal consequences.

Speaker 1: Is this California initiative strictly for Uber, or is this rideshare companies in general? If you get assaulted... because if I'm reading this correctly, if you are assaulted in one of their vehicles, they are responsible. Rather than you trying to track down the driver, you can sue the company.

Speaker 2: Yeah, that would theoretically apply to all of the rideshare services, which is, you know, probably good, because, in general, one likes to have a light touch with government. You want small government. But once it's been proven that whatever entity can't handle it, it does seem to suggest that there needs to be an incentive or disincentive applied directly to these rideshare companies to clean up the problem.
And obviously there is no amount of sexual assault that's acceptable in our society. Obviously, you know, all crimes are unfortunate and probably inevitably happen at some level. But there's no reason why, if someone has had these accusations, they shouldn't be fully investigated. So putting that liability on these companies, which have, you know, by their own admission, a complicated algorithm that's supposed to predict these types of things, does seem like the right answer.

Speaker 1: But that's the crazy thing to me. So on this site, and I'm going to say that, you know, this is not coming from me, this is the Every Eight Minutes site, they actually say that Uber strictly prohibited employees from reporting driver assaults of the riders to the police. So if you were, like, the monitor... I mean, I'm assuming AI flags this stuff and then it goes to an actual person who finds out, oh, this person says they were assaulted.
The company was supposedly, according to the site, saying you can't contact the police, even if the driver... and in one case apparently a driver did admit to committing the rape, and they didn't go to the police.

Speaker 2: That is truly astonishing and such a damning allegation. As you noted, that explains the eight-and-a-half-million-dollar judgment, when you have behavior this egregious. And if that turns out to be, you know, widely their policy internally, I think what they're going to find is that eight and a half may be a low amount. They might want to settle all of these for eight and a half million if they can, because, I mean, what the thousands of victims would total would be, you know, near what we heard from the Catholic Church when they were in the middle of all of their problems with moving priests around who had done things to children. This is of that scale...
If it turns out that those allegations are proven. And this is something, you know... I should take a quick step back into the history of Consumers' Research with Uber. When Roe v. Wade was overturned, there was a whole rush by companies to weigh in on that political issue. Part of this was, again, driven internally: you had woke activists within the companies who wanted to push back and weigh in on the side of abortion. But, you know, Uber in particular was sort of out in front of saying that they would help people cross state lines in order to receive abortions that would be illegal in the state where they currently reside. And this was a pretty incredible thing for a company to be saying, that they were going to aid and abet avoiding state laws. And so when we looked into the company, the first thing we found was that they had a significant problem with sexual assaults by their drivers that they weren't solving.

Speaker 3: And this was years ago.

Speaker 2: I think we hit them in like twenty twenty-two or twenty twenty-three.
I can't remember exactly when Roe v. Wade was overturned, but this has been years of problems that they've had. And you may recall they also had an issue where, it seems like, this was an attempt to maybe, without going down the road that I know we've talked about of removing some of these drivers (for whatever reason, they didn't want to do that)... You recall there was a sort of widely mocked policy about a year ago where Uber was going to allow you to choose the sex of your driver, so you could choose a woman. But of course that runs them headlong into the far-left progressive trap of, okay, well, then how are you defining a woman? Because somebody could just put that as a check mark.
And, you know, unless Uber is going to define a woman as, you know, medical science does, by their chromosomes and their genitalia, then there's nothing to stop all of these drivers from simply selecting into that program, maybe actually making the problem worse. Because maybe a woman at a particularly vulnerable moment, maybe she's inebriated coming out of a bar or something like that, wants to make sure that she's with a female driver, and so they select a female driver, and they get a man pretending to be a woman or saying that he's a woman or something like that. So again, this seems like an issue that Uber has been grappling with for years. But instead of just dealing with the issue directly and saying, "We are going to take these allegations very seriously, we are going to report them to the police, have them properly investigated, and we're going to remove drivers that have, you know, substantial or, you know, coherent, plausible allegations against them for sexual assault," they have tried all of this woke signaling instead. And that's a really troubling thing for this company, to tell you the truth.
I wish I could sit here and tell you why, and that I fully understand it.

Speaker 3: I don't.

Speaker 2: It's just that they have done it, and it's been a repeated pattern for years.

Speaker 1: Let's take a quick commercial break. We'll continue next on the Tudor Dixon Podcast. They must have known that this was coming, though. I mean, as you said, you were pointing this out. It's weird to me, because I look at this and I think: you've seen yellow cabs and, you know, taxi companies all across the United States, all across the world, actually. There are taxi companies in Europe; there are taxi companies all over, right? So there have to be some regulations that you can kind of model yours off of, where they're saying, okay, this is our business, we have to keep people safe. It's shocking to me as someone who comes from a family business where safety was the top concern. Every meeting was opened with a safety report. Every time we met with people, we talked about safety first.
Everything across the walls of the shop was about safety. How can this not be a top priority? Wouldn't there be all kinds of background checks for every driver? And I say this as someone, like I said, who uses the service and thinks I'm safe because they've done their homework to make sure that when they pick me up at point A, they get me to point B safely. That should be your number one priority. How does this happen?

Speaker 3: It's a great question.

Speaker 2: I think this goes into somewhat of the Silicon Valley mentality. You know, it's sort of a cliché now, but the quote "move fast and break things" used to typify Silicon Valley around the time that Uber was coming up, and we saw that. I'm a longtime DC resident, and I saw that DC was one of the first cities where Uber deployed. And at the time, what they were doing was basically illegal.
They were running an illegal taxi company outside of the regulations of the District, which, in their defense, were cumbersome and draconian and had no space for innovation or any kind of improvement on the service. But their solution to that was not to lobby for changes or to try and fix it. They simply ran the program illegally, and when people would get pulled over and fined for this, they would pay the fines. And eventually they had enough of a customer base that they then used the Uber app to push their customers to call their aldermen and demand that they make changes in favor of Uber.

Speaker 3: I remember this.

Speaker 2: I was a very early adopter of Uber, and obviously the, you know, the efficiency of the system of hailing a car through a phone is, there's no doubt, better; it seems archaic now to wave at a car randomly from the side of the street. But it does sort of typify that mentality of: who cares if we're breaking the law, we're going to do it our way, whatever works.
And, you know, I have some admiration for that when they're pushing back against big-government regulations and sclerotic oversight. But it's very different when you are playing fast and loose with female security and the safety of a lot of women who apparently were sexually assaulted because they weren't doing their job. So I think it goes back to that move-fast-and-break-things Silicon Valley mentality.

Speaker 3: They'll just figure it out.

Speaker 2: So a lot of the background checks, and pretty much everything, is done digitally, and so there isn't, like, a face-to-face connection with these people. In most cases they're submitting documents online, which, obviously, we live in twenty twenty-six, so that's happening more and more often. But changing, you know, topics just ever so slightly, and I can't confirm this, I've just heard a lot of rumors of this.
One of the issues that the companies are now running into is that one person will sign up as a driver and go through the background check system, and then they will lend out their car and their phone to other people that are going to use the system. And to my knowledge, Uber has not been quick to fix this, and some of the other ones have allegedly not been quick to fix this either. Again, I have no direct knowledge of this; it's just what I'm hearing as I monitor this situation. And that introduces all kinds of other obvious problems, in which you may not have even been in the car with the driver that you thought you were, which, you know, obviously creates all kinds of...

Speaker 1: Safety issues, right? Because then you can't even report it, because you don't actually know who you were with. So just so the audience knows: Uber was ordered to pay eight and a half million dollars to a young woman. They were found liable for a twenty twenty-three sexual assault of a nineteen-year-old passenger. And you are...
I think this is everyone's kind of 336 00:17:09,560 --> 00:17:13,760 Speaker 1: worst case scenario, especially for people my age who have 337 00:17:13,840 --> 00:17:17,040 Speaker 1: kids that are going to college. You think that they're 338 00:17:17,080 --> 00:17:19,560 Speaker 1: doing the right thing, because she was drunk. So this 339 00:17:19,720 --> 00:17:23,879 Speaker 1: young woman was intoxicated, she was alone, she didn't have 340 00:17:23,920 --> 00:17:26,720 Speaker 1: a way home. This is what we tell our kids: 341 00:17:27,119 --> 00:17:29,800 Speaker 1: get a ride, don't get in a car. Get a ride, 342 00:17:29,800 --> 00:17:31,639 Speaker 1: don't get in a car with someone you don't know. 343 00:17:31,720 --> 00:17:35,080 Speaker 1: Get a safe ride. You assume that calling an Uber 344 00:17:35,200 --> 00:17:39,399 Speaker 1: is a safe ride. And then she was assaulted in 345 00:17:39,440 --> 00:17:40,640 Speaker 1: the backseat of the vehicle. 346 00:17:42,119 --> 00:17:45,720 Speaker 2: Yeah. And what's truly astonishing to me in some ways 347 00:17:45,920 --> 00:17:49,119 Speaker 2: is that there have been other safety-type incidents that 348 00:17:49,200 --> 00:17:51,639 Speaker 2: Uber has responded to. So, for example, there was a 349 00:17:51,680 --> 00:17:54,480 Speaker 2: problem for a while of people getting into the wrong car. 350 00:17:54,520 --> 00:17:57,719 Speaker 2: There was someone kidnapped and murdered because they got into 351 00:17:57,800 --> 00:18:00,199 Speaker 2: what they thought was their Uber, I believe at an 352 00:18:00,240 --> 00:18:03,080 Speaker 2: airport or maybe in front of a grocery store, and 353 00:18:03,560 --> 00:18:06,520 Speaker 2: unfortunately that person was not an Uber driver and simply 354 00:18:06,920 --> 00:18:10,080 Speaker 2: was kind of using that system as bait and took 355 00:18:10,119 --> 00:18:13,240 Speaker 2: the person.
Subsequently, sometimes when you get an Uber, and 356 00:18:13,280 --> 00:18:16,200 Speaker 2: I've sometimes heard about this on Lyft as well, you will 357 00:18:16,440 --> 00:18:19,640 Speaker 2: have a code on your phone, and when you get into 358 00:18:19,640 --> 00:18:21,439 Speaker 2: the car, you have to give that code to 359 00:18:21,560 --> 00:18:23,920 Speaker 2: the driver in order to make sure that you've matched 360 00:18:24,000 --> 00:18:26,000 Speaker 2: up correctly. I've seen this in areas where they've had 361 00:18:26,040 --> 00:18:29,160 Speaker 2: problems with people getting into the wrong cars. So what's 362 00:18:29,160 --> 00:18:32,080 Speaker 2: weird about this is, in some ways Uber has been responsive 363 00:18:32,119 --> 00:18:34,000 Speaker 2: to the safety of passengers, and so it leaves the 364 00:18:34,040 --> 00:18:37,680 Speaker 2: mystery of why this is. As you articulated earlier, it appears 365 00:18:37,880 --> 00:18:40,320 Speaker 2: that they aren't even fully investigating or reporting to police 366 00:18:40,440 --> 00:18:43,120 Speaker 2: some of these cases of alleged sexual assault. So why 367 00:18:43,240 --> 00:18:46,439 Speaker 2: is it that they take some things very seriously and 368 00:18:46,560 --> 00:18:49,600 Speaker 2: seem to be playing really fast and loose, and I 369 00:18:49,600 --> 00:18:53,520 Speaker 2: mean obviously incredibly recklessly, with the safety of women in 370 00:18:53,560 --> 00:18:55,880 Speaker 2: this other case? And the best I can figure out 371 00:18:56,000 --> 00:18:58,200 Speaker 2: is that this is somewhat of a case of wokeness, 372 00:18:58,200 --> 00:19:00,960 Speaker 2: that there's some area of this that they don't want 373 00:19:00,960 --> 00:19:02,800 Speaker 2: to have to touch or deal with.
As I hinted 374 00:19:02,840 --> 00:19:05,399 Speaker 2: at earlier, it may be a case... you know, 375 00:19:05,520 --> 00:19:07,679 Speaker 2: one of the things Uber does is they recruit heavily 376 00:19:07,720 --> 00:19:11,760 Speaker 2: in immigrant communities. You notice that in any major city, 377 00:19:11,880 --> 00:19:15,399 Speaker 2: especially over periods of time, you will see overwhelming numbers 378 00:19:15,400 --> 00:19:19,680 Speaker 2: of drivers coming from the same nationhood, immigrants from the 379 00:19:19,680 --> 00:19:22,600 Speaker 2: same countries. And I'm just wondering, I have no 380 00:19:22,640 --> 00:19:24,520 Speaker 2: proof of this, but I'm wondering if some of the 381 00:19:24,560 --> 00:19:27,119 Speaker 2: issues that they've had would have led to having to 382 00:19:28,000 --> 00:19:30,520 Speaker 2: show that there was a disproportionate number of these complaints 383 00:19:30,600 --> 00:19:33,640 Speaker 2: or incidents coming from certain communities, and the wokeness 384 00:19:33,640 --> 00:19:36,600 Speaker 2: within the company, their DEI programs, or just their own 385 00:19:36,640 --> 00:19:41,359 Speaker 2: political views, you know, has kept them from really solving that.
386 00:19:41,400 --> 00:19:43,480 Speaker 2: And that's something we've seen. As I mentioned, you know, 387 00:19:43,520 --> 00:19:46,440 Speaker 2: we've been on this beat of pushing back against companies 388 00:19:46,440 --> 00:19:49,320 Speaker 2: going woke for about six years now, and this has 389 00:19:49,359 --> 00:19:52,280 Speaker 2: been a repeated pattern we've seen with so many 390 00:19:52,280 --> 00:19:56,280 Speaker 2: corporations, where they have a real problem with Chinese slave labor, 391 00:19:56,320 --> 00:20:00,840 Speaker 2: with sexual assaults by their drivers, with antitrust violations, 392 00:20:01,160 --> 00:20:05,000 Speaker 2: and instead of fixing the issue, they virtue signal and 393 00:20:05,040 --> 00:20:07,360 Speaker 2: go woke in order to hide the issue. 394 00:20:07,520 --> 00:20:09,400 Speaker 1: Because, yeah, I was going to say, you brought up 395 00:20:09,480 --> 00:20:13,200 Speaker 1: the... so obviously, this assault was twenty twenty three. 396 00:20:13,720 --> 00:20:16,440 Speaker 1: You'd imagine that they've probably heard of quite a few 397 00:20:16,480 --> 00:20:19,600 Speaker 1: other assaults and had quite a few complaints before that. 398 00:20:20,040 --> 00:20:24,200 Speaker 1: So twenty twenty two was when Roe was overturned. This 399 00:20:24,240 --> 00:20:27,800 Speaker 1: is a big deal. Everybody's talking about it. It becomes 400 00:20:27,840 --> 00:20:30,960 Speaker 1: bigger and bigger as states start to put in different 401 00:20:31,080 --> 00:20:33,119 Speaker 1: rules saying, okay, we're going to have a limit up 402 00:20:33,160 --> 00:20:34,720 Speaker 1: to this number of weeks, or we're going to have 403 00:20:34,760 --> 00:20:37,520 Speaker 1: no abortion.
And that's when you started to see Uber 404 00:20:37,600 --> 00:20:39,520 Speaker 1: coming out and saying, you know what, we're going to 405 00:20:39,560 --> 00:20:43,760 Speaker 1: help you break the law, which seemed really weird. Like, okay, 406 00:20:43,840 --> 00:20:47,199 Speaker 1: you have a problem within your company, so it's like, 407 00:20:47,800 --> 00:20:51,360 Speaker 1: bright shiny thing over here, look away from that. You say, we'll 408 00:20:51,440 --> 00:20:56,639 Speaker 1: help you commit a crime. That seems weird, right? And I 409 00:20:56,640 --> 00:20:57,520 Speaker 3: should mention here, 410 00:20:57,720 --> 00:20:59,520 Speaker 2: one of the things we brought up in our tweet 411 00:20:59,520 --> 00:21:04,800 Speaker 2: thread earlier last week is there were emails. They 412 00:21:04,800 --> 00:21:06,919 Speaker 2: call them, you know, we have the Epstein files right now; well, 413 00:21:06,920 --> 00:21:09,600 Speaker 2: these were the Uber files. Somebody hacked them, or somehow, and 414 00:21:09,960 --> 00:21:12,040 Speaker 2: it may have been through discovery, there were a whole 415 00:21:12,040 --> 00:21:15,040 Speaker 2: bunch of emails from internally at Uber, while they were 416 00:21:15,040 --> 00:21:17,280 Speaker 2: dealing with this problem, that were released. And what was 417 00:21:17,359 --> 00:21:24,080 Speaker 2: really scary is the glib and almost celebratory nature of how the 418 00:21:24,800 --> 00:21:28,000 Speaker 2: people that were handling this issue within Uber, this issue 419 00:21:28,000 --> 00:21:30,840 Speaker 2: of people suing them for these sexual assaults, how they 420 00:21:30,880 --> 00:21:34,199 Speaker 2: described their interactions with the media, and the way they 421 00:21:34,240 --> 00:21:36,800 Speaker 2: were talking about these victims. One of the quotes is, 422 00:21:36,800 --> 00:21:39,600 Speaker 2: "I just trashed one of the victims in 423 00:21:39,720 --> 00:21:42,640 Speaker 2: USA Today."
One of the people that worked at Uber 424 00:21:42,680 --> 00:21:46,120 Speaker 2: that was handling, you know, the fallout from this said 425 00:21:46,119 --> 00:21:47,920 Speaker 2: that they used to feel like they had a soul, 426 00:21:48,040 --> 00:21:51,000 Speaker 2: but now they're not so sure anymore. And that was 427 00:21:51,080 --> 00:21:54,480 Speaker 2: directly germane to the way that they were 428 00:21:54,560 --> 00:21:59,480 Speaker 2: attacking their customers who had been victimized by their drivers. So, 429 00:21:59,680 --> 00:22:03,639 Speaker 2: like I said, it's an astonishing, you know, breach of trust, 430 00:22:04,119 --> 00:22:05,720 Speaker 2: and really it's scary. 431 00:22:05,720 --> 00:22:08,159 Speaker 3: I mean, why not just fix the problem? Why not? 432 00:22:08,240 --> 00:22:09,760 Speaker 3: I mean, who wants... you know, women... 433 00:22:10,040 --> 00:22:11,919 Speaker 1: I was going to say, how common is it that 434 00:22:12,000 --> 00:22:14,399 Speaker 1: you see a company that has a big problem and 435 00:22:14,440 --> 00:22:19,120 Speaker 1: they just put out a giant marketing campaign to distract 436 00:22:19,119 --> 00:22:19,560 Speaker 1: you from it? 437 00:22:20,080 --> 00:22:23,200 Speaker 2: Yeah, unfortunately, much more common than you'd like to think. 438 00:22:23,320 --> 00:22:25,320 Speaker 2: And a lot of times it's not so much... people 439 00:22:25,320 --> 00:22:28,399 Speaker 2: tend to think of corporations as singular things, like Uber 440 00:22:28,440 --> 00:22:31,240 Speaker 2: thinks this or Uber thinks that. If that was the case, 441 00:22:31,240 --> 00:22:33,600 Speaker 2: it would probably be cheaper for Uber to just fix 442 00:22:33,640 --> 00:22:36,159 Speaker 2: the problem.
The issue is that all of these companies 443 00:22:36,200 --> 00:22:39,480 Speaker 2: are made up of individual people, and they are called 444 00:22:39,480 --> 00:22:42,720 Speaker 2: to account for various problems or successes or, you know, 445 00:22:42,760 --> 00:22:45,680 Speaker 2: what have you. And so when there's an issue, if 446 00:22:45,680 --> 00:22:48,280 Speaker 2: it's you, if it's your career that's going to get destroyed, 447 00:22:48,359 --> 00:22:50,879 Speaker 2: you have a strong incentive to simply cover up the 448 00:22:50,960 --> 00:22:55,080 Speaker 2: problem instead of fix it, because the consequences of the 449 00:22:55,080 --> 00:22:56,920 Speaker 2: problem having been on your watch could be very high. 450 00:22:56,960 --> 00:22:58,600 Speaker 2: And this goes all the way up, sometimes, to 451 00:22:58,640 --> 00:23:00,960 Speaker 2: the CEOs themselves, and so it's one of the real 452 00:23:01,000 --> 00:23:04,960 Speaker 2: issues with corporate governance. And you know, we probably don't 453 00:23:05,000 --> 00:23:06,239 Speaker 2: have time to get into this, but one of our 454 00:23:06,240 --> 00:23:08,879 Speaker 2: big things we've pushed back on is BlackRock and 455 00:23:08,920 --> 00:23:11,000 Speaker 2: this so-called ESG movement. Well, the G in ESG 456 00:23:11,040 --> 00:23:13,880 Speaker 2: is supposed to stand for governance, but instead of focusing 457 00:23:13,880 --> 00:23:15,520 Speaker 2: on how do you solve that problem, how do you 458 00:23:15,520 --> 00:23:18,080 Speaker 2: solve the issue of making sure that corporations, when there 459 00:23:18,119 --> 00:23:20,240 Speaker 2: are problems, deal with them in the right way, one that 460 00:23:20,280 --> 00:23:24,800 Speaker 2: doesn't get more passengers in Ubers sexually assaulted, how do 461 00:23:24,840 --> 00:23:26,760 Speaker 2: you make sure that people feel like they can bring 462 00:23:26,800 --> 00:23:29,480 Speaker 2: these issues up and fix them.
They focus on DEI, 463 00:23:29,680 --> 00:23:31,680 Speaker 2: on race- and sex-based quotas, more of the same 464 00:23:31,720 --> 00:23:33,720 Speaker 2: wokeness that they're using to try and cover the problem. 465 00:23:33,760 --> 00:23:38,040 Speaker 2: And in fact, one of the real refuges of corporate 466 00:23:38,119 --> 00:23:42,480 Speaker 2: charlatans is to run to ESG and wokeness to hide 467 00:23:42,520 --> 00:23:45,000 Speaker 2: when they've had bad quarters, bad profit earnings, they've had 468 00:23:45,000 --> 00:23:47,359 Speaker 2: a big blow-up, like what Uber is going through. 469 00:23:47,400 --> 00:23:50,600 Speaker 2: So, weirdly, this movement that was supposed to... just 470 00:23:50,640 --> 00:23:52,720 Speaker 2: like wokeness, right, it's supposed to be about virtue and 471 00:23:52,840 --> 00:23:54,920 Speaker 2: treating people great, and what does it do? It covers 472 00:23:55,000 --> 00:23:59,760 Speaker 2: up industrial levels of sexual assault. In the ESG movement, the G 473 00:24:00,280 --> 00:24:02,520 Speaker 2: is supposed to be all about good governance, and instead 474 00:24:02,600 --> 00:24:05,840 Speaker 2: it is the refuge for bad executives who are doing wrong, 475 00:24:05,880 --> 00:24:07,480 Speaker 2: and they run to the ESG movement and run to 476 00:24:07,480 --> 00:24:09,679 Speaker 2: people like BlackRock to cover up their issues. 477 00:24:09,800 --> 00:24:12,479 Speaker 1: Let's take a quick commercial break. We'll continue next on 478 00:24:12,480 --> 00:24:18,879 Speaker 1: the Tudor Dixon Podcast. Do you still see wokeness as 479 00:24:19,000 --> 00:24:24,600 Speaker 1: a popular method of doing business?
I feel like since 480 00:24:25,520 --> 00:24:28,119 Speaker 1: this past election, a lot of people have been saying, 481 00:24:28,440 --> 00:24:30,000 Speaker 1: you know, a lot of people have turned against the 482 00:24:30,320 --> 00:24:34,200 Speaker 1: men in women's sports, the surgeries for children, a lot 483 00:24:34,200 --> 00:24:37,280 Speaker 1: of these things where they were standing up and saying, 484 00:24:37,800 --> 00:24:40,800 Speaker 1: this is who we are, we are willing to 485 00:24:41,000 --> 00:24:46,360 Speaker 1: sacrifice these children, sacrifice our morals, do all of these 486 00:24:46,400 --> 00:24:50,280 Speaker 1: things so that we can meet the far leftist progressives. 487 00:24:50,520 --> 00:24:52,800 Speaker 1: Do you feel like there's kind of been a 488 00:24:52,840 --> 00:24:55,119 Speaker 1: hit on that, that people have turned against that? I 489 00:24:55,119 --> 00:24:57,119 Speaker 1: mean, a lot of these clinics have had to close. 490 00:24:57,560 --> 00:25:00,919 Speaker 1: A lot of these politicians that were pro boys in girls' 491 00:25:00,960 --> 00:25:04,440 Speaker 1: sports have lost their elections. Is that turning at all? 492 00:25:05,080 --> 00:25:07,919 Speaker 3: Yeah, I would say it absolutely is waning. 493 00:25:08,480 --> 00:25:11,240 Speaker 2: You've had a lot of shifts in behavior across corporate 494 00:25:11,280 --> 00:25:14,680 Speaker 2: America over the last two to three years, especially as 495 00:25:14,680 --> 00:25:17,080 Speaker 2: sort of the US and others have waged war on 496 00:25:17,160 --> 00:25:21,480 Speaker 2: woke and ESG. I will say this: unfortunately, the ESG 497 00:25:21,680 --> 00:25:24,600 Speaker 2: and the woke capitalism movement put a lot of crazies 498 00:25:24,640 --> 00:25:28,399 Speaker 2: and activists and far left radicals inside these companies.
And 499 00:25:28,480 --> 00:25:30,919 Speaker 2: while those people might not be bragging in their annual 500 00:25:30,960 --> 00:25:32,560 Speaker 2: reports about their activities 501 00:25:32,119 --> 00:25:33,240 Speaker 3: anymore, they're already there. 502 00:25:33,320 --> 00:25:36,000 Speaker 2: They are. They're already there, and their goal now is 503 00:25:36,040 --> 00:25:38,359 Speaker 2: to keep their jobs and continue on a lot of 504 00:25:38,359 --> 00:25:41,240 Speaker 2: this behavior and just not brag about it as much. So, unfortunately, 505 00:25:41,320 --> 00:25:43,800 Speaker 2: for our movement to push back against this, our last easy 506 00:25:43,840 --> 00:25:45,679 Speaker 2: day was yesterday, because that was when they put it 507 00:25:45,720 --> 00:25:48,120 Speaker 2: on the masthead, right at the front. We 508 00:25:48,160 --> 00:25:50,960 Speaker 2: now have to chisel these people out of these companies 509 00:25:51,119 --> 00:25:54,040 Speaker 2: and push on these companies to focus on serving their consumers, 510 00:25:54,560 --> 00:25:57,240 Speaker 2: not on woke activists and woke politicians. Because when you do, 511 00:25:57,359 --> 00:26:00,439 Speaker 2: when you focus on activists and politicians, what happened at 512 00:26:00,480 --> 00:26:01,280 Speaker 2: Uber is what you get. 513 00:26:02,000 --> 00:26:05,119 Speaker 1: That's interesting. I think, so often when we're in 514 00:26:05,160 --> 00:26:07,879 Speaker 1: the political world, we're so deep in it 515 00:26:07,880 --> 00:26:10,359 Speaker 1: that we are focused on the election and the person 516 00:26:10,480 --> 00:26:14,479 Speaker 1: and the individual and how they can affect government. You 517 00:26:14,520 --> 00:26:17,800 Speaker 1: are looking at a totally different picture.
So this, honestly, 518 00:26:17,920 --> 00:26:22,080 Speaker 1: this report on Uber was stunning to me, because it's 519 00:26:22,200 --> 00:26:25,479 Speaker 1: not something I think of. I think of what the 520 00:26:25,520 --> 00:26:28,880 Speaker 1: policies are, rather than what's happening when I get into 521 00:26:28,880 --> 00:26:31,879 Speaker 1: that vehicle. Give us a little insight. What else should 522 00:26:31,960 --> 00:26:35,320 Speaker 1: we be aware of? What else are you guys working on? 523 00:26:36,520 --> 00:26:39,160 Speaker 2: Well, like I said, we are full spectrum attacking ESG. 524 00:26:39,720 --> 00:26:42,280 Speaker 2: ESG is sort of the disease, and then woke capitalism 525 00:26:42,280 --> 00:26:43,600 Speaker 2: is sort of the symptoms. So if you look at 526 00:26:43,640 --> 00:26:46,480 Speaker 2: the cause of the disease, it's big asset managers like 527 00:26:46,560 --> 00:26:50,200 Speaker 2: BlackRock, State Street, and Vanguard. It's big banks and 528 00:26:50,320 --> 00:26:53,160 Speaker 2: big insurers that are using the market share that they 529 00:26:53,200 --> 00:26:56,200 Speaker 2: have, sort of this bottleneck of financial services, to push 530 00:26:56,240 --> 00:26:57,760 Speaker 2: wokeness across 531 00:26:57,480 --> 00:27:00,440 Speaker 3: all of corporate America, even companies that otherwise wouldn't do it. 532 00:27:00,720 --> 00:27:02,879 Speaker 2: So we're going to have to have that same mentality 533 00:27:02,880 --> 00:27:05,600 Speaker 2: on our side, push back against companies. And like you said, 534 00:27:06,359 --> 00:27:09,280 Speaker 2: government is great, and government can be used to push 535 00:27:09,359 --> 00:27:12,080 Speaker 2: back on these companies for violating their fiduciary duty, engaging 536 00:27:12,320 --> 00:27:15,040 Speaker 2: in securities fraud, and a host of other things.
But we 537 00:27:15,080 --> 00:27:18,720 Speaker 2: as consumers should be pushing back on these companies when 538 00:27:18,720 --> 00:27:20,560 Speaker 2: they do that. And I want to leave everyone with 539 00:27:20,560 --> 00:27:23,240 Speaker 2: a good note. Obviously this is an extremely morose and morbid 540 00:27:23,280 --> 00:27:27,919 Speaker 2: topic, with women having been affected, but consumers should not 541 00:27:28,200 --> 00:27:33,040 Speaker 2: feel hopeless or powerless or weak. There has been a 542 00:27:33,280 --> 00:27:36,080 Speaker 2: monumental shift in tone in corporate America just over the 543 00:27:36,160 --> 00:27:39,320 Speaker 2: last five to six years, and that is because consumers 544 00:27:39,640 --> 00:27:43,560 Speaker 2: have been pushing back. Our campaigns don't mean anything 545 00:27:43,560 --> 00:27:45,639 Speaker 2: on their own. These companies are afraid of two things: they're afraid 546 00:27:45,640 --> 00:27:48,399 Speaker 2: of consumers changing their behaviors, and they're afraid of government 547 00:27:48,440 --> 00:27:51,560 Speaker 2: changing the rules and bringing enforcement actions against these companies. 548 00:27:51,760 --> 00:27:54,000 Speaker 2: And I'm happy to say that our movement is getting 549 00:27:54,040 --> 00:27:56,600 Speaker 2: better and better at pushing back. And I'll just leave 550 00:27:56,640 --> 00:27:59,840 Speaker 2: one note, one bit of advice. If you see a 551 00:27:59,880 --> 00:28:02,560 Speaker 2: company signaling that they are woke... we had one, 552 00:28:02,720 --> 00:28:05,879 Speaker 2: Redfin, the Zillow competitor, that just had an ad that 553 00:28:05,920 --> 00:28:08,800 Speaker 2: was super woke during the Super Bowl. When you see 554 00:28:08,840 --> 00:28:12,480 Speaker 2: a company pushing a woke agenda, they always have skeletons 555 00:28:12,480 --> 00:28:14,280 Speaker 2: in their closet.
That has been the iron law of 556 00:28:14,280 --> 00:28:17,040 Speaker 2: every company we've looked into. It's crazy: a company signals 557 00:28:17,080 --> 00:28:19,880 Speaker 2: they are woke, and they are mistreating their customers, or they're 558 00:28:19,960 --> 00:28:22,560 Speaker 2: using slave labor, or they're in bed with the communist Chinese. 559 00:28:22,640 --> 00:28:24,840 Speaker 2: I don't know what it is, but one hundred times 560 00:28:24,840 --> 00:28:26,960 Speaker 2: out of one hundred, when we've looked into a company 561 00:28:27,000 --> 00:28:30,680 Speaker 2: signaling wokeness, they've had serious, serious issues with their business models. 562 00:28:30,720 --> 00:28:32,960 Speaker 2: So the more you see wokeness, the more you should 563 00:28:32,960 --> 00:28:34,119 Speaker 2: probably stay away from the company. 564 00:28:35,560 --> 00:28:37,520 Speaker 1: But we have had some effect on some of these 565 00:28:37,520 --> 00:28:42,120 Speaker 1: companies, because I remember Target and the whole Pride Month 566 00:28:42,760 --> 00:28:44,960 Speaker 1: debacle, and people said, you know what, I don't want 567 00:28:44,960 --> 00:28:47,960 Speaker 1: these swimming suits in with my daughter's swimming suits. And 568 00:28:48,080 --> 00:28:50,480 Speaker 1: the next year things were a lot different at Target. 569 00:28:50,520 --> 00:28:53,840 Speaker 1: So do we have consumer power still? 570 00:28:53,840 --> 00:28:56,200 Speaker 2: One hundred percent. And as I said before, remember these 571 00:28:56,240 --> 00:28:59,440 Speaker 2: corporations are never one human being making one single decision; 572 00:29:00,240 --> 00:29:03,440 Speaker 2: they're made up of different factions.
And when consumers push back, like 573 00:29:03,480 --> 00:29:05,920 Speaker 2: they did with Target, like they did with Budweiser, it 574 00:29:06,000 --> 00:29:09,760 Speaker 2: actually enables the voice of sanity in these companies to 575 00:29:09,800 --> 00:29:12,000 Speaker 2: come to the fore and go after the people: the head of DEI, 576 00:29:12,040 --> 00:29:12,680 Speaker 2: the head of marketing. 577 00:29:12,760 --> 00:29:15,560 Speaker 3: Those are the two big people that usually push this 578 00:29:15,640 --> 00:29:17,160 Speaker 3: kind of nonsense. 579 00:29:17,480 --> 00:29:20,560 Speaker 2: Those people should be eliminated in almost every corporate setting. 580 00:29:20,600 --> 00:29:24,920 Speaker 2: They're usually complete idiots. But when consumers push back, 581 00:29:25,360 --> 00:29:27,920 Speaker 2: it enables those people to say, listen, this is clearly not 582 00:29:28,040 --> 00:29:29,720 Speaker 2: good for our business. I don't know what excuse 583 00:29:29,760 --> 00:29:31,920 Speaker 2: you used to try and get us into this, but 584 00:29:32,680 --> 00:29:35,040 Speaker 2: you have hurt the company, and your voice needs to 585 00:29:35,240 --> 00:29:37,640 Speaker 2: go away for a while. We saw this when we went 586 00:29:37,640 --> 00:29:40,640 Speaker 2: after a company called State Farm Insurance.
They were pushing 587 00:29:40,680 --> 00:29:44,320 Speaker 2: transgender ideology books on kids as young as five, and 588 00:29:44,400 --> 00:29:46,640 Speaker 2: it turned out that their head of DEI 589 00:29:47,160 --> 00:29:50,440 Speaker 2: had, without really alerting the CEO to this, partnered with 590 00:29:50,480 --> 00:29:53,320 Speaker 2: something called the Gender Cool Project, whose whole aim is 591 00:29:53,360 --> 00:29:55,480 Speaker 2: to put these books, and it says right on the 592 00:29:55,520 --> 00:30:00,640 Speaker 2: book, kids five plus, into public schools using State Farm agents, 593 00:30:00,680 --> 00:30:02,840 Speaker 2: and a whistleblower came to us and we hit them. 594 00:30:02,880 --> 00:30:05,000 Speaker 2: And as it turns out from, you know, what's come 595 00:30:05,000 --> 00:30:08,440 Speaker 2: out about that issue since, again, this was one 596 00:30:08,560 --> 00:30:13,280 Speaker 2: small radical transgender ideology faction within the company that 597 00:30:13,360 --> 00:30:17,640 Speaker 2: had done this, and it changed radically the culture within 598 00:30:17,680 --> 00:30:21,240 Speaker 2: that company. The CEO apologized. And so, again, 599 00:30:21,320 --> 00:30:23,040 Speaker 2: this wasn't an issue where everyone at State Farm was 600 00:30:23,080 --> 00:30:26,920 Speaker 2: in on it. It was a small radical minority that 601 00:30:27,000 --> 00:30:32,360 Speaker 2: had been enabled by neglect to engage in this. 602 00:30:32,400 --> 00:30:33,920 Speaker 2: And that's where it is at a lot of companies, and 603 00:30:33,960 --> 00:30:36,800 Speaker 2: so by pushing back on these companies, you're actually aiding our 604 00:30:36,840 --> 00:30:39,240 Speaker 2: allies inside these companies to clean them up.
605 00:30:39,800 --> 00:30:41,920 Speaker 1: I mean, look, people didn't like how they were going 606 00:30:42,000 --> 00:30:45,040 Speaker 1: to redecorate Cracker Barrel, and they said no. You know, 607 00:30:45,280 --> 00:30:48,239 Speaker 1: so we obviously can have an effect. We just have 608 00:30:48,320 --> 00:30:52,680 Speaker 1: to be loud, and, you know, money talks. So I 609 00:30:52,720 --> 00:30:55,720 Speaker 1: think we can make a difference when these things start 610 00:30:55,800 --> 00:30:59,160 Speaker 1: to happen. But it's been fascinating talking to you. I 611 00:30:59,680 --> 00:31:03,160 Speaker 1: will still use Uber, but maybe I'll just use it 612 00:31:03,200 --> 00:31:05,120 Speaker 1: when I'm in a group of three. I don't know, 613 00:31:05,200 --> 00:31:07,560 Speaker 1: I feel like I need to have more people with me. 614 00:31:07,680 --> 00:31:10,040 Speaker 1: It seems like that was another thing that I had 615 00:31:10,080 --> 00:31:12,240 Speaker 1: read about those assaults: it seemed like it was 616 00:31:12,280 --> 00:31:14,560 Speaker 1: generally someone that was alone, because you obviously don't want 617 00:31:14,560 --> 00:31:17,120 Speaker 1: to take on more than one person. So maybe that's 618 00:31:17,200 --> 00:31:19,640 Speaker 1: just the message to our daughters: if you're going 619 00:31:19,720 --> 00:31:22,000 Speaker 1: to take a ride share, if you're going to use one 620 00:31:22,040 --> 00:31:23,920 Speaker 1: of those ride share apps, go with a friend. 621 00:31:24,600 --> 00:31:26,040 Speaker 3: Great advice, great advice. 622 00:31:26,920 --> 00:31:29,520 Speaker 1: Will Held, thank you so much for coming on the podcast. 623 00:31:29,680 --> 00:31:32,400 Speaker 1: Thanks for having me. Absolutely. And thank you all for 624 00:31:32,480 --> 00:31:35,320 Speaker 1: joining the Tudor Dixon Podcast.
For this episode and others, 625 00:31:35,320 --> 00:31:38,080 Speaker 1: go to TudorDixonPodcast dot com, or you can 626 00:31:38,120 --> 00:31:41,440 Speaker 1: subscribe on the iHeartRadio app, Apple Podcasts, or wherever you 627 00:31:41,480 --> 00:31:44,080 Speaker 1: get your podcasts. You can also watch it on YouTube 628 00:31:44,160 --> 00:31:46,680 Speaker 1: or Rumble at Tudor Dixon, but make sure you join 629 00:31:46,800 --> 00:31:47,960 Speaker 1: us and have a blessed day.