1 00:00:15,356 --> 00:00:24,436 Speaker 1: Pushkin. I recently talked with Frances Frei and Anne Morriss. 2 00:00:24,756 --> 00:00:27,636 Speaker 2: I'm Frances Frei. I'm a professor at the Harvard Business School, 3 00:00:28,556 --> 00:00:30,956 Speaker 2: and I'm married to Anne. I'm Anne Morriss. 4 00:00:30,996 --> 00:00:35,076 Speaker 3: I'm a company builder and a leadership coach, and the 5 00:00:35,156 --> 00:00:39,356 Speaker 3: marriage is mutual and consensual. Thank goodness, we're married to 6 00:00:39,356 --> 00:00:39,756 Speaker 3: each other. 7 00:00:40,276 --> 00:00:43,236 Speaker 1: Frances and Anne also work together. They are the co- 8 00:00:43,316 --> 00:00:47,436 Speaker 1: founders of a training and consulting company called the Leadership Consortium. 9 00:00:47,836 --> 00:00:50,996 Speaker 1: They specialize in helping leaders build trust. They also co- 10 00:00:51,076 --> 00:00:55,396 Speaker 1: host a podcast called Fixable. And there is this particular 11 00:00:55,516 --> 00:00:58,236 Speaker 1: project that they worked on. In fact, it's the project 12 00:00:58,236 --> 00:01:01,276 Speaker 1: that inspired them to start their company. And I found 13 00:01:01,276 --> 00:01:04,756 Speaker 1: this project so surprising and so illuminating that I wanted 14 00:01:04,796 --> 00:01:06,716 Speaker 1: to have them on the show to talk about it. 15 00:01:11,756 --> 00:01:14,756 Speaker 1: I'm Jacob Goldstein. This is What's Your Problem? And Frances 16 00:01:14,876 --> 00:01:17,316 Speaker 1: and Anne are here today to tell the story of 17 00:01:17,356 --> 00:01:21,596 Speaker 1: their work with Uber. It starts back in twenty seventeen, 18 00:01:22,156 --> 00:01:25,836 Speaker 1: at what was maybe the lowest point for the company.
Frances 19 00:01:25,836 --> 00:01:28,516 Speaker 1: and Anne got involved when a Harvard Business School 20 00:01:28,556 --> 00:01:31,196 Speaker 1: alum who was working at Uber came to Frances and 21 00:01:31,236 --> 00:01:33,396 Speaker 1: said the company needed her help. 22 00:01:34,636 --> 00:01:37,036 Speaker 2: He said, will you come and meet my CEO, Travis 23 00:01:37,116 --> 00:01:41,196 Speaker 2: Kalanick? And I said no. My first instinct was no, 24 00:01:41,796 --> 00:01:44,476 Speaker 2: and I was like, I read the newspapers. It sounds terrible. 25 00:01:45,156 --> 00:01:48,436 Speaker 2: And then she said, please, as a personal favor, he 26 00:01:48,556 --> 00:01:51,156 Speaker 2: is not the person that you're reading about. Will you 27 00:01:51,196 --> 00:01:54,116 Speaker 2: come and meet with him? And so it was entirely 28 00:01:54,316 --> 00:01:57,716 Speaker 2: as a favor that I flew out to meet Travis. 29 00:01:57,796 --> 00:02:00,916 Speaker 1: And before you met him, what was your impression, 30 00:02:01,476 --> 00:02:03,636 Speaker 1: not having met him, of Travis Kalanick? 31 00:02:04,036 --> 00:02:08,236 Speaker 2: It sounded like he didn't care, was out of touch, 32 00:02:08,756 --> 00:02:13,836 Speaker 2: created a very horrific climate for women and not awesome 33 00:02:13,956 --> 00:02:17,596 Speaker 2: for others as well. So we help good people do 34 00:02:17,716 --> 00:02:21,756 Speaker 2: hard things, but we don't help bad people. And so 35 00:02:22,236 --> 00:02:23,516 Speaker 2: I was like, I'm not going to work with him, 36 00:02:23,516 --> 00:02:25,196 Speaker 2: but I'll come out and meet him as a favor 37 00:02:25,196 --> 00:02:25,436 Speaker 2: to you. 38 00:02:25,996 --> 00:02:28,036 Speaker 1: So you go fly out, and what happens? 39 00:02:28,196 --> 00:02:31,476 Speaker 2: Yes, it was a two hour meeting.
It lasted three days, 40 00:02:32,556 --> 00:02:43,436 Speaker 2: and I found him thoughtful, rigorous, open... I just adored him. 41 00:02:43,916 --> 00:02:45,516 Speaker 2: By the end, I was down for the count. 42 00:02:45,836 --> 00:02:48,756 Speaker 1: So that is truly surprising. 43 00:02:49,716 --> 00:02:53,676 Speaker 2: You're not kidding. Because he came across very... the two 44 00:02:53,716 --> 00:02:56,876 Speaker 2: things I really like are rigor and optimism, and he 45 00:02:56,956 --> 00:03:02,636 Speaker 2: was super rigorous and super optimistic. And then he said, 46 00:03:03,236 --> 00:03:06,516 Speaker 2: and I need help. The last company I ran had 47 00:03:06,596 --> 00:03:07,316 Speaker 2: eight people. 48 00:03:07,836 --> 00:03:09,076 Speaker 1: And how big is Uber at this point? 49 00:03:09,316 --> 00:03:10,196 Speaker 2: Thirteen thousand. 50 00:03:10,676 --> 00:03:14,476 Speaker 1: Wow. Okay, and so, so you're in. 51 00:03:14,876 --> 00:03:17,196 Speaker 2: Well, he was like, you know, you need to come 52 00:03:17,196 --> 00:03:18,916 Speaker 2: and work here full time. And I was like, I'm 53 00:03:18,956 --> 00:03:22,836 Speaker 2: never leaving Harvard, but I'll consult for you. He's like, no, no, no, 54 00:03:22,876 --> 00:03:26,236 Speaker 2: we need all of you. So I left there and 55 00:03:26,276 --> 00:03:29,236 Speaker 2: came home and asked... you know, told Anne about it 56 00:03:29,276 --> 00:03:32,316 Speaker 2: and was like, what do you think about my taking 57 00:03:32,316 --> 00:03:37,076 Speaker 2: a leave from HBS and going and working there full time? 58 00:03:37,476 --> 00:03:38,876 Speaker 1: And, Anne, what did you say? 59 00:03:39,196 --> 00:03:41,956 Speaker 3: That's a ridiculous idea. 60 00:03:42,836 --> 00:03:43,156 Speaker 1: Go on? 61 00:03:43,916 --> 00:03:47,436 Speaker 3: I mean, I think we sat with it for a 62 00:03:47,476 --> 00:03:50,476 Speaker 3: good week or so.
But where we got to is 63 00:03:50,516 --> 00:03:54,636 Speaker 3: if we can make it work at Uber, and if 64 00:03:54,676 --> 00:03:57,276 Speaker 3: we can make some of these ideas that we're kind 65 00:03:57,276 --> 00:04:01,236 Speaker 3: of batting around, if we can really test them at Uber, 66 00:04:01,316 --> 00:04:04,116 Speaker 3: and if they can work there, then they can work anywhere. 67 00:04:04,716 --> 00:04:08,916 Speaker 3: These kinds of issues tend to, and I don't think I'm 68 00:04:09,316 --> 00:04:14,516 Speaker 3: overstating it, paralyze some entrepreneurs who are building great 69 00:04:14,556 --> 00:04:18,676 Speaker 3: companies and get really stuck on this stuff, and so 70 00:04:18,716 --> 00:04:23,076 Speaker 3: the opportunity to show the world not only how to 71 00:04:23,076 --> 00:04:25,396 Speaker 3: make progress on these issues, but that you can make 72 00:04:25,436 --> 00:04:28,636 Speaker 3: progress on these issues was also super energizing. Yeah. 73 00:04:28,636 --> 00:04:32,076 Speaker 1: So okay, so you take the job, and like, what 74 00:04:32,276 --> 00:04:35,476 Speaker 1: is the problem you're setting out to fix at Uber? Or problems? 75 00:04:35,676 --> 00:04:39,956 Speaker 2: Yeah, so two sides. The problems at any organization are 76 00:04:40,076 --> 00:04:44,156 Speaker 2: always only two things: achievement and/or sentiment. 77 00:04:46,236 --> 00:04:49,156 Speaker 1: Is the business working and do people feel good about it? 78 00:04:49,196 --> 00:04:50,276 Speaker 1: Is that what that means? 79 00:04:50,876 --> 00:04:54,916 Speaker 2: Yes. And it had both problems. And the sentiment was 80 00:04:55,116 --> 00:05:01,076 Speaker 2: really rough, and the achievement was also... they were magnificent 81 00:05:01,116 --> 00:05:03,676 Speaker 2: in some ways, but this was the era when costs 82 00:05:03,716 --> 00:05:04,996 Speaker 2: were greater than revenue. 83 00:05:05,396 --> 00:05:06,356 Speaker 1: They were losing money.
84 00:05:06,516 --> 00:05:10,756 Speaker 2: Yeah. And that can seduce you into thinking all 85 00:05:10,876 --> 00:05:14,156 Speaker 2: kinds of things that are upside down are right side up. 86 00:05:14,436 --> 00:05:16,396 Speaker 2: And so that's where the strategy... 87 00:05:16,156 --> 00:05:18,956 Speaker 1: Going really fast, right? Is that complicated Silicon Valley thing 88 00:05:18,996 --> 00:05:22,196 Speaker 1: of, like, incredible growth, but like they're selling every dollar 89 00:05:22,236 --> 00:05:24,436 Speaker 1: for seventy five cents, and lots of people, it turns out, 90 00:05:24,676 --> 00:05:26,396 Speaker 1: will buy a dollar if it costs seventy five cents. 91 00:05:26,276 --> 00:05:28,636 Speaker 2: Yes, yeah. And I don't know if it's so complicated. 92 00:05:28,916 --> 00:05:32,116 Speaker 2: I don't think it's that complicated either. I think, I 93 00:05:32,156 --> 00:05:35,836 Speaker 2: think that if you get venture capitalists that are willing 94 00:05:35,916 --> 00:05:38,316 Speaker 2: to take their money and put it in the pockets 95 00:05:38,316 --> 00:05:42,436 Speaker 2: of riders, it turns out that can last a long time. 96 00:05:43,156 --> 00:05:45,196 Speaker 2: But that's what we went in to study, sort of, 97 00:05:45,236 --> 00:05:47,916 Speaker 2: when we talk about the achievement. 98 00:05:48,156 --> 00:05:51,076 Speaker 1: The achievement problem is they're losing money. What specifically was the 99 00:05:51,116 --> 00:05:51,996 Speaker 1: sentiment problem? 100 00:05:52,196 --> 00:05:57,516 Speaker 2: Oh goodness. It was a culture of... I'll give 101 00:05:57,556 --> 00:05:59,876 Speaker 2: you an example. If one person was going to give 102 00:05:59,916 --> 00:06:03,316 Speaker 2: someone else feedback at Uber, it was an arms race 103 00:06:03,356 --> 00:06:07,196 Speaker 2: of how cutting the feedback could be.
So they were mean. 104 00:06:08,036 --> 00:06:11,596 Speaker 2: They were, they were... yeah, there was a... but, but 105 00:06:11,796 --> 00:06:15,756 Speaker 2: not with the intention of cruelty. But yes, it came 106 00:06:15,796 --> 00:06:18,596 Speaker 2: across as cruel. Also, you know, a lot of 107 00:06:18,636 --> 00:06:21,076 Speaker 2: people had their first job here, and so when I 108 00:06:21,116 --> 00:06:23,836 Speaker 2: got there and I would talk to women in particular, 109 00:06:24,836 --> 00:06:29,316 Speaker 2: the climate that they were enduring was something like the following. 110 00:06:30,236 --> 00:06:34,996 Speaker 2: A woman's the only woman on an engineering team. She's working 111 00:06:35,036 --> 00:06:37,356 Speaker 2: really hard. She's going to stay late to work on something. 112 00:06:37,396 --> 00:06:39,836 Speaker 2: She asks a colleague if he'll stay late with her, 113 00:06:39,916 --> 00:06:43,116 Speaker 2: and he'll say, yeah, as long as you sleep with me. 114 00:06:44,596 --> 00:06:48,956 Speaker 2: She reacts, and he says, just kidding. You know, 115 00:06:49,156 --> 00:06:51,956 Speaker 2: like, the tyranny of just kidding. There was a lot 116 00:06:52,036 --> 00:06:57,596 Speaker 2: of just kidding there. And so no one had taught 117 00:06:57,676 --> 00:07:02,436 Speaker 2: anyone how to manage and how to lead. So... 118 00:07:02,476 --> 00:07:06,556 Speaker 2: I mean, there were a thousand problems, literally one thousand complaints, 119 00:07:07,356 --> 00:07:10,116 Speaker 2: and more than ninety percent of them had to do 120 00:07:10,156 --> 00:07:13,836 Speaker 2: with the interaction between someone and their manager. There were 121 00:07:13,876 --> 00:07:16,876 Speaker 2: three thousand managers, so either there were three thousand bad 122 00:07:16,916 --> 00:07:21,556 Speaker 2: people, or something systematic was going on and we were 123 00:07:21,596 --> 00:07:23,876 Speaker 2: not setting anyone up for success.
124 00:07:24,396 --> 00:07:28,036 Speaker 1: So when do you start wearing the Uber shirt every day? 125 00:07:28,996 --> 00:07:30,956 Speaker 2: So I got there, and I was... you know, Anne 126 00:07:30,996 --> 00:07:34,756 Speaker 2: and I were excited, and I was really proud of 127 00:07:34,796 --> 00:07:37,356 Speaker 2: the mission. I was proud of the work ethic. I 128 00:07:37,436 --> 00:07:42,476 Speaker 2: was proud of the intelligence of everyone, and so I 129 00:07:42,596 --> 00:07:44,436 Speaker 2: was proud to be there. And I got there and 130 00:07:44,996 --> 00:07:47,396 Speaker 2: everyone was ashamed to work there. There were all the 131 00:07:47,436 --> 00:07:52,476 Speaker 2: newspaper things, like Delete Uber, and there were videos on 132 00:07:52,516 --> 00:07:55,876 Speaker 2: the web of the CEO that weren't good, and so 133 00:07:56,556 --> 00:07:59,116 Speaker 2: people were embarrassed. When they got into an 134 00:07:59,156 --> 00:08:01,116 Speaker 2: Uber car, they wouldn't admit to the driver that 135 00:08:01,196 --> 00:08:01,956 Speaker 2: they worked at Uber. 136 00:08:03,076 --> 00:08:05,676 Speaker 1: Oh, so you're saying people who worked at Uber would 137 00:08:05,716 --> 00:08:07,996 Speaker 1: get into an Uber and wouldn't tell the driver they 138 00:08:07,996 --> 00:08:10,276 Speaker 1: worked at Uber, because the driver would think they're a bad 139 00:08:10,316 --> 00:08:11,756 Speaker 1: person for working at Uber. 140 00:08:12,036 --> 00:08:15,636 Speaker 2: Yes, they would. They stopped going to parties because the 141 00:08:15,636 --> 00:08:18,476 Speaker 2: topic of conversation was always Uber and it was embarrassing 142 00:08:18,516 --> 00:08:22,636 Speaker 2: for them. So they had real shame. And so what 143 00:08:22,796 --> 00:08:26,876 Speaker 2: I said is, I'm so proud of what we're all 144 00:08:26,916 --> 00:08:30,036 Speaker 2: intending to do.
I'm going to wear an Uber T- 145 00:08:30,156 --> 00:08:33,716 Speaker 2: shirt until everyone else regains their pride, because there used 146 00:08:33,756 --> 00:08:35,996 Speaker 2: to be pride in the organization. 147 00:08:36,156 --> 00:08:39,196 Speaker 1: And so you're wearing it Monday to Friday, you're wearing 148 00:08:39,236 --> 00:08:41,716 Speaker 1: it Saturday and Sunday. And tell me, tell me as 149 00:08:41,756 --> 00:08:42,836 Speaker 1: the partner of the person. 150 00:08:43,836 --> 00:08:46,876 Speaker 3: Yeah, this wasn't an ideal part of the commitment. I'm 151 00:08:46,916 --> 00:08:48,716 Speaker 3: not sure you had... I did not clear it... 152 00:08:48,756 --> 00:08:52,276 Speaker 3: this many moves ahead on a chessboard. Now, yeah, there 153 00:08:52,276 --> 00:08:54,876 Speaker 3: were a couple of awkward family moments. We went to 154 00:08:54,916 --> 00:08:58,756 Speaker 3: a fancy party where you had to dress up, and 155 00:08:58,996 --> 00:09:00,636 Speaker 3: Frances wore an Uber T-shirt. 156 00:09:00,716 --> 00:09:04,396 Speaker 2: Underneath a jacket. But indeed I did. No, I promised 157 00:09:04,436 --> 00:09:06,276 Speaker 2: to wear it, like, you could see it, like, ah, 158 00:09:06,716 --> 00:09:09,636 Speaker 2: for sure, for sure, the commitment was real. 159 00:09:12,916 --> 00:09:16,636 Speaker 1: So okay, we've got the shirt, we've got the problems. 160 00:09:17,236 --> 00:09:20,676 Speaker 1: After the break, solving the problems and getting to the 161 00:09:20,676 --> 00:09:24,516 Speaker 1: point where Frances finally feels like she can stop wearing 162 00:09:24,556 --> 00:09:34,836 Speaker 1: an Uber shirt every single day. Now back to the show. So, okay, 163 00:09:34,916 --> 00:09:37,836 Speaker 1: you got your shirt, you're fired up. What's the first 164 00:09:37,916 --> 00:09:41,556 Speaker 1: sort of key thing you do to try and make 165 00:09:42,276 --> 00:09:44,276 Speaker 1: people like Uber better?
166 00:09:44,916 --> 00:09:48,436 Speaker 2: Yes. So the first thing was to teach the entire 167 00:09:48,596 --> 00:09:51,916 Speaker 2: organization how to build and, more importantly in their case, 168 00:09:52,036 --> 00:09:54,596 Speaker 2: rebuild trust, and how to do it quickly. So in 169 00:09:54,636 --> 00:09:59,356 Speaker 2: the presence of trust, everything goes faster and higher. So 170 00:09:59,956 --> 00:10:04,796 Speaker 2: if you get trust, nobody's going to relitigate the decision afterwards, 171 00:10:04,796 --> 00:10:06,956 Speaker 2: you get to stay on one thing, and in the 172 00:10:06,996 --> 00:10:08,436 Speaker 2: absence of trust, people are going to ask you to 173 00:10:08,436 --> 00:10:11,916 Speaker 2: compromise way more than you should otherwise. So when we're 174 00:10:11,996 --> 00:10:15,036 Speaker 2: teaching about trust, what we're doing is delivering on the 175 00:10:15,076 --> 00:10:18,956 Speaker 2: promise that you can go faster and further. So our 176 00:10:18,996 --> 00:10:23,396 Speaker 2: collaboration will be better, our innovation will be better. In 177 00:10:23,436 --> 00:10:27,116 Speaker 2: the presence of trust, everything is better. In the absence 178 00:10:27,156 --> 00:10:32,556 Speaker 2: of trust, it's a miserable place to be. It starts, 179 00:10:32,636 --> 00:10:35,596 Speaker 2: it stops. We go one step forward, two steps back. 180 00:10:36,396 --> 00:10:40,596 Speaker 2: So trust, we find, is the foundation for 181 00:10:40,756 --> 00:10:41,836 Speaker 2: all human progress. 182 00:10:42,356 --> 00:10:46,316 Speaker 1: So that's the abstraction. What is a specific thing you 183 00:10:46,396 --> 00:10:47,316 Speaker 1: do to that end? 184 00:10:48,036 --> 00:10:52,036 Speaker 2: Well, you determine: is trust breaking down for one of 185 00:10:52,116 --> 00:10:55,196 Speaker 2: three reasons?
Turns out there's three pillars of trust, and 186 00:10:55,836 --> 00:10:59,916 Speaker 2: in Uber's case, the reason was empathy at every turn. 187 00:11:00,396 --> 00:11:03,836 Speaker 2: So you can be awesome, but if you're not empathetic, 188 00:11:04,076 --> 00:11:05,076 Speaker 2: we're not going to trust you. 189 00:11:05,316 --> 00:11:07,316 Speaker 1: And I mean, you say there's three pillars of trust, 190 00:11:07,596 --> 00:11:09,956 Speaker 1: I got to say, okay, if empathy is one, what 191 00:11:10,036 --> 00:11:10,596 Speaker 1: are the other two? 192 00:11:11,196 --> 00:11:14,116 Speaker 2: Logic and authenticity. 193 00:11:15,156 --> 00:11:17,156 Speaker 1: So if you think about it in terms of a 194 00:11:17,196 --> 00:11:20,676 Speaker 1: person's relationship to their manager, they are thinking, does what 195 00:11:20,716 --> 00:11:24,516 Speaker 1: my boss says make sense? Is it logical? Is my 196 00:11:24,636 --> 00:11:27,756 Speaker 1: boss a phony? Are they just giving me a smoke- 197 00:11:27,796 --> 00:11:30,636 Speaker 1: screen? That's authenticity. And does my boss care about me? 198 00:11:30,676 --> 00:11:33,676 Speaker 3: I mean, it's not exactly... okay, and care about my 199 00:11:33,796 --> 00:11:36,356 Speaker 3: success in this job. 200 00:11:37,436 --> 00:11:42,996 Speaker 1: And so the people at Uber, in your estimation, in 201 00:11:43,036 --> 00:11:48,076 Speaker 1: your finding, were logical and authentic. They just didn't care 202 00:11:48,116 --> 00:11:49,836 Speaker 1: about the people they worked with. 203 00:11:50,276 --> 00:11:53,516 Speaker 2: And it was for the employees, it was also for 204 00:11:53,556 --> 00:12:00,636 Speaker 2: the regulators, it was also for the investors, so they are... 205 00:12:00,556 --> 00:12:03,956 Speaker 1: Not caring about the regulators
was sort of Uber's brand, right? 206 00:12:04,036 --> 00:12:05,996 Speaker 1: Like, that was, I don't quite want to say, their 207 00:12:05,996 --> 00:12:08,796 Speaker 1: secret sauce, but kind of, right? Well, it was going 208 00:12:08,836 --> 00:12:11,396 Speaker 1: into cities where, you know, to be fair to Uber, 209 00:12:11,436 --> 00:12:16,476 Speaker 1: it seemed like the taxi cab cartels had the regulators 210 00:12:16,556 --> 00:12:18,156 Speaker 1: kind of in their pockets in a lot of places. 211 00:12:18,196 --> 00:12:20,956 Speaker 1: So in a way, being not empathetic to the regulators, 212 00:12:22,036 --> 00:12:23,156 Speaker 1: there was a logic to it. 213 00:12:23,116 --> 00:12:28,236 Speaker 2: Absolutely. And so the question is: is what got 214 00:12:28,276 --> 00:12:32,156 Speaker 2: you here what's going to take you there? And they 215 00:12:32,196 --> 00:12:35,756 Speaker 2: were magnificent at not being empathetic to the regulators. And 216 00:12:35,756 --> 00:12:39,636 Speaker 2: that spilled over to many aspects of the organization. 217 00:12:40,276 --> 00:12:43,196 Speaker 1: So you've identified now, in more detail, okay, this is 218 00:12:43,236 --> 00:12:46,036 Speaker 1: what's wrong. How do you fix it? 219 00:12:46,356 --> 00:12:51,356 Speaker 2: Yeah. So, two things. One, we created a curriculum. So 220 00:12:51,436 --> 00:12:55,116 Speaker 2: I got friends of mine from Harvard and we created 221 00:12:55,156 --> 00:12:58,556 Speaker 2: a remote curriculum, half of it leadership, half of it 222 00:12:58,556 --> 00:13:02,036 Speaker 2: strategy, to teach people how to do things. And it 223 00:13:02,076 --> 00:13:05,076 Speaker 2: was for the three thousand managers. We thought, you know, 224 00:13:05,756 --> 00:13:10,996 Speaker 2: maybe five hundred people would show up. People took it.
225 00:13:11,396 --> 00:13:14,476 Speaker 1: Six thousand, out of three thousand managers, so all of 226 00:13:14,516 --> 00:13:17,436 Speaker 1: the managers basically, and a lot of other people. Yes. Okay, 227 00:13:17,556 --> 00:13:20,956 Speaker 1: that's interesting. Like, it seems like the problem is everybody 228 00:13:21,076 --> 00:13:26,436 Speaker 1: was being mean, essentially, to each other. How do, like, 229 00:13:26,676 --> 00:13:29,516 Speaker 1: online courses actually get people to not be mean? 230 00:13:29,716 --> 00:13:29,796 Speaker 2: Like? 231 00:13:29,836 --> 00:13:32,316 Speaker 1: It seems weird to me. Maybe I've just had bad 232 00:13:32,356 --> 00:13:34,276 Speaker 1: online courses in my work life. 233 00:13:34,716 --> 00:13:36,276 Speaker 3: Well, let me, let me, 234 00:13:36,316 --> 00:13:37,716 Speaker 3: let me add a little context, which I think is 235 00:13:37,796 --> 00:13:40,796 Speaker 3: material here and in a lot of the organizations we work in. 236 00:13:42,476 --> 00:13:45,476 Speaker 3: It was very clear to the organization that plan A 237 00:13:46,156 --> 00:13:51,116 Speaker 3: wasn't working, yeah, and that they had not collectively built 238 00:13:51,156 --> 00:13:55,276 Speaker 3: a culture where people were thriving. And so the motivation 239 00:13:55,796 --> 00:13:59,276 Speaker 3: for behavior change... it's partly why we like, we like 240 00:13:59,356 --> 00:14:02,116 Speaker 3: working with organizations that are at this point in their 241 00:14:02,156 --> 00:14:04,436 Speaker 3: life cycle, where there's some kind of crisis, because people are 242 00:14:04,436 --> 00:14:08,756 Speaker 3: really leaning into changing and are very in touch with 243 00:14:08,796 --> 00:14:09,956 Speaker 3: the potential payoff. 244 00:14:10,316 --> 00:14:12,796 Speaker 1: Huh.
Is there a particular time you remember, when 245 00:14:12,836 --> 00:14:15,316 Speaker 1: you were talking to Frances about what was happening? Like, 246 00:14:15,476 --> 00:14:18,196 Speaker 1: any specific conversations come to mind? 247 00:14:18,756 --> 00:14:21,636 Speaker 3: Yeah. Frances had... do you remember the robot we had 248 00:14:21,636 --> 00:14:24,356 Speaker 3: experimented with? It was actually called the Wanatron. It's a 249 00:14:24,556 --> 00:14:28,756 Speaker 3: difficult product name, but the Wanatron was the height of 250 00:14:29,356 --> 00:14:32,676 Speaker 3: a human sitting in a chair, essentially, so it would 251 00:14:32,676 --> 00:14:34,876 Speaker 3: wheel up to the dinner table. Frances would join us 252 00:14:34,916 --> 00:14:36,116 Speaker 3: for dinner, and then we would... 253 00:14:37,876 --> 00:14:40,036 Speaker 1: And it had a little, like, an iPad 254 00:14:40,076 --> 00:14:41,436 Speaker 1: for a face or something. 255 00:14:41,076 --> 00:14:45,236 Speaker 3: Exactly, so Frances could be there. 256 00:14:46,196 --> 00:14:51,556 Speaker 2: Yeah, it would take maybe three minutes for you and 257 00:14:51,596 --> 00:14:53,996 Speaker 2: the boys to stop interacting with it like a robot, and 258 00:14:53,996 --> 00:14:55,276 Speaker 2: it was as if I was there. 259 00:14:55,396 --> 00:15:00,516 Speaker 3: Yeah. I do remember coming home, you know, a little 260 00:15:00,556 --> 00:15:03,756 Speaker 3: bit late at night, keeping the babysitter a little late, 261 00:15:04,476 --> 00:15:08,996 Speaker 3: cat's away, Jacob, you know, uh, and pulling into the driveway, 262 00:15:09,236 --> 00:15:12,956 Speaker 3: the Wanatron was just waiting in the window. 263 00:15:15,076 --> 00:15:17,796 Speaker 3: But you were super excited. I mean, I can, 264 00:15:17,956 --> 00:15:23,076 Speaker 3: I can remember the image.
You were super excited and 265 00:15:23,236 --> 00:15:25,276 Speaker 3: you had just finished a class and it had gone 266 00:15:25,996 --> 00:15:29,076 Speaker 3: really well, and you can tell, I mean, you can 267 00:15:29,156 --> 00:15:32,556 Speaker 3: tell in the classroom when there's the energy of 268 00:15:32,636 --> 00:15:36,076 Speaker 3: engagement, and that can get really infectious, even in these 269 00:15:36,796 --> 00:15:42,396 Speaker 3: digital classrooms. And it had worked, and you basically were like, 270 00:15:42,436 --> 00:15:44,516 Speaker 3: you know, honey, I think we're onto something. 271 00:15:45,316 --> 00:15:47,796 Speaker 2: That's when I knew it was... it was fixable. 272 00:15:48,356 --> 00:15:53,116 Speaker 1: So at some point Travis Kalanick gets, I don't know 273 00:15:53,116 --> 00:15:55,476 Speaker 1: if he was fired, pushed out. He definitely gets pushed 274 00:15:55,476 --> 00:15:56,436 Speaker 1: out of the company. 275 00:15:56,196 --> 00:15:56,756 Speaker 2: Nine days in. 276 00:15:57,356 --> 00:16:00,556 Speaker 1: So say that again. Nine days in, nine days after 277 00:16:00,596 --> 00:16:02,956 Speaker 1: you took the full time job he sweet-talked 278 00:16:02,956 --> 00:16:04,396 Speaker 1: you into, and then he was gone. 279 00:16:04,716 --> 00:16:07,316 Speaker 2: It wasn't his idea to leave. No, it was, it 280 00:16:07,396 --> 00:16:12,796 Speaker 2: was really all of a sudden, and a surprise. And he 281 00:16:12,876 --> 00:16:16,796 Speaker 2: said to me, he said, you know, in my absence, 282 00:16:17,436 --> 00:16:20,396 Speaker 2: please act in the best... because I was like, 283 00:16:20,396 --> 00:16:21,516 Speaker 2: what can I do to be helpful to you? And 284 00:16:21,596 --> 00:16:23,156 Speaker 2: he said, you can be helpful to me by doing 285 00:16:23,236 --> 00:16:26,396 Speaker 2: things in the best interest of Uber. And so that's 286 00:16:26,436 --> 00:16:26,796 Speaker 2: what I did.
287 00:16:28,276 --> 00:16:31,156 Speaker 1: So let's talk about what else you did. There's 288 00:16:31,196 --> 00:16:35,156 Speaker 1: the course. There was one detail that I read about 289 00:16:35,756 --> 00:16:38,236 Speaker 1: that was really interesting to me, and it was about 290 00:16:38,836 --> 00:16:42,996 Speaker 1: how people behaved in meetings. Yeah, tell me about that. 291 00:16:43,436 --> 00:16:47,516 Speaker 2: Yeah. And I think this was illustrative of the large 292 00:16:47,556 --> 00:16:51,076 Speaker 2: empathy wobble, we would say, at the organization. So in 293 00:16:51,116 --> 00:16:55,636 Speaker 2: the senior team, it was not a safe place. And 294 00:16:55,716 --> 00:16:59,116 Speaker 2: by that I mean when people were speaking, they were, 295 00:16:59,196 --> 00:17:02,036 Speaker 2: like, nervous and looking around, and I... 296 00:17:01,956 --> 00:17:05,436 Speaker 1: This couldn't have been, like, C-suite-level people, even in the C-suite? 297 00:17:05,876 --> 00:17:07,956 Speaker 2: No, no, no, even in the C-suite. Then what I 298 00:17:07,996 --> 00:17:13,436 Speaker 2: found out is that they were texting one another about 299 00:17:13,476 --> 00:17:15,116 Speaker 2: the person who was speaking. 300 00:17:16,596 --> 00:17:19,116 Speaker 1: In the room? In the room. And they all knew 301 00:17:19,156 --> 00:17:20,196 Speaker 1: that they were doing it? 302 00:17:20,116 --> 00:17:22,596 Speaker 2: And they all knew it, which was creating... like, if 303 00:17:22,636 --> 00:17:25,236 Speaker 2: you want to create an environment that's not safe, just 304 00:17:25,276 --> 00:17:28,796 Speaker 2: start doing that. That's guaranteed to make it not safe. 305 00:17:29,156 --> 00:17:33,356 Speaker 3: That, it was shocking, was super common, you came to discover. 306 00:17:33,156 --> 00:17:34,156 Speaker 1: Yeah, super common. 307 00:17:34,156 --> 00:17:35,716 Speaker 3: Wait, and super tech industry.
308 00:17:37,036 --> 00:17:40,156 Speaker 1: Everybody does that, and everybody knows everybody's doing that. 309 00:17:40,596 --> 00:17:43,436 Speaker 3: I don't think that's an unfair statement. It was, at 310 00:17:43,436 --> 00:17:44,276 Speaker 3: that time... 311 00:17:44,156 --> 00:17:45,316 Speaker 2: At that time, it was. 312 00:17:46,756 --> 00:17:48,756 Speaker 1: That's so weird. It's so weird. 313 00:17:48,796 --> 00:17:50,556 Speaker 2: I have to say, I had exactly the reaction that 314 00:17:50,596 --> 00:17:53,436 Speaker 2: you did. So I set a norm, because I was facilitating 315 00:17:53,476 --> 00:17:55,076 Speaker 2: the senior team, the board asked me to do it 316 00:17:55,116 --> 00:17:58,756 Speaker 2: in between the CEOs, and I said, for our meetings, 317 00:17:58,756 --> 00:18:02,196 Speaker 2: we're gonna have technology off, the way now we 318 00:18:02,276 --> 00:18:02,716 Speaker 2: do that in the Harvard classroom. 319 00:18:02,676 --> 00:18:05,436 Speaker 1: Meaning no phones, no laptops. 320 00:18:05,676 --> 00:18:10,196 Speaker 2: It turned out that when you remove the distraction, we 321 00:18:10,476 --> 00:18:14,276 Speaker 2: actually got an unprecedented amount of work done. Like, we did. 322 00:18:14,476 --> 00:18:17,956 Speaker 2: I don't think Uber ever got more done than 323 00:18:18,036 --> 00:18:24,036 Speaker 2: over that summer, so it started an incredible improvement trajectory. 324 00:18:24,196 --> 00:18:26,516 Speaker 2: So I'll give you another example of something that we did. 325 00:18:26,556 --> 00:18:29,156 Speaker 2: They were not very good at giving each other feedback, 326 00:18:29,156 --> 00:18:32,716 Speaker 2: and I know, as an operations professor, that if I 327 00:18:32,836 --> 00:18:36,356 Speaker 2: give you effective feedback, you will improve at a dramatically 328 00:18:36,396 --> 00:18:41,036 Speaker 2: faster rate.
So my ability to give you feedback is 329 00:18:41,156 --> 00:18:43,836 Speaker 2: actually going to be a great, big influence on your improvement. 330 00:18:44,916 --> 00:18:47,676 Speaker 2: The way they gave feedback often made people worse. So 331 00:18:47,796 --> 00:18:49,756 Speaker 2: not only did it not make them better... Well, once 332 00:18:49,796 --> 00:18:52,756 Speaker 2: we taught them how to do it, not only 333 00:18:52,796 --> 00:18:55,356 Speaker 2: did it help other people improve, but it improved the 334 00:18:55,396 --> 00:18:57,076 Speaker 2: culture. That's what we did. 335 00:18:57,716 --> 00:19:03,356 Speaker 1: Is there an example of helping people give better feedback? 336 00:19:03,596 --> 00:19:03,796 Speaker 2: Oh. 337 00:19:03,996 --> 00:19:06,396 Speaker 1: Yeah, yeah, like one specific thing. 338 00:19:06,716 --> 00:19:11,276 Speaker 2: Sure. So if I want you to improve, the ideal 339 00:19:11,396 --> 00:19:15,876 Speaker 2: ratio of positive reinforcement (that is, do more of this) 340 00:19:16,516 --> 00:19:21,076 Speaker 2: to constructive advice (do it differently)... the ideal ratio is 341 00:19:21,156 --> 00:19:22,956 Speaker 2: five to one. Huh. 342 00:19:23,396 --> 00:19:25,636 Speaker 1: So you say five good things that the person did 343 00:19:25,636 --> 00:19:27,476 Speaker 1: and one thing they could improve on, and... 344 00:19:27,436 --> 00:19:32,996 Speaker 2: At Uber it was zero to ten. And so by 345 00:19:33,036 --> 00:19:38,636 Speaker 2: doing that, it unleashed all of this amazing improvement, and 346 00:19:39,476 --> 00:19:42,516 Speaker 2: we were catching people doing things right in sincere and 347 00:19:42,556 --> 00:19:45,116 Speaker 2: specific ways. It can't be performative, ever, or you lose 348 00:19:45,156 --> 00:19:50,196 Speaker 2: that authenticity thing. So the culture improved and the improvement improved, 349 00:19:50,236 --> 00:19:52,276 Speaker 2: and this happens overnight.
350 00:19:52,996 --> 00:19:57,356 Speaker 1: Was there a moment when you felt like you had 351 00:19:57,396 --> 00:19:58,916 Speaker 1: done it? Was there a moment when you took off 352 00:19:58,956 --> 00:20:01,196 Speaker 1: the shirt? 353 00:20:01,276 --> 00:20:04,916 Speaker 2: Yeah, I took off the shirt nine months in. 354 00:20:05,276 --> 00:20:09,636 Speaker 1: And was there something? What was the 355 00:20:09,716 --> 00:20:12,556 Speaker 1: last thing? Why didn't you 356 00:20:12,556 --> 00:20:14,076 Speaker 1: take it off at eight months? What was the last thing 357 00:20:14,076 --> 00:20:15,956 Speaker 1: you had to do to take off the shirt? 358 00:20:16,956 --> 00:20:17,156 Speaker 2: Yeah. 359 00:20:17,156 --> 00:20:19,556 Speaker 3: I mean the thing that was most gratifying to you, 360 00:20:20,716 --> 00:20:24,636 Speaker 3: if I recall, is seeing Uber T-shirts 361 00:20:24,756 --> 00:20:26,876 Speaker 3: everywhere in San Francisco. 362 00:20:26,396 --> 00:20:30,996 Speaker 2: Everywhere, everywhere you just walked, and it would have 363 00:20:31,076 --> 00:20:34,836 Speaker 2: to be like a nighttime horror story to tell people 364 00:20:34,836 --> 00:20:37,236 Speaker 2: what it used to be like, because you couldn't imagine 365 00:20:37,236 --> 00:20:38,436 Speaker 2: it being like that anymore. 366 00:20:38,716 --> 00:20:40,516 Speaker 1: You didn't need to wear the shirt anymore because everybody 367 00:20:40,516 --> 00:20:41,236 Speaker 1: else was wearing it. 368 00:20:41,036 --> 00:20:43,596 Speaker 2: Everybody else did, everyone. And that, by the way, is 369 00:20:43,596 --> 00:20:45,796 Speaker 2: when I stopped working there full time, because I also 370 00:20:45,796 --> 00:20:47,876 Speaker 2: didn't need to be there full time anymore.
371 00:20:48,076 --> 00:20:50,116 Speaker 1: And what'd you think when, what did you think when 372 00:20:50,156 --> 00:20:54,636 Speaker 1: Francis first took off the shirt? 373 00:20:54,676 --> 00:20:56,036 Speaker 2: There was, Jacob, there was great relief. 374 00:20:57,476 --> 00:21:01,436 Speaker 3: As her spouse, and as her collaborator, which I think it 375 00:21:01,476 --> 00:21:03,636 Speaker 3: was by the end of this story that you had 376 00:21:03,676 --> 00:21:06,156 Speaker 3: sold me, and about midway through that you were really 377 00:21:06,196 --> 00:21:08,956 Speaker 3: putting the pressure on: we should go and do this together. 378 00:21:10,276 --> 00:21:11,196 Speaker 2: I was intrigued. 379 00:21:11,836 --> 00:21:14,116 Speaker 1: Tell me more about that. So you're saying this story, 380 00:21:14,956 --> 00:21:17,756 Speaker 1: when Francis was working at Uber, was when you 381 00:21:17,876 --> 00:21:20,596 Speaker 1: really decided that you and Francis should go work together 382 00:21:21,356 --> 00:21:22,116 Speaker 1: as consultants? 383 00:21:22,236 --> 00:21:27,316 Speaker 3: Yeah. Well, we started a company that was very informed 384 00:21:27,316 --> 00:21:31,636 Speaker 3: by this experience, that was focused on removing barriers to 385 00:21:31,716 --> 00:21:36,596 Speaker 3: impact and advancement for people, particularly in tech, and the 386 00:21:36,716 --> 00:21:40,916 Speaker 3: mission has expanded since then. And particularly 387 00:21:40,396 --> 00:21:43,516 Speaker 2: for women, people of color, LGBTQ people, the people for 388 00:21:43,596 --> 00:21:47,396 Speaker 2: whom the harms were most likely to be done.
What 389 00:21:47,436 --> 00:21:49,476 Speaker 2: we decided to do is, instead of going and helping 390 00:21:49,516 --> 00:21:52,236 Speaker 2: one company at a time, we started a company where 391 00:21:52,236 --> 00:21:55,996 Speaker 2: we could bring people in and help many companies at once, 392 00:21:56,396 --> 00:22:01,996 Speaker 2: and that exists to this day, all focusing on the education part, 393 00:22:02,076 --> 00:22:10,636 Speaker 2: and it has been enormously successful in unleashing individuals and organizations. 394 00:22:12,716 --> 00:22:15,036 Speaker 1: We'll be back in a minute with the lightning round. 395 00:22:22,156 --> 00:22:24,276 Speaker 1: That's the end of the ads, now we're going back 396 00:22:24,276 --> 00:22:26,316 Speaker 1: to the show. Let's do a lightning round. 397 00:22:26,916 --> 00:22:27,516 Speaker 2: Let's do it. 398 00:22:27,836 --> 00:22:32,356 Speaker 1: Okay, what's one tip for teaching your kid how to fish? 399 00:22:32,436 --> 00:22:35,916 Speaker 2: If the birds aren't there, come back another day. 400 00:22:36,556 --> 00:22:40,516 Speaker 1: A good one, because the birds eat the fish, and 401 00:22:40,556 --> 00:22:43,196 Speaker 1: if there's no fish, there won't be birds. Fish where 402 00:22:43,196 --> 00:22:45,116 Speaker 1: there are fish, a fisherman told me. 403 00:22:44,956 --> 00:22:46,996 Speaker 2: Fish where there are fish. 404 00:22:47,316 --> 00:22:50,276 Speaker 1: I know you talk on your show about the lesbian 405 00:22:50,356 --> 00:22:53,996 Speaker 1: can-do attitude, which is an attitude that I would 406 00:22:53,996 --> 00:22:59,356 Speaker 1: love to have. So, like, tell me one thing I 407 00:22:59,516 --> 00:23:03,436 Speaker 1: need to do to have the lesbian can-do attitude. 408 00:23:04,156 --> 00:23:08,316 Speaker 3: Yeah. Well, it's our affectionate way of saying 409 00:23:08,356 --> 00:23:11,756 Speaker 3: getting in touch with the fixer inside you.
410 00:23:13,276 --> 00:23:14,716 Speaker 2: And I think it's 411 00:23:14,596 --> 00:23:18,876 Speaker 3: fundamentally about mindset. So it's getting in touch with the 412 00:23:18,916 --> 00:23:20,636 Speaker 3: agency you have to solve problems. 413 00:23:20,756 --> 00:23:24,156 Speaker 2: That's it. That's your inner lesbian. We all have one. 414 00:23:24,196 --> 00:23:26,396 Speaker 2: And you know, this is interesting. I don't know if 415 00:23:26,436 --> 00:23:29,236 Speaker 2: I've ever taught anyone how to have can-do lesbian spirit. 416 00:23:29,276 --> 00:23:31,996 Speaker 2: What I do is encourage people to surround themselves with 417 00:23:32,116 --> 00:23:37,236 Speaker 2: people that have this inner spirit, so it's quite infectious. 418 00:23:37,356 --> 00:23:39,996 Speaker 2: I think that's the answer, which 419 00:23:40,036 --> 00:23:43,396 Speaker 2: is: be around people who, in the presence of a 420 00:23:43,556 --> 00:23:48,596 Speaker 2: problem, walk towards it. They neither freeze nor do they 421 00:23:48,636 --> 00:23:49,156 Speaker 2: walk away. 422 00:23:49,636 --> 00:23:51,516 Speaker 1: What's the best thing about working with your spouse? 423 00:23:52,556 --> 00:23:58,676 Speaker 3: I get to know her more every day, and I 424 00:23:58,716 --> 00:24:05,916 Speaker 3: fall in love with her more every day. 425 00:24:06,276 --> 00:24:09,036 Speaker 1: Francis Frye and Anne Morris are the co founders of 426 00:24:09,196 --> 00:24:12,596 Speaker 1: the Leadership Consortium and the hosts of the Fixable podcast. 427 00:24:13,076 --> 00:24:14,836 Speaker 1: If you want a chance to be on their show, 428 00:24:15,116 --> 00:24:20,036 Speaker 1: call their hotline at two three four Fixable.
That's two 429 00:24:20,036 --> 00:24:24,316 Speaker 1: three four, three four nine, two two five three, and 430 00:24:24,356 --> 00:24:27,356 Speaker 1: you can leave a voicemail for Anne and Francis with your 431 00:24:27,396 --> 00:24:33,316 Speaker 1: workplace problem. Today's show was produced by Edith Russolo, edited 432 00:24:33,316 --> 00:24:36,916 Speaker 1: by Karen ch Kergie, and engineered by Amanda k. Wall. 433 00:24:38,556 --> 00:24:43,316 Speaker 1: Special thanks today to Isabelle Carter, Constanza Gayardo, and Sarah Yis. 434 00:24:44,116 --> 00:24:47,356 Speaker 1: You can email us at problem at Pushkin dot fm. 435 00:24:47,796 --> 00:24:50,516 Speaker 1: I'm Jacob Goldstein and we will be back next week 436 00:24:50,556 --> 00:24:57,196 Speaker 1: with another episode of What's Your Problem?