A few years ago, Anne Wojcicki started a company called 23andMe. That company provides anyone who uses the firm's services with information about their own genetic code. With that information, people can determine their own ancestry, and they can also determine whether they have potential genetic health problems in their future. I sat down with Anne at her home in Palo Alto to discuss the future of 23andMe and how it can help solve genetic health care problems.

Interviewer: So for those people who have not yet done a 23andMe test kit, tell us what 23andMe actually is.

Anne Wojcicki: 23andMe is a company that allows you to see your own DNA, but it also helps you understand your genetic health risks: whether you're potentially predisposed to something like type 2 diabetes or atrial fibrillation. It also helps you understand your ancestry: where in the world are you from, and how are you connected to other people? And we make it social. One of the big differences with 23andMe is that it's a social platform, so you and I could share our data together; we could potentially be relatives. You can find new relatives on 23andMe, and you can also share some of your health information.

Interviewer: So when people take this test, and I should say I took it recently, I haven't gotten the results back, and I'm nervous about whether I'm going to get good or bad results, all you do is spit into a little tube and mail it back to you. How do you get all this information out of somebody's saliva?

Wojcicki: It's amazing that it's just a little tube of spit. So you spit in the tube; it actually takes about ten minutes to spit. You have cells from your cheek, cells that come into the saliva, and you send it back to us. We take part of that sample.
We extract the DNA and we run it on a gene chip. It's a technology that Illumina has made, and thanks to that it's a very affordable test: it's under two hundred dollars if you want to get the health and the ancestry. It takes a couple of weeks for us to turn it around, and then you have your genome. And what 23andMe has really done is build all of the tools, all the interactive software tools, so that you can navigate and explore what the genome means for you.

Interviewer: So when you give the results back to people, what are they typically most surprised to learn?

Wojcicki: We're first and foremost really careful about returning health results, and so we spend a lot of time thinking about: okay, if I'm going to tell you you're at genetically high risk for Alzheimer's, how are you going to respond to that? We have all kinds of additional layers of consent, and we have physician videos around it. But I would say the biggest surprise for people is that they find out they have additional family members they did not know about, or they find out that the family members they have are not necessarily genetically related to them. So I would say the family connections have frankly become the biggest surprise for me. The other big question is: what is the definition of race? A lot of people find out that they have African ancestry or Jewish ancestry, but they never knew they had either of those. So then what really is the definition of being Jewish, or what's the definition of being African American?
Interviewer: James Watson, the co-discoverer of the structure of DNA, once told me that he had his genome mapped early on, one of the first people to do so, but he said he didn't want to know if he had a predisposition to Alzheimer's. And I wonder whether a lot of people have that view, which is: I'd like to know what my genes are, but don't tell me if I have a predisposition to Alzheimer's, because there's not much I can do about it. Is that right?

Wojcicki: I would say you actually have sort of two camps, and both camps feel very strongly. You have a group of people who very much want to know everything about themselves. A report like the Alzheimer's one doesn't necessarily have clinical utility, in the sense that you're going to have a treatment or change anything, but it has high personal utility, meaning those people might decide to create a will, or retire earlier. And one of the main things we see people do is go and buy long-term care insurance. So what we see is that they take the personal utility and are actually proactive with that information.

Interviewer: Now, in the consumer world, what people like is repeat customers. And so if I did, as I just did, take a 23andMe test, once I get the results, how do you monetize me in the future? Are there different things you can sell to me? Because that's what companies usually like to do: go back to their customers and sell them something else.

Wojcicki: You know, if I ever do a retrospective on 23andMe, was it a mistake not to have a subscription product earlier? We do today have a subscription product; we're in the early stages of it. The original idea with 23andMe was that we'd get you in in the simplest way possible, because we didn't want to overly confuse the messaging with a subscription or not.
But it was just so unknown: why do you even want your genetic information? So the idea was that I needed to seed the market with my early adopters, show what this information is, and make it as simple as possible. But what I've seen over the last sixteen years is that my customers are really engaged. For example, over eighty-five percent of our customers consent to research; really high numbers of people opt in. And over fifty percent of my customers come back on a quarterly basis.

Interviewer: So if I subscribe to one of your subscription services and I have a predisposition to disease A or B, you're likely to send me information on that from time to time. Is that what you mean?

Wojcicki: We'll potentially update that report, or we'll potentially send you additional ones. But the main thing we did is recently acquire a company called Lemonaid Health, and the reason we acquired Lemonaid Health was twofold, mostly to answer the question you just asked: you find out you're at higher risk; well, now what can you do? So first and foremost, we'll be able to have medical protocols for how we actually deliver genomics-based care. No one has adopted genomic medicine, so how can I adopt genomics-based care in a primary care setting and really think about how I engage you, how I help you be healthy? First and foremost, we're going to solve that question for you. Second is really the engagement side. You find out you're at higher risk for heart disease; well, an article just came out about caffeine consumption and heart disease, or sleep and heart disease, and you want to be updated. So the company will start to update people to say: hey, here's the latest about what's happening; here are things that are in your hands that you should actually start to think about.

Interviewer: Now, you have a fairly distinguished family.
I'd like to talk about it for a moment; it's been written about. I recall your father was a physics professor and the head of the physics department at Stanford. Your mother is a distinguished educator and wrote a book about how to raise accomplished children.

Wojcicki: She sure did.

Interviewer: One of your sisters is an epidemiologist and professor at the University of California. Another sister runs YouTube. And then you are the founder of 23andMe. So what was it like growing up on the Stanford campus with all this talent around you?

Wojcicki: I think the best thing about how we grew up is that we were surrounded by an academic culture of people who really prioritized education and learning, but also really encouraged people to pursue their passions. What I love about all my sisters is that, while Susan and I happen to be in higher-profile roles, everyone is pursuing something that they really love and really enjoy doing. So I think the best thing we all got from our parents was to pursue your passions, no matter how esoteric, whether or not they're monetizable, or whether it's an academic kind of pursuit.

Interviewer: So you grew up on the Stanford campus, and I assume you and your sisters were all-A students. Where did you want to go to college?

Wojcicki: You know, I kind of debated. My father went to Harvard; the reality is, a lot of it is based on your parents. My father was really lucky: he came from Poland and went to Harvard early. My mom went to Berkeley. My parents always encouraged me. They would say it's important to go, it's important to have an education, it's important to go to a good school. But there are a lot of good schools, and I would say today people overemphasize exactly which school you have to go to.
There are so many great schools, and frankly, the great school finds you. I thought Harvard would be fascinating. My oldest sister went to Harvard, and I used to stay with her there; I was fascinated by people who wore tweed and had leather patches. But I ended up deciding not to go to Harvard. I went to Yale for a weekend and I fell in love with Yale. I'm a Yalie; I just loved it.

Interviewer: Okay. And what did you study there?

Wojcicki: I studied biology.

Interviewer: And did you want to go to medical school?

Wojcicki: I did. I thought about going to medical school. I loved research, but I can't say that I was an incredibly talented researcher, with that detail and that precision; there were a lot of wonderful postdocs in the lab who helped me get through my research. But I love the science. I love molecular biology; it keeps me up at night that you have this incredible set of processes going on right now in your body.

Interviewer: So you went to Yale, you graduated, and you decided you weren't going to go to medical school. What did you do?

Wojcicki: Well, I joke a lot; I'm on the Kaiser Permanente School of Medicine board, and I sometimes joke that at some point I'll still go to medical school. I mean, it's the most interesting job you can have, I think. But I decided not to, and very randomly I got a job. My resume randomly got sent to the Wallenberg family in Sweden and they offered me a job. I didn't know who they were, and I didn't know what the work was, but I thought it was interesting. It was a one-year opportunity that paid me forty thousand dollars; I thought I would never make that much money again, so I figured I would go and do it for a year.

Interviewer: So where did you do this?

Wojcicki: I worked in New York for almost ten years.
I started off with the Wallenbergs, and then I went to a series of hedge funds afterwards.

Interviewer: Right, so nothing could be higher in life than working for hedge funds, right?

Wojcicki: No, exactly.

Interviewer: How did you feel there was a higher calling than working for a hedge fund?

Wojcicki: At that point, my best friend's father used to call me a leech on society, and I took it fondly. Look, the reality for me is that I was learning; there was no better education. It was like getting a PhD. I went to medical meetings every week, I was speaking with experts every day, and I got to know the health care system so well. But I did feel, after ten years, that I was investing in a system that was fundamentally not the system I wanted to be part of. And then I wanted to do something, because I feel that if you're going to complain about a problem, then you should figure out how to fix it.

Interviewer: So what year did you decide to start 23andMe?

Wojcicki: I started thinking about it a lot in 2005, and it was 2006 that we actually started.

Interviewer: So for people who haven't paid attention to the genome world: in the year 2000, I think it was, the federal government said that two researchers, Francis Collins, working at NIH, and Craig Venter, who had a private company, had basically mapped the human genome. So that happened, and I think in 2000 President Clinton said it was a tie, and they were both given credit for it, leaving aside the intramural fighting that had gone on before. It was then thought: okay, the world is going to be so much better now that the human genome has been mapped; there will be no more diseases, we can fix all diseases. But for the ten or fifteen years afterwards, it didn't seem like much happened.
Wojcicki: That's such an interesting question, and this is also something that plagues me. Francis Collins and President Clinton at the time came out and said that sequencing the human genome is going to transform how we diagnose, treat, and prevent all human disease. That's a huge statement about what's going to happen. And I would argue one of the biggest issues today is that there has not been adoption of genetic technology. Pointing back again to part of my dissatisfaction with the investing world: part of the reason it has not been adopted is that a lot of genetic information is about prevention. So, for instance, if I tell you that you're likely to have an adverse event from this antidepressant, or that you're potentially at higher risk for atrial fibrillation, those are not well-monetized medical events. You can make a lot of money treating atrial fibrillation, but you cannot make money in the absence of a disease. For instance, if I successfully keep you healthy to a hundred, you are not a profit center for the health care world. And so a lot of the potential of genetic information is not realized in the primary care setting, in part because there's no easy monetization path. You're starting to see it more in cancer, and you're starting to see it more in pregnancy-related care, but it has not been widely adopted.

I would say the second thing is that the US is far, far behind countries like the UK. The UK has an incredibly impressive program called the UK Biobank, where half a million people have consented and are participating in a nationwide program, and it's amazing; everyone, including 23andMe, uses the data that comes out of the UK Biobank. The US has totally fallen behind. We have a program called All of Us, which is supposed to enroll a million people, but the UK has already leapfrogged us and said: we have a five-million-person program.
Now that program is going to be running ahead. And in the US there's also been a lot of politics around how this is going to be executed and what the ethics of it are. If we want to really capture the benefit of the human genome, you need something massive; I would say multiple countries coming together with a hundred million people, so you can really understand what the code is.

Interviewer: The US is obsessed not so much with the UK as with China. So is China ahead of us in this area or not?

Wojcicki: Well, China is crushing us too, but we don't know as much about what they're actually doing. I would say the one big worry that I always have is China. China knows how to execute, and they know how to lead on a program. And China has said in no uncertain terms that they are interested in genetics and that they want to be the world leader in this next genetic revolution. So China has made huge efforts. They have huge programs, and they also have something called the Beijing Genomics Institute, which is incredibly well funded by the government: a massive sequencing shop where tons and tons of discoveries are being made. There's nothing in the US or the UK that's even comparable.

Interviewer: As you look at the company you've created and at what you've done with your life, how do you think you're having an impact for the betterment of the world?

Wojcicki: I think in two ways. One, I personally believe there's no more interesting challenge than trying to understand the code of life. And I think that 23andMe is frankly the best-positioned entity, public or private, for-profit or nonprofit; we are by far and away the best-positioned entity to solve the question of what the human genome means and how you can apply it to individuals.
So one, I think we are going to be the lead in deciphering the human genetic code. Second, we transform people's lives; it's what keeps me up at night, and it's also what keeps me going. I tell someone that they're at higher risk for breast cancer, and that leads them to get a mastectomy, and I can potentially prevent an early death. That's amazing. Or I can reconnect family members who have been lost to each other for years. One thing 23andMe does is generate incredibly meaningful information for individuals.

Interviewer: By the way, on my own 23andMe test, I'm waiting and will get the results soon. How can I be certain that nobody will know the results other than me? How can I be sure that you're not selling the data to somebody else?

Wojcicki: We honor the fact that this is your genome and you should do with it what you want. That means I'm not going to prevent you from sharing it; I'm going to give you options. So we ask you: do you want to opt into research? And over eighty-five percent of our customers are now opting in. But it's not a default; we actually ask you, and you have to elect that option. If you wanted to share your genetic information with me, you would have that opportunity. But there are no defaults; nothing is mandatory. It's all about you being in control.

Interviewer: So your company went public not too long ago through a SPAC process; I think it was a SPAC involved with Richard Branson and Virgin. Any regrets about going public, or any drawbacks?

Wojcicki: No. I mean, it's interesting, because it's definitely not the most positive of market times, but I loved the process. Being public, well, the process of going public is a lot of work. And so when we decided, or at least looked at opportunities, to go public, one of the big considerations for me was how much time it was going to take from me.
I'm very much a CEO who likes to operate and likes to be in the weeds: how are we executing? So I wanted to make sure I was minimizing the time the whole process would take, and the SPAC process is incredibly efficient. I also got incredibly lucky with Richard Branson. He's a very high-quality investor with an incredibly educated team, and he's actually invested in health care before: he's run benefits companies, and he's had a care company in the UK. So he knows this space, and he was specifically looking at health care in the consumer space. So I'm happy; the process was smooth, and we ended up in a really successful situation.

Interviewer: But many tech companies have seen their prices go down, and, as I said, you have as well. Does that bother you? Are you saying, well, we'll come back? What is your view generally on the future of tech companies in terms of the valuations they once had?

Wojcicki: I think coming from Wall Street has given me a good perspective on this. Wall Street is not always reflective of true value. And I can also say there's no shortage of opportunities that people have missed where there's a diamond in the rough. I could look at the early days of Google, or the early days of Amazon, where people did not really understand the story, and I would put us in that bucket. A lot of people don't quite understand what we do. They think we're just ancestry, but we also have a biotech side, which we haven't even talked about: how do you translate all this data into drug discovery? So we're a complicated story, and the market conditions are bad.
Interviewer: You know, markets are efficient over time. You're largely seen as a consumer company, and consumer companies have relatively lower P/E multiples; tech companies and biotech companies, when the market comes back, have higher multiples. So can you transform your company into a tech or biotech company by working on disease solutions, as opposed to just giving people their information?

Wojcicki: We are; that's actually one of the beauties of 23andMe. When we started the company, remember, I said it was all about exploring the genome. So one of the things we do is we have this huge research arm: how can I best understand what the human genome means? That's been part of the whole process from the very beginning: how do I consent people for research, engage them in research, and learn about them? And then I have a whole analytics and data science team looking at all that data to see how I can translate it, either as a consumer product back to you or, now, for research and discovery. So I have over a hundred people in the 23andMe therapeutics arm of the business. We have a major collaboration with GSK. We have over fifty programs in development, and we have two actually in clinical trials. So we're a hundred percent, absolutely, thinking about how people will benefit from the human genome; it's about actually translating that information into therapies.

Interviewer: And GSK is a very large health care and medical research company. What are you doing with them specifically?

Wojcicki: Specifically, it's target identification. We're working with GSK to look at the data that 23andMe has generated to see whether we can identify novel drug targets.

Interviewer: So you've built a successful company, but you might not want to run it for the next ten years. Or do you? And if you didn't, would you want to go into government if the president called you to serve our country?
Would you go into government, or would you start another company?

Wojcicki: I love what I do, and I feel like I have this luxury with 23andMe that it keeps evolving. You know, we started out as a consumer company; we're thick in research; we got into drug discovery in 2015; and now we're getting into the delivery of care. So I've been really lucky that 23andMe has been endlessly interesting, and I'm surrounded by incredibly talented, brilliant individuals I learn from. What I always advise people to do is take jobs where you're learning, and if you stop learning in a job, then you should quit. Frankly, when people leave 23andMe, I say: if you're not learning anymore, you should move on. So I'm a hundred percent committed to 23andMe. That said, my other passion is the direct-to-consumer side: I want to build out a true consumer-empowered health care world, one that is outside of the existing system. And I do think there's no greater force than government and the opportunity government offers. So there are always ideas; I could always do a sabbatical.

Interviewer: You have two talented sisters and very accomplished parents. Did your parents ever say, "Well, guess what your sister just did," to push you to work even harder and be even more accomplished? Did they pit you against one another, or did you not have that problem?

Wojcicki: No, my parents are lovely, and my sisters are quick to make me humble; it's part of why I live near my sisters. I had somebody over today, a coffee chat with somebody in the company that we auctioned off, and my sister was laughing. She said, "Someone paid money to eat with you?"
And I was like, thanks. Those are the kinds of reminders I get from the family.

Interviewer: So your mother wrote this book about raising talented children. Do you raise your children the way she raised you?

Wojcicki: Listen, I think each generation improves upon the last. I haven't read the whole book; I read parts of it. But I wrote part of it: I wrote the foreword. And my take there was more to give the reality, the perspective from the child: I know my mother is telling you how she raised us, but let me tell you about my mother. Look, my parents are incredible. I'm really grateful, and there are lots of things I learned from them, but I'm definitely not adopting everything they did.

Interviewer: If your children go to Harvard rather than Yale, will you be okay?

Wojcicki: Let's not talk about that. I mean, my kids are so interesting and so different. I want them to be passionate; that's the only thing I care about. I want them to wake up every day and say, "I love what I'm doing."

Interviewer: And would you be upset if they went into something important, like private equity?

Wojcicki: Are you giving me a job offer?

Thanks for listening. To hear more of my interviews, you can subscribe to and download my podcast on Spotify, Apple, or wherever you listen.