Speaker 1: Welcome to Fear and Greed Q and A, where we ask and answer questions about business, investing, economics, politics and more. I'm Sean Aylmer. Healthcare is a multi-trillion-dollar industry, and many argue it's overdue for disruption. My guest today is entrepreneur Mark Britt, who made his name as the founder of streaming platform iflix, which was then acquired by Tencent. He's now the founder and CEO of preventative health company Tomorrow. That's TMRW, Tomorrow, which is a great supporter of this podcast. Mark, welcome to Fear and Greed Q and A.

Speaker 2: Thanks for having me.

Speaker 1: The elevator pitch on Tomorrow: who are you? What do you do?

Speaker 2: We're a next generation health company. You know, in Australia we're very lucky to have a public healthcare system. But it's built for when I'm sick, when my kids are sick on a Sunday night and I can rush them down for public care. It's fantastic if they need antibiotics. What it doesn't serve is when we're not sick but not quite well. And so what we've spent the last two or three years doing is building a next generation healthcare system from the ground up, one that provides continuous, always-on healthcare to make you feel amazing all the time.

Speaker 1: So what is that, then?

Speaker 2: Yeah, so at the core we think it's fundamentally about how you create better tomorrows, right? The universal human need is: I want tomorrow to be better than yesterday. What does that take? It takes the right habits, the right lifestyle, the right mindset, the right food, the right diet, personalized medication, personalized supplements. And so what we've built is a system that allows you to assess your health on an ongoing basis, every quarter, with thousands of biomarkers; get a plan with a team that's built around you, doctors, clinicians, DNA specialists, nutritionists; and then get personalized treatments that will actually deliver your better tomorrow.

Speaker 1: Okay, so I go to the doctor once a year.
Basically I have bloods done, a test, and after that mostly I don't hear anything until the next time around. You're talking about once a quarter. So what does that mean?

Speaker 2: Yeah, so let me give you the difference. When you go to see a doctor, you'll get, say, fifteen to thirty biomarkers potentially, and he's going to tell you what's going on in your body right now. Now, we do comprehensive blood tests, but we may do a hundred, a hundred and fifty biomarkers. We also do what's called epigenetic testing, and that's seventeen hundred biomarkers that will give you access to your epigenetic age. We look inside your cells: what's your pace of aging, how quickly is your body aging right now based on how you're living your life, what are the individual organ ages of the eleven different systems in the body, and which are the biomarkers individually that are creating disease risk? And then we give you supplements that specifically target those biomarkers. So in my case today, I have a little Nespresso capsule. It has thirty-seven ingredients in it, every single day, that are optimized for my biomarkers. And as I get healthier, and my pace of aging improves and my epigenetic age improves and my disease risk comes down, we adapt that supplementation and medication as we go.

Speaker 1: Okay. Most of this isn't breakthrough medical technology, it's just using better what we know. That's a question, not a statement.

Speaker 2: Yeah, look, I'm really passionate about the fact that actually what we don't want is brand new, cutting edge medicine, right? What we want to do is take leading, proven medicine, proven science. Epigenetics, for example, has been coming out of Harvard, Yale, Dunedin, the University of Otago.

Speaker 1: Define epigenetics.

Speaker 2: So epigenetics is basically looking at what's going on inside the cell, right? It's actually the way your genes express themselves.
And for example, in my case, in my epigenetic case: I've never smoked in my life, but my cells have damage from passive smoking when I was a kid. I have exposure to forever chemicals in my body that's in the top five percent of the community. Why? Because we had Teflon given to us when we got married and I'd been cooking with it all that time, right? So I can now live my life with that knowledge. So that's what epigenetics does. Think about it like the difference between the climate, what's the whole system doing, versus your bloods, which are almost like the weather: is it raining outside in Pyrmont today? When you put those two things together, you get a perspective on people's health that was never possible before. Now, this science has been around for twenty years. The difference is, it's what a longevity doctor would do for you for twenty thousand dollars, and what we've been working on for the last few years is to give it to you for the same price as a gym membership, to get it to a price that absolutely everybody can afford as part of their life.

Speaker 1: Why is health still so expensive when we have technology breakthroughs and great data-sharing abilities? Is it a permissions thing from individuals? Is it doctors kind of trying to keep it all to themselves? What you're talking about makes a lot of sense, and whilst it is a breakthrough, it's not exactly brain science, right? It's actually just doing stuff better. Why haven't we done it yet?

Speaker 2: Well, it's interesting. Every data point you look at says there's a thirst for preventative care.

Speaker 1: Right.

Speaker 2: And so one of the things we've been sharing this week is a new study with YouGov. It's the first study that's ever been done of Australian consumers on how they're using cutting edge technology in healthcare, and particularly AI.
And I'll tell you something interesting, right? Eighty percent of Australians actually have trust and confidence in the system, which is good. But almost half of them leave every medical appointment confused. Forty-four percent of them in the last year have gone on AI: they've put their confidential data into public models in the US and asked for medical advice. Two thirds of parents are seeking medical advice that way. And so this study has actually been a bit of a wake-up call to the industry, right? People are looking for something better. They're trying to get access to information, they're overwhelmed with the data, and they want support through that process.

Speaker 1: Okay, so tomorrow I can do it through you, right? But I mean, is your plan global domination here? I mean, for this to work, it would actually be great if everyone was doing this.

Speaker 2: That's right.

Speaker 1: Can that happen over time?

Speaker 2: Absolutely, absolutely. And so what it tells you is people want to take control of their own health, they want access to better information, et cetera. And so I think over the next few years, what AI is doing is putting power in people's hands that they've never had before. Now, the question is how you do that under the proper supervision of a doctor, right? Most people don't want their doctor replaced by AI. And when you look at the data, over nine million people say what they want is a medical system that is enhanced by AI, using the latest technology. Now, that's super hard in a public system, right? Public systems are built to provide safety and stability and security. And I think in Australia we have one of the best healthcare systems there is, but it doesn't help me be optimized. And so I think over time we're going to see two different models. We're going to see a sick-care system for when I'm ill, when I'm in the later stages of life, and that will continue.
And I think increasingly we're going to see people, in private hands, actually taking steps to take control of their own health. But the difference is, that only used to be accessible to people with a great bank balance. What technology is now doing is making it accessible to everybody.

Speaker 1: We talked about AI, and you've mentioned AI a few times. People using AI. I mean, I'm using AI to actually diagnose myself with all sorts of illnesses. How prevalent is that? And I'm sure there's a good side to that and a bad side to that.

Speaker 2: Yeah. So I think now, for the first time, we have the data: it's almost half of Australians. And this is a technology that's only two or three years old, so that number is only going to continue to increase. But you know, in the Internet world we used to call it Doctor Google, right? Eighty percent of people consult Doctor Google. What's now happening is, you can almost think about a shadow AI doctor. Now, that's actually not a good thing, because the AI models will hallucinate, they'll make things up, and they will pretend to diagnose. In Australia, we have a regulatory regime where AI should not be diagnosing people, right? So what people are telling us, what customers are telling us, is they want something better. They want a more modern healthcare system that embraces technology in the delivery of the service, helps me understand my data, helps me be in control. And that's what we're building at Tomorrow.

Speaker 1: So you mentioned democratization, effectively. What's the role of government in this?

Speaker 2: I think there's two questions for government, actually. The biggest one, I think, and I'm almost talking more broadly than healthcare here: I think AI is profoundly changing the way that we get work done, so quickly that it is almost incomprehensible.
So as a business leader, the way we're building organizations, the way we're training our staff, the way we're running compliance, it's profoundly different than it was six months ago. An incredibly difficult challenge, then, is how do you as a government keep up with regulation and policy? And I don't have good answers to that, other than it's going to be a continued race, right? We need to do two things. We need to make sure we're supporting the innovation, the companies and the entrepreneurs to continue to ethically and appropriately push the boundaries on getting customers the latest in technology. And we've got to be super careful about sovereignty. We've got to be very careful about billionaires in the US standing up and encouraging people to put their blood data into their public model. That, for me, is very scary. So there's a real sovereignty issue around how you protect Australian consumers in the Australian environment. But we've seen the government do a good job of this before. They've done it in online gambling, they've done it with the under-sixteens ban, and what they're showing is that government can actually respond really well to technological change. I think they're getting better at it, and I do think health is going to be one of the next fronts for that.

Speaker 1: Why are you worried about sharing of data, blood types for example?

Speaker 2: I still deeply care about privacy. I think it matters, and I want my children to grow up in a world where privacy matters. And unfortunately, when you put your blood results into a public chat model in the US, you have absolutely no control over where that data goes. Now, wait until it's not just your blood results, but it's also your DNA, and it's all your medical history, and it's your personal preferences.
And I think we're going to hit a point around data sovereignty, and a conversation as a country over the next five to ten years, where I'm hoping this becomes a very important public policy conversation.

Speaker 1: Okay, so what's Tomorrow's ultimate goal? Is it longevity? If I think of society, I think longevity is about the only thing by which you can measure whether society is going forward or not. I've thought a lot about that over many years. Is it about longevity?

Speaker 2: So I hate the idea of longevity. Every time I think about longevity, I end up in a debate with myself about whether I want to be immortal, and I'm not quite sure how that resolves. I do think, though, there is a question that for every day of my life, I want to be healthier. I want to be better. I want to be living my best life. I want to have more energy. I want to pick the kids up. I want to be able to take my dog for a run in ten years' time, right? And we call that not lifespan but health span. Now, the challenge in the last generation is that we're living longer, but we're living sicker. So chronic conditions are increasing. The majority of deaths in Australia are from chronic conditions, eighty percent of which are avoidable through lifestyle and diet changes. But we don't have a system that avoids that. The majority of healthcare costs go into the last three months of life. So what's our goal? Our goal is to give everybody a personal plan to create a better tomorrow for them, whatever that means, right? And actually extend what we call health span: more better days.

Speaker 1: So, people who are interested, how do they, what do they search, AI or Google?

Speaker 2: So the first thing is, no matter what, take control of your own health, whether you do it with us or do it with anyone else.
What changed my life, right, when I was very sick, was actually getting doctors, getting control of my own data, and actually getting my own plan in terms of how to shift my biomarkers. It profoundly changed my life. Or come to startmytomorrow.com and come and join the service.

Speaker 1: Tell me, you previously founded streaming platform iflix. Which is better, health or entertainment?

Speaker 2: It is one of the greatest joys, being able to wake up every day and have the core purpose of your company be to help people. And I don't ever want to do anything else.

Speaker 1: Fair enough. Mark, thanks for talking to Fear and Greed.

Speaker 2: Great, thank you.

Speaker 1: That was Mark Britt, founder and CEO of Tomorrow. That's TMRW, a great supporter of this podcast. I'm Sean Aylmer, and this is Fear and Greed Q and A.