Speaker 1: Pushkin. Tell me what we don't know about how smell works.
Speaker 2: Oh jeez, it'd be shorter to tell you what we do.
Speaker 1: This is Alex Wiltschko. He's the co-founder and CEO of a company called Osmo, and despite his protest there, he did tell me some of the things nobody knows about how smell works.
Speaker 2: Why do things smell the way that they do? Why can we smell certain things and not other things? What is the logic of how molecules are combined to create beautiful smells? Why do some smells create incredibly powerful emotional associations instantly while others seem neutral? Why do some things smell different to different people? I think we have hints in all these directions, but we have nothing like a musical scale, we have nothing like a periodic table. We don't know any structure to why things are the way that they are. It's a ton of mystery, and that's what makes it so exciting to work on this topic: there's so much we don't know.
Speaker 1: And to be clear, with light, if you tell me the frequency, the wavelength, I know exactly what color you're talking about, and the same thing with the waveform of a sound. But if I give you some random molecule and say, what does it smell like, do you know?
Speaker 2: That's what I've spent a lot of my professional life working on, exactly that question, which is: draw the structure of a molecule on a whiteboard, point at it and say, hey, what does this smell like? Wood or flowers or fruit or whatever? There is no way to know that for sure at all. There's no good way even statistically to predict it without using large data sets, and at least in our hands, you need neural networks, you need deep learning, to do that.
Speaker 1: I'm Jacob Goldstein, and this is What's Your Problem?, the show where I talk to people who are trying to make technological progress.
Speaker 1: Alex Wiltschko's problem is this: Can you use AI to teach computers to smell? And once you've figured out how to do that, can you build a profitable business around it? Osmo spun out of Google in twenty twenty-three. The company recently launched a fragrance house to develop new perfumes. They've also done some work using scent to detect counterfeit shoes, and in the long run, they plan to use scent to diagnose disease. Before he started Osmo, Alex worked at Google as an AI researcher. Before that, he got a PhD at Harvard studying how mice respond to scent. But maybe the most important part of his bio came even earlier in his life, specifically when he was twelve years old and went off to summer camp in his home state, Texas.
Speaker 2: I was from a small town, College Station, and most of the kids were from big towns like Houston and Dallas and Austin and San Antonio, and I hadn't really been exposed to, I don't know, fashion trends, or what was cool or popular. But everybody's all lumped together at summer camp. And there was this thing called perfume that some of the richer, frankly, richer and more popular kids had. And it was just amazing to me that these boys could spray themselves with this invisible mist, a clear mist, and then for the next four to six hours people around them would treat them differently. That just blew my mind. I can see the clothes. I can see how they act and walk and talk and how they posture and all that. But I cannot see the fragrance, and yet it is obviously doing something magical.
Speaker 1: It's like an Axe body spray ad. What does that cause you to do?
Speaker 2: When I get home? Beg my parents to buy it, and fail. We shopped at TJ Maxx, and I started to really look out for fragrances there, and it just kind of snowballed from there, where I realized there were a whole lot of these things.
Speaker 2: And guess what, you can just try them, and some of them are actually way better, more opinionated, more beautiful. I didn't have the vocabulary then, but it was clear to me early on that, while I never really thought about who made the clothes, I started to think about who made these perfumes, because it was clear that choices were being made. And I just remember trying, and this is years later, trying Bulgari Black, which really clued me into this world. Bulgari Black is not necessarily a great fragrance, but you can experience the top, middle, and base notes in like forty, forty-five minutes, which is pretty short. A bigger fragrance, like a Creed Aventus, will last on your skin for a day, and so the whole fragrance unfolds. I mean, top notes will last max fifteen, thirty minutes, but the heart might last for several hours, and the base note might last for ten hours. So it smells different.
Speaker 1: You can still smell it, but it smells different an hour after you put it on, and four hours after that.
Speaker 2: Because a great fragrance is actually many different fragrances within it, right? There's the first one, which peels off, burns off, quickly. There's the second fragrance, which is the heart note, which will last for sometimes hours, but in this case, like, another twenty minutes. And then the base note, which is a third fragrance, which is what's left after those two burn off. It's like three acts of a movie. I think it's quite beautiful.
Speaker 1: So how do we get from you being a teenager preoccupied with fragrance to you using AI to predict how molecules will smell?
Speaker 2: Yeah. The computer part was always separate from the fragrance part. I just love computers. We always had computers at home. I started programming around, I don't know, eight years old.
Speaker 2: It was my life. My entire life was computers for a long time, still is in a way, and fragrance was not a part of it. I got into statistics, which became machine learning, around the same time, again for totally independent reasons. There was this thing called the Netflix Prize. It was one of the first competitions to build great ML algorithms. I competed.
Speaker 1: That's basically: tell me what else I'll like on Netflix, right? That's what that contest was. Like, if I watched Succession and The Sopranos, what should I watch next?
Speaker 2: Then you're gonna like another kind of dark but funny, kind of soap opera type of thing, exactly. And so Netflix did a really bold thing, which is they released a data set and said, here's what good looks like, here's how we measure it, have at it. And they paid a million dollars to the winner, which was a combination of a few teams. But what they really did is they brought a particular kind of machine learning to the forefront, called collaborative filtering, and really showed that this stuff worked. And by the way, other companies were already racing to use this, so recommender systems were a big thing, but Netflix was putting it out into the public, and that allowed a kid like me, I think I was eighteen or nineteen years old, to actually compete, and pretty well, in that. So I just got exposed to this world through that, and it was super fun. I mean, they gamified it and I had a blast. So that was my first exposure to machine learning.
Speaker 1: Turned out to be a good time to start working on machine learning.
Speaker 2: Yeah, totally, because if I had started now, they wouldn't let me in. That was probably ten years ago. Yeah, ten years ago.
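[Editor's note: for readers curious what the collaborative filtering described above looks like in practice, here is a minimal, illustrative sketch in Python. It factorizes a toy table of (user, item, rating) triples into latent user and item factors with stochastic gradient descent, then scores an unseen user-item pair, which is the "what should I watch next" step. The data, dimensions, and learning rate are made up for illustration; this shows the general technique the Netflix Prize popularized, not the winning entry or Netflix's actual system.]

```python
# Minimal matrix-factorization collaborative filtering (illustrative toy example).
import numpy as np

# (user_id, item_id, rating) triples: a tiny stand-in for a ratings data set.
ratings = [
    (0, 0, 5.0), (0, 1, 4.0), (1, 0, 4.0),
    (1, 2, 2.0), (2, 1, 5.0), (2, 2, 1.0),
]
n_users, n_items, k = 3, 3, 2                   # k latent factors per user and item

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(n_users, k))    # user factor matrix
V = rng.normal(scale=0.1, size=(n_items, k))    # item factor matrix

lr, reg = 0.05, 0.02                            # learning rate and L2 regularization
for epoch in range(200):                        # stochastic gradient descent
    for u, i, r in ratings:
        pu, qi = U[u].copy(), V[i].copy()
        err = r - pu @ qi                       # error on this observed rating
        U[u] += lr * (err * qi - reg * pu)      # nudge user factors toward the rating
        V[i] += lr * (err * pu - reg * qi)      # nudge item factors the same way

# Score a pair the model never saw: how much would user 0 like item 2?
print(round(float(U[0] @ V[2]), 2))
```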
Speaker 2: And then, you know, I was doing my undergraduate training in neuroscience, and I was studying behavior more than olfaction, because it turned out that olfaction was a hyper-specialized subfield of neuroscience. I didn't realize how niche it was. I loved smell and I was doing neuroscience, and I knew I wanted to do smell neuroscience, the fancy term for which is olfactory neuroscience. And there's really two universities in the world that have a critical mass of these researchers: it's Columbia and it's Harvard. I applied to both, I went to Harvard, and I realized nobody cares about this problem. Nobody cares about why molecules smell the way that they do. There's a much longer conversation as to why that's the case and why it's still persistent, though that's changing now.
Speaker 1: Well, let me ask you this. At that time, I mean, I get it as a basic research question. I'll tell you, I was talking with the producer and editor of this show as we were getting ready for this interview, and we had this interesting conversation about scent and what you're working on. And then I went down and saw my daughter, and she asked what I was working on. I said, this guy who's trying to figure out scent and teach computers to smell. And she said, why? I said, I don't know, I'll ask. So why was it compelling to you? I get it as a basic research question, but at that time, were there applications that came to your mind?
Speaker 2: Look, the steps of development of this thing that's now Osmo went through different iterations. You know, I started as an academic scientist and was trained in that world, and then I left and did some entrepreneurship, but ended up in industrial research, and there, being curious, frankly, was enough. And the idea, this is Google, this is now Google Brain. Yeah, and there's a few steps in between.
Speaker 1: But basically you're an AI researcher at Google at this moment, when you're doing industrial research, right?
Speaker 2: Yeah, exactly. And Google Brain at the time, now it's Google DeepMind, very much had a let-a-thousand-flowers-bloom mentality, so people were working on crazy stuff, including me working on scent.
Speaker 1: It's basically like the Bell Labs of the twenty-first century, right?
Speaker 2: You have it, exactly: Bell Labs, Xerox PARC, that kind of vibe. And it truly was dreamy.
Speaker 1: Sounds dreamy.
Speaker 2: It was awesome, right? And it was also a moment in time, and now I think that moment's going, for better or for worse. The idea was pretty straightforward for Google, which was: the products at Google know what the world looks like and know what the world sounds like, and that's useful, right? That's information that Google's organizing. If we knew what things smelled and tasted like, that would be useful too.
Speaker 1: Uh huh. The original mission of Google is to organize the world's information, right?
Speaker 2: Exactly, and make it universally accessible and useful. And there was a whole slice of reality, the chemical slice of reality, that was invisible, not just to Google but to computers. And that felt really important, and we had agreement and buy-in all the way up to the executive level. They're like, yeah, let's go look at that.
Speaker 1: So you're doing basic AI research at Google, and you decide to see if you can basically use AI to figure out scent, to say: here's a molecule, what does it smell like?
Speaker 2: Right, that's the basic endeavor.
Speaker 1: How do you do that? What is it that you actually do?
Speaker 2: Yeah, so first it starts with the intention, which was, let's figure out smell. But it actually was a lot more natural than I think it sounds, which is: scent is just chemistry. It's molecules, and we've got to do AI for molecules if we're going to do AI for scent.
Speaker 2: And the thing that had happened in between, you know, there's a five-year period between my academic life and my industrial life, and what had happened in those five years is that some of the people I did my PhD with, and some of the people I ended up working with at Google Brain, really cracked machine learning, AI, for molecules. But they didn't do it for scent. They did it for a few other things. They did it for drug discovery, and they did it for materials discovery, so, like, new materials for LEDs.
Speaker 1: Right. So you happen to be doing essentially basic research at Google at this moment when there is this new way to use AI that is well suited to molecules, and you say, we can do this, let's do it.
Speaker 2: Yeah, let's do it. We can do it. The other piece is: great, you've got the algorithm, where's the data?
Speaker 1: Classic. That's the classic AI question, right? Where's the data?
Speaker 2: Exactly, where's the data? What I did know, just from being obsessed and in this world for a long time prior to that, was that there were these collections of data sets that were honestly really more like magazine catalogs of fragrance ingredients. There were these catalogs basically saying: this is the ingredient, this is the molecular structure of this ingredient, and here's what it smells like. And by the way, the rating of what it smells like was done by a professional, by a perfumer. And so the special sauce that we added is, we went and got that data, and we fused a few data sets together, and we cleaned it very carefully, and that hadn't been done.
Speaker 1: And it's something like five thousand-ish, right? It's five thousand or so different molecules.
Speaker 2: Yep, exactly.
Speaker 1: And is this the one with the list? I love the list. Here, I have it: sweet, fruity, vanilla, powdery, floral, berry, fermented, nutty, ozone, buttery, musk.
Speaker 2: It's that list, right. Those are they. And there's one hundred and thirty-eight of those descriptors, I think, that we used in that data set. Sometimes we use smaller subsets, but the full set originally is about one hundred and forty.
Speaker 1: So, okay, you have your five thousand or so molecules labeled with one hundred and forty different descriptors. You train your AI model on this data set, and then you want to find out: does the model work? Does the AI work? If I give the model some new molecule, a molecule that wasn't in the training data, will it know what that molecule smells like? And to test that, to answer that question, you actually do this study. You get a bunch of people to smell molecules that your model was not trained on and say what they smell like. And it's weird: you don't actually care what a molecule fundamentally smells like. You just care what everybody, on average, thinks it smells like.
Speaker 2: Because guess what, that's what smell is. Yeah, that's what we think of as smell.
Speaker 1: So you ask this panel what all these molecules smell like, and then you ask the model what they smell like, and you compare the results. How does the model do?
Speaker 2: The threshold of breakthrough in my mind was: are you worse than a person, or are you slightly better than a person? And we got slightly better than a person, which was a breakthrough in my view.
Speaker 1: Right. And so that paper you published in Science, and you started Osmo kind of around the same time, right? You started that study at Google, is that right? And then by the time it was published, you had spun Osmo out of Google.
Speaker 2: Right, that's right.
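[Editor's note: a minimal sketch of the kind of modeling step described here, for readers who want to see its shape: turn each molecule's structure into numeric features, fit a multi-label model that predicts odor descriptors, then query it on a molecule it never saw. The handful of SMILES strings, the four descriptors, and the toy labels below are illustrative placeholders, not Osmo's data set; the published work used graph neural networks rather than the fingerprint-plus-random-forest shortcut shown. Assumes RDKit and scikit-learn are installed.]

```python
# Illustrative sketch: predict odor descriptors from molecular structure.
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

DESCRIPTORS = ["sweet", "fruity", "floral", "green"]  # stand-in for ~138 labels

train = {  # SMILES string -> toy perfumer-style descriptor labels
    "COc1cc(C=O)ccc1O":    [1, 0, 0, 0],  # vanillin
    "CCOC(=O)CCC":         [1, 1, 0, 0],  # ethyl butyrate
    "CC1=CCC(CC1)C(=C)C":  [0, 1, 0, 0],  # limonene
    "CC(C)=CCCC(C)(O)C=C": [0, 0, 1, 0],  # linalool
    "CC/C=C\\CCO":         [0, 0, 0, 1],  # cis-3-hexen-1-ol
    "O=Cc1ccccc1":         [1, 1, 0, 0],  # benzaldehyde
}

def featurize(smiles: str) -> list:
    """Turn a molecular structure into a fixed-length bit-vector fingerprint."""
    mol = Chem.MolFromSmiles(smiles)
    return list(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=1024))

X = [featurize(s) for s in train]
y = list(train.values())
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A held-out molecule the model never saw: geraniol, typically judged floral/rosy.
probs = model.predict_proba([featurize("CC(C)=CCCC(C)=CCO")])
for name, per_label in zip(DESCRIPTORS, probs):
    print(f"{name:>6}: {per_label[0][1]:.2f}")  # probability the descriptor applies
```

[Editor's note: the benchmark discussed above is, roughly, whether predictions like these land closer to a smell panel's average ratings than an individual panelist does.]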
Speaker 1: So you have this map, you have this model that can, given a molecule, predict pretty well what the average person thinks that molecule smells like. But there's still a second problem, right? Which is, in the world, in the wild, you don't know what molecules are in the air. You don't know what molecules somebody's smelling. And so for that second problem, you need to try and build some kind of automated system for figuring out what molecules are in the air at a given moment.
Speaker 2: That's correct. Getting to one molecule's structure is actually not trivial. To go from a physical thing to knowing all the molecular structures is, like, not a solved problem. There's a lot of ways to do it, there's a lot of chemical sensors out there, but none of them will just tell you the formula. So that's hard, really hard.
Speaker 1: So there's a chemistry problem of isolating the molecules, basically, and deriving the chemical formulas.
Speaker 2: Exactly. Taking a real smell, which is composed of a bunch of different molecules with different structures, in different amounts, different ratios, you've got to get that recipe out of the air. So that's hard. That was unsolved at the time, to do it in an automated way. And by the way, if we're following this story chronologically, we hadn't done this yet, but we knew we had to. We knew that if we wanted to actually digitize the world of scent, and have a record of what the world smelled like, and maybe even replay it, we were going to have to do this. We needed to automate that and have it be automatic, and that's what we did.
Speaker 1: So basically, you can put any smell into the machine and it'll tell you what it's made of, at this point.
Speaker 2: Oh yeah.
Speaker 1: So you're setting out to start Osmo. What are you thinking of in terms of the set of potential commercial applications?
Speaker 2: We really had three in mind, and they're still very much present in mind; the focus has become a lot crisper, though, in terms of what we're concentrating on. We know the fragrance industry is huge and very profitable, and it's also something I personally love. That's a thing we want to automate and understand. Then we know that dogs can detect things, right? We know dogs can detect harmful substances like drugs or bombs, or things that just shouldn't be there, like produce being shipped where it shouldn't be. And then we also know that dogs, and even in some cases people, can detect health or disease states. We know that Mrs. Milne, a nurse in the UK, was able to smell Parkinson's disease, and she's since been able to teach that skill to other people, which is really amazing, and then we figured out the chemistry of what's actually being smelled. We know that there are many, many instances where there is a scent signature to a disease or a wellness or a health state that hasn't yet been fully figured out, but we know that they exist. Those are the three: the fragrance industry, really; security and supply chain; and health and wellness. And I view them in that order, because that's the order in which I think we can be useful to the world. Designing fragrances is something that's much more attainable technically, and frankly, it's just a much faster sales-cycle business to be in than, ultimately, diagnostics, which are so hard. I mean, that is my north star. It's where I want to take the company.
Speaker 2: But I also have no illusions about how hard that is. I've seen all the failures of the companies that have attempted it, and I think I've learned from what hasn't worked, and I'm incorporating those learnings into how I want to build the company, which is: build a great business in fragrance, build beautiful fragrances for the world, and then strike out from that position of strength into even more ambitious frontiers.
Speaker 1: We'll be back in just a minute.
Speaker 1: When I first heard about the work Alex was doing at Osmo, I understood why it would be useful for sensing. Basically, you might be able to build automated sniffing machines that could, say, detect cancer in a person or detect a bomb in a suitcase. But I couldn't figure out, truly, what the business case was for perfume. And in fact, Osmo has recently launched a perfume business. It's called Generation. So I asked Alex: why is using AI and fancy machines better than just designing perfume in the traditional way?
Speaker 2: We can go from the first kind of client demand, so: hey, I want to create a fragrance, and here's who my brand is, here's what I want to do, just that description, to a starting place for a fragrance in a minute or two.
Speaker 1: What happens at a traditional perfumer when somebody comes in with that request?
Speaker 2: Well, let's say you're an emerging brand. You're starting out, or you have your first product and you want to add a second one. But you're small. You're not making a billion dollars in revenue, you're making less than that. So if you want to make a new custom fragrance, good luck. You're not going to be able to get the attention of the big fragrance houses, because they want a service business that's, like, millions and millions of dollars, and you're not big enough yet.
Speaker 2: So if you want a great custom fragrance that your consumers are going to love, and you want to do it quickly so you're responding to trends, you aren't going to be able to get it done. You have to make compromises. If you want to move fast, you're gonna have to use a regurgitated fragrance, also called a library fragrance, which means somebody else in the market has your smell.
Speaker 1: I'm imagining that the people who sell it call it a library fragrance rather than a regurgitated library fragrance.
Speaker 2: They do. They don't say regurgitated, but that's what it effectively is.
Speaker 1: Fair. Regurgitated does have a particular olfactory connotation, so it's...
Speaker 2: It's visceral. It sticks in the mind.
Speaker 1: And I'm not, I just genuinely don't understand: why can't somebody just have a company with a bunch of people who know the molecules? You know, who know what the five thousand molecules in the book smell like, because they've got the book, and they can just use the book and be like, oh, you want this, let's try that. Do you know what I mean? I'm not trying to be difficult, but I genuinely don't understand why you need the technology to do that.
Speaker 2: Yeah, I genuinely didn't understand this either. There's a class of professional called a perfumer, and their job is to do what you're describing, which is: hey, I know all these ingredients and I'm going to mix them to create your fragrance. Now, typically, there's no perfumer that knows five thousand ingredients. The best perfumers know a thousand or two thousand ingredients. Most perfumers work with one hundred, two hundred ingredients. So already, there are very few people in the world who can do what you're saying. And then, what are they going to work on? It might take them weeks or months to create a fragrance.
Speaker 2: They're working on a few at a time. Why would they work on an emerging brand's fragrance when they can go work on a much larger account? So there's just a very, very limited number of people who can engage in the fragrance creation process, because it is difficult. It's not so much identifying, hey, all these molecules smell this particular way and therefore I should be able to mix them; it's, what ratios do you mix them in? What are the rules? And now you're actually getting into designing a system which understands scent well enough to create new fragrance formulas as starting places, which a perfumer then finishes. But you're right, it's like, oh, why shouldn't that exist? And then when you actually start to peel back the layers one by one, you realize, oh, you actually have to build what we built in order to answer that question.
Speaker 1: So presumably now your model can not only predict what one molecule is going to smell like, but it can predict for a combination of molecules. I mean, is it predicting? Does it know concentration? Does it know...
Speaker 2: Oh yeah, yeah.
Speaker 1: How good is it? I mean, you have a perfumer on staff. Why?
Speaker 2: Well, I think the goal of tools is to have them in the hands of creatives. And there's many steps to perfumery, but I think there's three that are relevant for what we're talking about. The first is, a perfumer, when they're starting on a project, has to have a starting place, a starting formula. Then they do their creative work, step two, to evolve that formula to exactly what the customer wants, to a creative expression that delights the consumer as well. That's the funnest part. Perfumers love that. That is actual creation, the creative part. Number three is, it has to be the right price, and it has to be compliant with regulatory requirements.
Speaker 2: There cannot be allergens, all that stuff. That's more like sound engineering than it is composition, or being a rock star. Step one, the starting place, and step three, all the regulatory requirements: that's where we spend the most energy in building these tools. A perfumer is the person taking the formulas from starting place to creative endpoint and then handing them off for, like, regulatory finishing. And they're just way more effective with these tools.
Speaker 1: At least for now, right? That's the way I feel using an LLM. I feel like I have a window, and me plus the LLM is better than the LLM alone, and that window hasn't closed yet, but I'm not optimistic about my long-term prospects.
Speaker 2: We'll see, though. But listen, my honest belief here is that the tools will get better, but the drive to create will never go away. And I think people will always want to know about the person behind the creation, in a way, though it's not uniform. I don't think people want to know the perfumer behind the hand soap in the gas station. They just don't, right? But I think there will always be room for craft and creative use of tools. The profession that uses those tools might change radically, and the industry in which those tools are used might change radically, but the tools will always be wielded by people, even if the work being done might be unrecognizable. So, you know, we'll see how the world evolves. But AI is like an engine; it's just a technology, just a tool.
Speaker 1: So what's the business model, just briefly, for Generation? What's the model?
Speaker 2: The business model, really simply, is: we'll design the fragrance for you, and then you'll buy that fragrance to put in your products, or we'll even create the full finished product, we'll put it in a bottle for you if you want.
Speaker 2: We are behind the scenes. We're an engine supporting brands. We're not a brand ourselves, and we're here to make beautiful fragrance products for brands.
Speaker 1: So what's the frontier? On the business side, Generation is kind of the central thing you're working on now, but on the research side, what are you trying to figure out now? What are you working on?
Speaker 2: So there's our starting place, which is: why does this molecule smell the way that it does? And we can never stop getting better at that. Then there's the next question: why does this mixture of molecules smell the way that it does? And we can never stop getting better at that. And then there's: do you like it? Which is maybe the most important question from a business perspective. Or, who likes it, and in what context? Exactly, which is, it's not just the formula as the input to this model; there's also who you are, what your experiences are, where you're from, what the other things in your life are.
Speaker 1: That actually goes back to your Netflix collaborative filtering. It's like, if I watched Succession and The Sopranos and I'm fifty, then what's the cologne for me?
Speaker 2: Yeah, exactly. And so I was very fortunate to be able to start this company with a guy I worked with at Twitter. His name is Rich Witcombe. He's our chief technology officer. His whole professional life has been recommender systems. He was a lead on Spotify's song recommender system, so if you like your Wrapped playlist or your recommended playlists, that's his code. And then he also worked on self-driving cars at Nvidia. But he's been in this world of: hey, you like these things, what about this thing? Or: here are the inputs the system is getting, what do I do now? So, really, really deep in that world.
Speaker 2: Then we're kind of bringing that spirit, that mindset, to scent, into fragrance.
Speaker 1: And then what about the parts of your work that are the next steps, the ones you alluded to farther in the distance, essentially sensing, right? Sensing for security, sensing for health. What work are you doing now toward that end?
Speaker 2: Yeah. So we're incubating this right now, and I'll tell you two things. One is, we have a partner, we've deployed sensors out in the field, we're detecting inauthentic or counterfeit goods, and it's working. The second thing I'll say is, we've learned something really interesting, which is that the molecules that smell really good in fruits and flowers and vegetables, the ones we have to understand to create fragrance, are the same molecules in counterfeit luxury goods and the same molecules in our scent. And by getting really good at understanding and designing fragrance in one domain, the fragrance industry, we're actually strengthening this platform we're building to get really good at the next frontiers: security detection, and then, ultimately, what we care about, which is health. So that's what really surprised us. I thought that by working in fragrance we were making a trade-off, which is: we're here to build a great business to make ourselves resilient so that we can work on the much longer-haul problems. But in reality, we're making progress on those problems by teaching our platform about what the world smells like. It's all one thing: it's just scent, it's just molecules in the air. And so the more we learn about really any piece of what the world smells like, the better we get at all of it. I'll tell you what I think the big technical frontier is: it's predicting emotion.
Speaker 1: Ah, that's interesting. Uh huh.
Speaker 2: So when you smell something, you obviously perceive something; the first thought or first perception is, whatever, fresh-cut grass or grapefruit.
Speaker 2: But then there's another thing that happens almost at the same time, which is: I remember, or I feel, a particular thing. And predicting that is something I don't think anybody's really figured out, but it is a beautiful frontier. Well, how do you get the data? You've got to ask a lot of people how they feel when they smell a lot of things, and they have to be able to articulate it, right?
Speaker 1: Part of the thing with scent is it's so primordial that you might not even be able to say how you feel, so you'd need, like, a brain-computer interface.
Speaker 2: You might, you might. But it turns out we have voices and faces that are effectively BCIs; there's a lot of information that leaks out of us all the time. And that was what my PhD was in: how do you interpret body language in a way that makes sense? And by the way, the body language I worked on most closely was body language driven by odors, things that, I studied this in animals, but things that make animals happy or sad or afraid or calm, and you can read that out. I mean, our behaviors are meant to communicate to other animals. We're very social, we're a social species. So I think there are more fundamentals that we have to figure out. But this is, I think, some really fundamental stuff that's still unknown here.
Speaker 1: I heard you say in another interview that you worry sometimes that you'll hit some barrier in nature to your work. You said it in passing, but I was very curious about that. What does that mean?
Speaker 2: I always think about that, which is: what day will it be when Mother Nature says you can't figure the next hard thing out? And I just look at this from the history of science. You know, if somebody cared about how the planets were moving in twelve hundred, well, good luck. You don't have the right telescopes, you don't have Tycho Brahe.
598 00:31:19,916 --> 00:31:22,636 Speaker 2: There's a bunch of stuff you're gonna need, right. And 599 00:31:22,676 --> 00:31:24,836 Speaker 2: so in a way, it's like Mother Nature and what 600 00:31:24,876 --> 00:31:28,436 Speaker 2: our society and species know conspiring together, basically saying 601 00:31:28,916 --> 00:31:31,636 Speaker 2: progress will have to wait. And so I think about that. 602 00:31:31,676 --> 00:31:34,236 Speaker 2: I worry about that all the time. And so my 603 00:31:34,516 --> 00:31:38,636 Speaker 2: mental framework that keeps me super humble is like, I'm 604 00:31:38,676 --> 00:31:40,836 Speaker 2: just thankful for all the progress we've been able to make, 605 00:31:40,876 --> 00:31:44,556 Speaker 2: that the tools were around, right? So I didn't invent 606 00:31:44,596 --> 00:31:47,036 Speaker 2: graph neural networks. I didn't even invent the data sets 607 00:31:47,116 --> 00:31:50,556 Speaker 2: that we are piecing together and curating, cobbling together. With all 608 00:31:50,596 --> 00:31:53,036 Speaker 2: of these, we're standing on the shoulders of so many people, 609 00:31:53,676 --> 00:31:57,596 Speaker 2: and it's just always been the case. And I don't know, 610 00:31:57,596 --> 00:32:02,956 Speaker 2: maybe this is too philosophical, but 611 00:32:03,436 --> 00:32:06,156 Speaker 2: for me, when I've been up close and personal with 612 00:32:06,276 --> 00:32:08,516 Speaker 2: scientific progress, either progress that I've had a part in or 613 00:32:08,556 --> 00:32:11,996 Speaker 2: that I've observed other people make, it all feels so tenuous. 614 00:32:12,076 --> 00:32:14,916 Speaker 2: It feels so lucky, because once you really dig into 615 00:32:14,916 --> 00:32:17,356 Speaker 2: the details, you realize, oh my gosh, they had to 616 00:32:17,396 --> 00:32:19,756 Speaker 2: be right there at that time and have known about 617 00:32:19,756 --> 00:32:20,156 Speaker 2: that thing. 618 00:32:20,836 --> 00:32:23,356 Speaker 1: It's amazing that anything happens when you think of how 619 00:32:23,356 --> 00:32:24,636 Speaker 1: contingent everything is. 620 00:32:24,436 --> 00:32:27,396 Speaker 2: It is amazing that anything happens, and you know, when you 621 00:32:27,516 --> 00:32:30,316 Speaker 2: really dig in, you're like, wow, how does anything good 622 00:32:30,436 --> 00:32:34,116 Speaker 2: happen at all? But nonetheless you persist. And also I 623 00:32:34,116 --> 00:32:36,556 Speaker 2: think you can create the conditions where it's more likely 624 00:32:36,636 --> 00:32:39,516 Speaker 2: than not to happen. And so that's what Osmo is, 625 00:32:39,596 --> 00:32:42,116 Speaker 2: and that's why Osmo birthed Generation, like, let's create 626 00:32:42,156 --> 00:32:45,076 Speaker 2: an environment where we're much more likely than not to 627 00:32:45,156 --> 00:32:47,756 Speaker 2: make both the scientific progress we need to make and 628 00:32:47,836 --> 00:32:52,476 Speaker 2: also really help change the fragrance industry, which, 629 00:32:52,516 --> 00:32:54,996 Speaker 2: by the way, will teach us the things we need 630 00:32:55,036 --> 00:32:56,836 Speaker 2: to know to get to the next thing. So I 631 00:32:56,836 --> 00:32:59,516 Speaker 2: think there's so much beauty to create in the fragrance 632 00:32:59,516 --> 00:33:01,716 Speaker 2: industry that I'm going to just enjoy the heck out 633 00:33:01,716 --> 00:33:03,956 Speaker 2: of it and do it for the rest of my life.
634 00:33:04,236 --> 00:33:05,636 Speaker 2: But I think it's going to teach us things that 635 00:33:05,676 --> 00:33:09,156 Speaker 2: will allow us to do even more audacious work in 636 00:33:09,196 --> 00:33:09,676 Speaker 2: the future. 637 00:33:13,036 --> 00:33:15,356 Speaker 1: We'll be back in a minute with the lightning round. 638 00:33:18,196 --> 00:33:27,956 Speaker 1: Let's finish with the lightning round. I'm gonna ask you 639 00:33:27,916 --> 00:33:28,716 Speaker 2: a bunch of questions. 640 00:33:28,716 --> 00:33:32,516 Speaker 1: Now, what seemingly pleasant scent do you never want to 641 00:33:32,516 --> 00:33:33,076 Speaker 1: smell again? 642 00:33:34,516 --> 00:33:36,596 Speaker 2: Seemingly pleasant scent that I never want to smell again. 643 00:33:37,996 --> 00:33:40,756 Speaker 2: Artificial cherry. It was the cough syrup that I was 644 00:33:40,756 --> 00:33:42,956 Speaker 2: forced to drink as a kid, and I'm super sensitive 645 00:33:42,996 --> 00:33:46,196 Speaker 2: to it. The molecule, ethyl maltol, I do not like. Are 646 00:33:46,236 --> 00:33:48,116 Speaker 2: you wearing fragrance right now? And if so, what is it? 647 00:33:49,116 --> 00:33:50,836 Speaker 2: I am not. I stopped wearing it as soon as I 648 00:33:50,876 --> 00:33:53,996 Speaker 2: started the company because I needed to smell. Of course, 649 00:33:55,876 --> 00:33:58,876 Speaker 2: but like, what's your, well, give 650 00:33:58,876 --> 00:34:01,476 Speaker 2: me a pick. Name some fragrance that you love 651 00:34:01,516 --> 00:34:05,716 Speaker 2: for some reason. So I really like, and this is kind 652 00:34:05,716 --> 00:34:09,356 Speaker 2: of a basic choice for folks inside the industry, I 653 00:34:09,436 --> 00:34:12,756 Speaker 2: love Terre d'Hermès. It's like the Hermès flagship men's fragrance. 654 00:34:13,156 --> 00:34:15,476 Speaker 2: It's by a perfumer, Jean-Claude Ellena. I really love 655 00:34:15,516 --> 00:34:15,876 Speaker 2: his work. 656 00:34:15,956 --> 00:34:18,276 Speaker 1: Basic? Is that like basic in the way of saying, 657 00:34:18,316 --> 00:34:20,076 Speaker 1: it's like if I asked you for a watch and 658 00:34:20,076 --> 00:34:21,716 Speaker 1: you said a Rolex Submariner or something? 659 00:34:21,716 --> 00:34:24,316 Speaker 2: Exactly, or like saying, what pop music 660 00:34:24,316 --> 00:34:26,676 Speaker 2: do you like? You say Taylor Swift. People like it 661 00:34:26,676 --> 00:34:30,236 Speaker 2: because it's great. Huh. Taylor Swift is great, a Rolex 662 00:34:30,276 --> 00:34:32,956 Speaker 2: watch is a great watch, Terre d'Hermès is a great fragrance, 663 00:34:33,196 --> 00:34:35,116 Speaker 2: but it's very popular. What is it about it that 664 00:34:35,156 --> 00:34:38,036 Speaker 2: you love? I love its minimalism and I just happen 665 00:34:38,116 --> 00:34:40,396 Speaker 2: to like the notes, right? So it's really heavy on 666 00:34:40,476 --> 00:34:42,676 Speaker 2: a molecule I like, Iso E Super. I think it's 667 00:34:42,676 --> 00:34:45,996 Speaker 2: a great highlight of that ingredient and it just wears 668 00:34:45,996 --> 00:34:47,716 Speaker 2: really well on my skin. So that was what I 669 00:34:47,796 --> 00:34:51,236 Speaker 2: used to wear almost every day before I stopped. What's 670 00:34:51,276 --> 00:34:57,276 Speaker 2: your second favorite sense?
My second favorite sense is probably 671 00:34:57,636 --> 00:35:00,836 Speaker 2: gonna be, it's a hard call between vision and hearing, because 672 00:35:00,876 --> 00:35:03,116 Speaker 2: I love music, but I like looking at stuff too, 673 00:35:03,396 --> 00:35:09,676 Speaker 2: like the world, the world is beautiful. Are more expensive perfumes 674 00:35:09,716 --> 00:35:13,316 Speaker 2: actually better? Sometimes, right. So I think, just like 675 00:35:13,516 --> 00:35:17,236 Speaker 2: anything, like bicycles or art, as you start to pay more, 676 00:35:17,276 --> 00:35:18,236 Speaker 2: everything gets better. 677 00:35:18,276 --> 00:35:20,476 Speaker 1: And then it plateaus, right. What's the worst thing you 678 00:35:20,516 --> 00:35:21,036 Speaker 1: ever smelled? 679 00:35:22,356 --> 00:35:25,916 Speaker 2: I have a memory. I picked a mushroom that I 680 00:35:26,036 --> 00:35:28,756 Speaker 2: thought looked cool and wanted to show it to my 681 00:35:28,956 --> 00:35:31,636 Speaker 2: dad when I was young, and I forgot about it 682 00:35:31,716 --> 00:35:34,756 Speaker 2: and it had just turned completely gross. 683 00:35:35,276 --> 00:35:37,276 Speaker 1: I had a version of that, of bringing shells home 684 00:35:37,316 --> 00:35:39,116 Speaker 1: from the beach that were alive. 685 00:35:39,196 --> 00:35:41,196 Speaker 2: It turned out, I found out, when they were dead. 686 00:35:41,516 --> 00:35:45,036 Speaker 2: It's like great intentions, but I didn't really have the wherewithal to 687 00:35:45,076 --> 00:35:48,196 Speaker 2: think that through or understand the consequences. 688 00:35:54,156 --> 00:35:57,516 Speaker 1: Alex Wiltschko is the co-founder and CEO of Osmo. 689 00:35:58,316 --> 00:36:01,596 Speaker 1: Today's show was produced by Gabriel Hunter Chang. It was 690 00:36:01,876 --> 00:36:05,316 Speaker 1: edited by Lidia Jean Kott and engineered by Sarah Bruguer. 691 00:36:05,796 --> 00:36:09,116 Speaker 1: You can email us at problem at Pushkin dot FM. 692 00:36:09,556 --> 00:36:11,916 Speaker 1: I'm Jacob Goldstein, and we'll be back next week with 693 00:36:11,956 --> 00:36:25,356 Speaker 1: another episode of What's Your Problem.