Speaker 1: Pushkin.

Speaker 2: So in September of twenty twenty, there were lightning storms that started over six hundred fires over the span of a single day, and it caused a number of enormous megafires and really overwhelmed the fire department. And smoke flooded over San Francisco and blanketed the sky, and all of us living in San Francisco woke up to a blood-red, Blade Runner sky, and the sun never rose that day.

Speaker 1: This is Sonya Kessner. She spent a lot of her career managing supply chains for companies like Nest, which makes doorbells and thermostats, and Pax, which makes fancy vape pens. But the year before that day the sky turned red, she had founded a company called Pano AI. The company's initial goal was to reduce the damage caused by wildfires.

Speaker 2: After that day that the sky turned blood red, we talked to each other and decided we wanted to speed up, and we just got an outpouring of support from all of our friends and family, and everyone encouraged us: go out and raise venture capital funding, go as quickly as you can. We need folks working on this problem.
Speaker 2: And we felt optimistic, actually, that this crisis would lead to urgency in the market, and we actually have seen that. So that's the silver lining of what was a very, very scary and bone-chilling wake-up call.

Speaker 1: I'm Jacob Goldstein, and this is What's Your Problem, the show where I talk to people who are trying to make technological progress. Pano, the company Sonya founded, mounts cameras on remote mountaintop towers, then sends panoramic images from those cameras to an AI model that's trained to spot smoke from wildfires. The goal is to alert fire crews early, before the fire spreads. So far, Pano's customers include utility companies and firefighting agencies in several states across the western US, and also in Australia. I should mention that I talked to Sonya last month, before the fire in Maui. Sonya started Pano because she wanted to solve a problem that goes beyond wildfires. The problem is this: how do you use data to mitigate the damage caused by climate change?

Speaker 2: I will say, even after the sky turned red, raising venture capital was not that easy.
For the first year or two, the idea of using technology to adapt to climate change was still a very immature, nascent idea. There was starting to be more focus again on technologies to mitigate climate change, but when we would meet with VCs, we had to start with: wait, tell me again. You're saying climate change is already here today? You're saying we should do something about it? You're saying we can do something about it? That's where we started the conversation.

Speaker 1: Huh, that's interesting. So you're saying, when you were trying to raise money, when you started the company not that long ago, three, four years ago, the idea of, like, oh no, we live in a world where the climate already has changed, and we need to deal with that, and we need to build companies to deal with that, that was a novel idea.

Speaker 2: It was often the first time they had heard that pitch. It was the meeting with me. So I had to start there: climate change is not something in the far distant future; climate change is here today.
I think of myself as a realist, or more of a pessimist or a realist than others. Where, you know, I work in supply chain. We're very good at thinking of what can go wrong. If you're willing to face the harsh reality of what might be going wrong, or what is going wrong, then you can do something about it. So it's an optimistic take on pessimism, I guess. And so I thought there would be a slew of adaptation companies, and I was actually shocked that there were not. And that to me felt like, okay, there's a void here that needs to be filled.

Speaker 1: So climate change causes lots of problems. How do you land on fires? How do you get to starting the company that you start?

Speaker 2: You know, at Pano our mission is to bring more data to bear to mitigate all types of climate-driven natural disasters: floods, hurricanes, mudslides. But starting with wildfires is a natural place to start when you live in California and you and your friends have experienced the devastation firsthand.
Your friends have lost their homes, and their children have been evacuated from schools. And there was also tremendous hunger from the firefighting community for new technology. When we went to start researching the idea of bringing technology to wildfires, there was already a huge community raising a rallying cry: we need more technology as a force multiplier to tackle wildfires. And they were listing out what they wanted: we want cameras, we want drones, we want satellites, we want AI, we want mobile software. Please send us some tools. And this is exactly what I knew how to build from my career, and my colleagues, who come from Cisco, from Apple, from Nest, from Google. This is exactly what we know how to build. But we looked around the space and there were almost no vendors.

Speaker 1: It's like a market calling out for suppliers.

Speaker 2: It really was. Yeah, I don't think it happens that often. I mean, for our first product, the actual detailed features for the product were written in a report from the California Public Utilities Commission.

Speaker 1: Basically saying, would someone build this and sell it to us?

Speaker 2: Yes.
Exactly, that is exactly what happened.

Speaker 1: So do you start the company and build that product that the California whatever was asking for?

Speaker 2: That is exactly what we did. So the thing that's challenging about this business at Pano is we need to have all the capabilities of an Internet of Things company, like a Nest or a Fitbit, because we design our own hardware, we manufacture in this factory back here, we design our own software, we design our own artificial intelligence algorithms. That's the same as any Internet of Things, say, consumer IoT gadget in your home. But on top of that, we also are a company that manufactures and deploys ruggedized equipment in remote locations, and so that makes us look more like a telecom company. We need those capabilities in house as well, because one of the things that the customers really want in our industry is a one-stop shop. They want a company that just handles the whole thing.

Speaker 1: So just tell me how the system works, what happens.
Speaker 2: So at each tower location on a mountaintop, we deploy a Pano station, which includes two ultra-high-definition security cameras. And these cameras are rotating three hundred and sixty degrees every minute, capturing ten frames of high-resolution images, and we're uploading them to the cloud over cellular or a wired connection or Starlink. And then the data goes to the cloud, and it goes to two places. First, the images go through our AI algorithm, which is looking for signs of smoke, and whenever the AI thinks it sees smoke, it adds a bounding box, and those bounding boxes are then reviewed by analysts in our Pano Intelligence Center.

Speaker 1: A bounding box just means the AI basically draws a box around what it thinks is the smoke?

Speaker 2: Exactly. That's exactly right. And so our Pano Intelligence Center will dismiss any false alerts, but when they see that there actually is smoke, that means it's time to trigger an alert. We arrange the Pano stations so that incidents can be seen from two stations, and we mark both. We have an algorithm that calculates bearing.
We actually have a patent on this that allows it to be very accurate, and that creates a latitude and longitude.

Speaker 1: So just to be clear, when you say bearing, it's like there's a line from each camera, and you can figure out where the lines cross, so you can know exactly the latitude and longitude of the site where there is smoke. Yes? Because every spot can be seen by two different stations.

Speaker 2: That's right. And this triangulation strategy was deployed in fire lookout towers for hundreds of years. They would use a dial and a string to draw the bearing manually in a lookout tower. So we just created the digital version of this. So we've done a human review, we've marked the fire, we've created a latitude and longitude. We then push out automated notifications through text and email to all of the emergency managers who have been onboarded to the platform in that area, and they all get alerted to this incident within minutes of the fire starting. And it gives them location information, and it gives them a video of the growing smoke.
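The triangulation idea described here, two stations each reporting a compass bearing to the smoke, with the fix at the point where the two sight lines cross, can be sketched in a few lines. To be clear, this is a hypothetical illustration, not Pano's patented algorithm: it uses a simple flat-earth approximation around the first station, and all function and variable names are made up for the example.

```python
import math

def triangulate(station_a, bearing_a, station_b, bearing_b):
    """Estimate the (lat, lon) where two camera sight lines cross.

    station_a, station_b: (lat, lon) of the towers in degrees.
    bearing_a, bearing_b: compass bearings to the smoke in degrees
    (0 = north, 90 = east). Uses a local flat-earth approximation,
    which is reasonable over the tens of kilometers a camera can see.
    """
    lat_a, lon_a = station_a
    lat_b, lon_b = station_b
    # Work in kilometers on a local east/north grid centered on station A.
    km_per_deg_lat = 111.32
    km_per_deg_lon = 111.32 * math.cos(math.radians(lat_a))
    bx = (lon_b - lon_a) * km_per_deg_lon  # station B, km east of A
    by = (lat_b - lat_a) * km_per_deg_lat  # station B, km north of A
    # Unit direction of each sight line (east, north components).
    dax, day = math.sin(math.radians(bearing_a)), math.cos(math.radians(bearing_a))
    dbx, dby = math.sin(math.radians(bearing_b)), math.cos(math.radians(bearing_b))
    # Solve t * dA = B + s * dB for t (Cramer's rule on a 2x2 system).
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        return None  # sight lines are parallel: no usable fix
    t = (bx * dby - by * dbx) / denom
    x, y = t * dax, t * day  # crossing point, km east/north of station A
    return (lat_a + y / km_per_deg_lat, lon_a + x / km_per_deg_lon)
```

For example, a station at (0, 0) sighting smoke at a 45-degree bearing, and a second station one degree of longitude to its east sighting the same smoke at 315 degrees, yields a fix near latitude 0.5, longitude 0.5, halfway between them and to the north, as the symmetry suggests.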
Speaker 1: And I presume it's a subscription. They pay you by the season or something?

Speaker 2: Yeah, we do an annual subscription, and at Pano everything's included in that subscription. Pano maintains the equipment; we maintain the AI, the Pano Intelligence Center, et cetera. The customers are just getting fire intelligence as a service.

Speaker 1: So tell me about developing the AI. Like, was there some off-the-shelf model that you could start with, or what? Where did you start with the AI? What did you have to do?

Speaker 2: So we do use open-source object detection models, like any company in modern computer vision would, but these are not specific to detecting smoke or fire. These are just models that are used to detect objects out of camera data. And then we need to train the model with images of smoke and not smoke. And it turns out that detecting smoke is a hard computer vision problem. You know, when I first started the company, I mean, I'm a hardware manufacturing person.
I thought the AI would be easy, you know. Like, I saw the Silicon Valley episode, Hot Dog, Not a Hot Dog. I figured, you know, a couple of months, a couple of months of loading some data into, you know, TensorFlow, boom boom, be done. You know, three years in, we have a great model and it's still getting better every day. Smoke is a difficult thing to detect because wildfire smoke is very rare. Wildfires don't occur that often, but there are lots of things that look like smoke. There's cloud, there's fog, there's dust, there's barbecues.

Speaker 1: What did you have to do to make it work? It was harder than you thought. Like, how did you go from it not working to it working?

Speaker 2: Well, I could tell you, but I'd have to kill you on that one. But I can't get into too much of the things we tried and what worked and what didn't work and how we got to our, you know, end result.
But, you know, what I'll say is that, you know, the main levers have to do with the data you gather, how you labeled the data, the type of model you use, so certain model techniques, the type of not-fire data that you gather as well.

Speaker 1: You want to show the model lots of instances where there's dust blowing up from the ground, and where there's fog in a way that looks kind of like smoke, and all of the things we can think of that look like smoke but aren't.

Speaker 2: Right, right. And, you know, one of the keys to developing a great AI program, which we're still continuing to build out now. And actually, we just hired a new VP of Engineering who had a five-hundred-person team at Meta that included both machine learning and software, and he has a PhD in computer vision. He spent his entire career in this field.
One of the first things he did when he joined us a few months ago was to say, we really need to invest in infrastructure. Not just running the experiments and building new versions of the model, but an entire end-to-end pipeline that lets you run these experiments more efficiently, gather the learnings, compare the different results, and then iterate, and iterate faster and faster. And so actually, a lot of companies in the AI space right now are shifting to putting just as much focus on the infrastructure around AI development as into the experiments themselves.

Speaker 1: Is that why Nvidia stock is worth a trillion dollars all of a sudden?

Speaker 2: One of the reasons, yeah.

Speaker 1: Are you constrained at all by hardware?

Speaker 2: We're not constrained by it, because if you're willing to pay, you can get as much as you need from the cloud providers. But it is, but it is extremely expensive. It is definitely one of our highest R&D expenses: the cloud compute for running experiments, and then actually on an ongoing basis.
I mean, we're uploading trillions of pixels, and they have to all be processed every minute, and so one of our highest expenses is running the AI on all this data all the time.

Speaker 1: In a minute: lots of other ways that Pano might use data to mitigate the risk of disasters. Now, back to the show. Earlier, you said something to the effect that, like, the big idea for the company was not about wildfires per se, but it was an idea that better data could help mitigate natural disasters. Is that right?

Speaker 2: You just gave my elevator pitch for me, thank you very much. I'd say, what we heard from customers who work in disaster management is that there is a paucity of data at all phases of disaster response. And those phases are: the real-time response phase, which is probably what most of us think of when we think of disaster management, evacuation, search and rescue, fire containment, restoring power and internet. But there's three other phases. There's recovery. There's mitigation, which is determining how you're going to harden your system so that the disasters aren't as damaging the next time.
And then preparedness, which is planning: making sure you have enough emergency blankets, making sure you know your evacuation routes, and you have shelters prepared. And all of those phases need more data to face this growing threat.

Speaker 1: To be clear, you're not just talking about wildfires here, right? Is there a next kind of disaster you're thinking of, or a next phase of disaster? You're thinking of both? Like, what do you want to do next?

Speaker 2: So our customers often started out as wildfire mitigation teams and then were asked to add mudslides and flooding and other disasters to their remit. Because, for our power utility customers, for example, extreme heat, extreme cold, all of these disasters are incredibly disruptive to their power grid. And so we've been asked to explore building situational awareness tools that can help them make better decisions, both in the real-time heat of the moment, when they're trying to restore power, restore internet after a hurricane, for example, or after the fact, when they're trying to analyze their entire asset map and decide on how to deploy billions of dollars into hardening
their system. The flood maps are one hundred years out of date. How are they going to decide which region of their territory they should bury power lines in first or second? There's going to be trillions of dollars deployed over the next couple of decades to help humanity prepare for climate change, to help us harden our cities, harden our transportation infrastructure, harden our power grids. Where's the data to inform those trillions of dollars of spend? These customers are realizing that this data gap is a problem.

Speaker 1: Somehow, as you were talking about disaster management as a data problem, I was thinking of what you were saying earlier about supply chains, and supply chains as sort of a worldview, and I couldn't quite nail the link. I couldn't quite articulate the connection. But can you? Do you feel a connection between those?

Speaker 2: Actually, you're getting at something which I didn't share.
To your question of why I wanted to found this company: getting into the field of disaster management was something I had been thinking about for many years, because every time I would hear on the news about a disaster, I would envision the scale that it would take to go cope with this disaster. It actually reminded me a lot of supply chain manufacturing.

Speaker 1: In what way?

Speaker 2: So when you're running a supply chain, well, nobody notices you. Everything just shows up on time, beautifully, and it's invisible. So you need to make very, very meticulous plans of exactly what you're going to build and all the necessary pieces that need to go into place to make sure that you can manufacture that product at the quantities you need, on time, and get it to where it needs to go. So you're planning meticulously, and then nothing ever goes as planned. All your well-laid plans still result in fire drill, fire drill, fire drill, fire drill, just disaster after disaster, every single day. I mean, it's a super-high-adrenaline job.
Where there's a labor strike, there's a pandemic, there's a, you know, something's held up in customs, the supplier got confused and made the wrong color. Just disaster after disaster after disaster is what happens in supply chain, and you need to recover and think on your feet and figure out how to resolve that issue. And at the end of the day, things result in calm and stability, and you save the day. And, you know, disaster management, I think, has a lot of similarities, where emergency managers spend the off-season meticulously planning for how they're going to harden their system, like building firebreaks and safety zones; how they're going to be prepared, like communicating evacuation plans and rehearsing and having drills and ordering emergency blankets. And still, when the fire comes, they need to react in real time and make snap, spur-of-the-moment decisions. And so I think, coming from supply chain, I have a lot of empathy for our customers in emergency management, and I can imagine how the more data that they can have, the better they can make their decisions.
Because in supply chain, data is key. You know, when I was leading supply chain organizations, my goal was to surface data from as far upstream in the supply chain as I could go, as early as possible. And if I could look, you know, twelve weeks upstream into the supply chain and I saw, oh, they had a labor strike, I know there's going to be a problem, you know, coming at me twelve weeks from now. But when I have twelve weeks to react to it, it's much easier for me to solve that problem than if I only find out about that labor strike when that part just doesn't show up. Data is critical to running supply chain, and emergency managers share the same thing: the more data they have, the more they can respond efficiently and safely. There are some similarities. You're right, it's good. There is something that ties it all together.

Speaker 1: We'll be back in a minute with the lightning round, then back to the show. So we do a bunch of questions at the end of the show.
I didn't 345 00:20:19,476 --> 00:20:23,156 Speaker 1: realize when I wrote those questions for this interview how 346 00:20:23,236 --> 00:20:25,276 Speaker 1: much we'd talk about supply chains in the main part 347 00:20:25,276 --> 00:20:27,076 Speaker 1: of the interview. I wrote a bunch of supply chain 348 00:20:27,116 --> 00:20:29,396 Speaker 1: lightning round questions for you. Oh, that's okay. Okay, let's 349 00:20:29,396 --> 00:20:30,196 Speaker 1: do them anyways. 350 00:20:30,396 --> 00:20:32,276 Speaker 2: Yeah, my favorite topic. That's great. 351 00:20:33,516 --> 00:20:35,596 Speaker 1: What do you love about supply chains? 352 00:20:37,436 --> 00:20:39,156 Speaker 2: I think what I love about the supply chain is 353 00:20:39,196 --> 00:20:44,356 Speaker 2: both the planning part and the adrenaline part 354 00:20:44,596 --> 00:20:47,956 Speaker 2: of having to make the diving catches in the moment. 355 00:20:48,116 --> 00:20:50,836 Speaker 1: Diving catch is a great metaphor. Who doesn't love making 356 00:20:50,836 --> 00:20:56,356 Speaker 1: the diving catch? Yeah. What was the difference between managing 357 00:20:56,636 --> 00:21:00,476 Speaker 1: the supply chain for high-tech vape pens at Pax 358 00:21:00,676 --> 00:21:05,596 Speaker 1: and managing the supply chain for fancy thermostats or cameras 359 00:21:05,716 --> 00:21:06,316 Speaker 1: at Nest? 360 00:21:10,476 --> 00:21:13,676 Speaker 2: Those supply chains are pretty similar, to be honest. The 361 00:21:13,756 --> 00:21:18,316 Speaker 2: reason we get so many awesome, cool gadgets so quickly 362 00:21:18,356 --> 00:21:22,276 Speaker 2: into the market is that the consumer electronics industry has 363 00:21:22,276 --> 00:21:25,956 Speaker 2: built a really mature supply chain that's made up of 364 00:21:25,996 --> 00:21:29,956 Speaker 2: building blocks that can be rearranged into very different products.
365 00:21:29,956 --> 00:21:32,996 Speaker 2: So the building blocks of both of those products are 366 00:21:33,836 --> 00:21:37,316 Speaker 2: surface mount technology, which is how you make the printed 367 00:21:37,356 --> 00:21:40,396 Speaker 2: circuit boards, and then the other building blocks are mechanical 368 00:21:40,796 --> 00:21:45,956 Speaker 2: components that go around those electronics, so plastic injection molded parts, 369 00:21:46,396 --> 00:21:51,396 Speaker 2: metal formed parts. Our supply chain here at Pano is radically different. 370 00:21:51,436 --> 00:21:54,276 Speaker 1: It's interesting, though, when you describe it, to think 371 00:21:54,316 --> 00:21:58,156 Speaker 1: how many things we get are just circuit boards with 372 00:21:58,236 --> 00:22:03,556 Speaker 1: plastic wrapped around them, right? So many things. Yep, yep. 373 00:22:05,436 --> 00:22:07,476 Speaker 2: By the way, I don't want it to seem that simple, 374 00:22:07,476 --> 00:22:11,676 Speaker 2: because my husband leads hardware engineering at 375 00:22:11,716 --> 00:22:15,316 Speaker 2: a startup, and he would say, it's not that simple. 376 00:22:15,556 --> 00:22:20,636 Speaker 1: Fair. So now you work in the world of natural disasters, 377 00:22:20,676 --> 00:22:23,676 Speaker 1: and I'm curious, are you a prepper? 378 00:22:24,676 --> 00:22:27,396 Speaker 2: Oh, that is a great question. You know, we do 379 00:22:27,516 --> 00:22:32,196 Speaker 2: have our earthquake kit. We did finally fill up 380 00:22:32,236 --> 00:22:35,996 Speaker 2: our water jugs. I said to my husband, like, it 381 00:22:36,036 --> 00:22:38,916 Speaker 2: will just be too embarrassing if I founded an emergency 382 00:22:38,956 --> 00:22:42,556 Speaker 2: management company and then we die because we didn't have 383 00:22:42,756 --> 00:22:44,596 Speaker 2: our jugs of water in the earthquake.
384 00:22:45,556 --> 00:22:49,436 Speaker 1: Vanity and shame as a motivator to not die works, right? 385 00:22:49,276 --> 00:22:53,276 Speaker 2: Why not? Honestly, emergency preparedness is really important. 386 00:22:53,356 --> 00:22:56,036 Speaker 1: If everything goes well, what problem will you be trying 387 00:22:56,036 --> 00:22:57,556 Speaker 1: to solve in five years? 388 00:22:57,916 --> 00:23:02,036 Speaker 2: I think the response phase is low-hanging fruit, and 389 00:23:02,116 --> 00:23:04,516 Speaker 2: we have a tool that really helps the response phase. 390 00:23:04,796 --> 00:23:07,476 Speaker 2: But mitigation is just as important, and we need to 391 00:23:07,916 --> 00:23:10,396 Speaker 2: think through how many more helicopters 392 00:23:10,476 --> 00:23:12,756 Speaker 2: do we need, where do we need to bury power lines, 393 00:23:12,956 --> 00:23:15,996 Speaker 2: what fuel breaks do we need to cut? What are 394 00:23:16,036 --> 00:23:19,156 Speaker 2: ways that we can harden our cities and infrastructure 395 00:23:19,196 --> 00:23:22,276 Speaker 2: against wildfires? But I also would like to help solve 396 00:23:22,276 --> 00:23:25,556 Speaker 2: problems related to other disasters, like how do we restore 397 00:23:25,716 --> 00:23:30,796 Speaker 2: power and internet faster after a disaster? Right? And can 398 00:23:30,876 --> 00:23:34,516 Speaker 2: data help in that solution? So I'd really love to 399 00:23:34,516 --> 00:23:38,036 Speaker 2: be working on that problem, and I'd also really love 400 00:23:38,076 --> 00:23:43,876 Speaker 2: to be helping inform policymakers around rebuilding efforts. "Build 401 00:23:43,916 --> 00:23:47,516 Speaker 2: back better" is an expression used in Washington. You can't 402 00:23:47,516 --> 00:23:53,436 Speaker 2: build back better if you don't have data. 403 00:23:55,356 --> 00:23:59,716 Speaker 1: Sonya Kassner is the founder and CEO of Pano.
Today's 404 00:23:59,716 --> 00:24:03,196 Speaker 1: show was produced by Gabriel Hunter Chang and Edith Russolo. 405 00:24:03,396 --> 00:24:06,756 Speaker 1: It was edited by Sarah Nix and engineered by Amanda 406 00:24:06,836 --> 00:24:10,676 Speaker 1: K. Wong. You can email us at problem at Pushkin 407 00:24:10,836 --> 00:24:14,116 Speaker 1: dot fm. You can find me on Twitter at Jacob Goldstein. 408 00:24:14,476 --> 00:24:16,836 Speaker 1: I'm Jacob Goldstein and we'll be back next week with 409 00:24:16,916 --> 00:24:22,916 Speaker 1: another episode of What's Your Problem.