Pushkin. I'm Jacob Goldstein, and this is What's Your Problem, the show where entrepreneurs and engineers talk about how they're going to change the world once they solve a few problems.

My guest today is Anna-Katrina Shedletsky. She's a mechanical engineer who started her career at Apple, where she worked on the Apple Watch.

I was responsible for, we call it, packing the suitcase. So the industrial design team makes beautiful surfaces of, like, what it's going to look like on the outside, and the product design engineers figure out how to get all the parts inside that suitcase, if you will. You've got to get all the parts in, and then, you know, that product needs to be robust.

When Anna was working on the watch, she saw a massive problem, not just with the watch or even just with Apple, but with all of electronics manufacturing. The problem is this: electronics manufacturing is incredibly inefficient, incredibly wasteful. And I'm not talking about consumers throwing devices away. I'm talking about the manufacturing process itself, what happens before the product even leaves the factory.
So Anna left Apple and started a company called Instrumental to try to solve that problem. In our conversation, Anna laid out the problem really clearly when she explained what happens when a new device is about to go into production. It starts with a sort of practice build, when a few hundred or a few thousand devices are assembled as a test. If you're part of a team of engineers working in your cushy office in Cupertino, this is the moment when you and your team have to go and actually see this thing you've designed being built.

You'd all get on a flight and you'd fly, at least back then, to China. It's where a lot of stuff is built, although today maybe it's somewhere else in greater Asia. And then you'd land in a city, and then you'd get into a factory van, and you might drive for an hour and a half or two hours out into a very rural part of China, and that's where you'd have a massive manufacturing center. And then we would be there for about ten or fourteen days.

What is the point of you being there?
I'm there trying to find issues, find problems, find parts that aren't fitting, find quality issues with parts.

So we expect there to be problems and quality issues with the parts. Anna said she's not allowed to talk about specific preproduction problems they had with the Apple Watch, but she did give me a hypothetical source of problems: the antenna. So, okay, picture a factory with hundreds of workers packing all these tiny little parts into the watch. One of those parts is the antenna, this tiny little thing the size of a one-inch piece of dried spaghetti, and it has to go in there just right so that the watch can talk to the world. If the antenna isn't just right, the watch doesn't work.

Yeah. So let's say, hypothetically, at the end of the line you have a pile of units where the antennas don't work. The antenna is really challenging. It's near a lot of metal, and antennas and metal don't mix, and so the antennas don't work. What you have as an engineer is test station results that say this signal is low.
It could be an eighty percent issue, like eight out of ten don't work. It could be a, you know, point one percent issue: you only have one that doesn't work, and the other nine hundred and ninety-nine out of a thousand work. But what's wrong with this one?

Because if you make a million, one in a thousand is a big problem, right?

Yes, exactly, exactly. And so in development you care about every failure.

So, okay, you have this problem: the antenna doesn't work, whether often or once in a while. What do you do?

We would gather up those physical units, which is nontrivial, getting all those units in one place in this big factory, and then we would very carefully start to rip them apart to try to get more data about what's wrong with these units. Why don't they work? Now, one of the things that's interesting about antennas specifically, and the watch, if you've ever seen a teardown of the watch, is that when you're taking it apart, you could very easily be destroying the evidence of the thing you're trying to find.
Oh, like you just budge the antenna a tiny little millimeter or whatever, and that matters.

Yes, it matters. And so you might try to take some X-ray images before you disassemble, so you have some idea.

Literally, there you have an X-ray camera?

Yes, which is like a three-dimensional X-ray. And then you take them apart, and you'd very carefully try to figure out what was going wrong. And sometimes those things are obvious: you open it up, it has no antenna in it, so that's why the antenna doesn't work. Sometimes it's very obvious. Sometimes it's not obvious, and, as a skilled engineer, you're looking at it and you're like, I have no idea. And then you start looking at the process. You go, you walk, you know, fifty meters up the line to look at where they're putting the antennas in, and you're watching that process and trying to see: is there something happening here? Are there, like, screwdrivers swinging across the line that are going to hit the parts and knock the antenna?

I mean, I feel like there are sort of two layers of problems here, right?
There's the fundamental problem of, oh, we manufactured a million of these things and we know ten thousand of them are going to break. That is just a straight-up problem. There is also the, like, engineers like you have to go fly to China, you know, stand on an assembly line and just hope they catch things, hope they figure out where the problems are. And they're not just looking for known problems; they don't even know what their problems are yet.

Like, I would go to a build, and it would be a successful build if we left the build with one hundred issues in a spreadsheet of different things that we needed to fix. So that was the success.

Like, the more problems you find, the better. I mean, it's a success because you found them, because you know the problems are there, and you're just worried that you're not going to find them.

Yes.

That's like your real nightmare problem if you're a design engineer, right? The one that you don't know is a problem until you've shipped a million pairs of earbuds or something.

The technical term for that is an escape.
So that's a failure that has escaped. The problem escaped the factory.

Yes, yes. And so escapes are what essentially then cause returns, and if we go back to that waste...

Yeah, which is hugely costly, right? It's bad for the reputation of the manufacturer, because it's like, oh, it's a crappy product, not a reliable product. Bad reviews on Amazon. And you've got to refund the money, send the person a new one. So it's costly on many levels.

Yes, yes. And so returns should be avoided, and you avoid returns by reducing escapes. And you reduce escapes by figuring out all the things that you can possibly test for, and so that's why there are test stations on the line, but the test stations can't test for everything.

Did you just live in fear all the time of missing some terrible problem?

I don't even know how to answer that. I think that a good engineer is a paranoid engineer. So for me, as an engineer, I thought it was kind of silly that there is so much luck involved. You just have to get lucky and catch the problem before it goes into production.
Luck, like that?

Yes. And so you're relying on luck to find things that you can improve in development. And then there's also the scramble and the heroism that happens when you do have a problem that everybody learns about: oh, we have this problem. So then you have this fire drill where everybody flies in and tries to figure out what's going on. There's tons of freneticism and activity, and this just seemed like a waste. Like, why is it that we're so reactive versus proactive? Why is it that I don't have access to the data that I need as an engineer to actually proactively solve these problems? I have to go and, like, hope, versus know.

After the break, Anna launches Instrumental, a company that's trying to help engineers solve the problem of finding problems. Help them go from hope to know.

That's the end of the ads. Now we're going back to the show. So, just to reset here: Anna has landed a job at Apple, the company where engineers dream of working, this amazing technology company. And yet she decides to leave.
So I asked her: in the end, what made you quit your job and go start this new company, Instrumental?

I can share something that you can use, but it might be a little sideways to what you're talking about. It's the true reason I started the company. So, about a year before Apple Watch shipped and the world knew that it even existed, I actually had a personal tragedy happen in my life. My husband was killed by a drunk driver.

Oh my god.

I was twenty-seven; he was twenty-seven. And I realized that life is precious and limited, and you don't know when your last day is going to be. And so you want to be proud of the days that you have and how you spent them and the impact you could have on others, and also reevaluate the justification for my existence on this planet. And I decided that Apple is great, I'm glad I was there, but it can't be my life's work. For me, I felt like I needed to do something bigger. And the big problem that we all need to solve is how we build stuff. It's so wasteful.
That is a more meaningful direction. And there is a big problem that I have some intuition about how to go solve. And so I took what I understood, which is like: wow, it's really hard to find and fix these issues. We're wasting so much money, so much physical stuff. We are pumping so many chemicals into our rivers, we are burning so much energy. We are wasting human lifetimes doing things that don't matter. It's just wasted, and we need to figure that out. We need to change how we build. We need to change how we think about building products. So this is maybe a little sappy for your audience, but this is the true reason. It really made the change of thinking, away from "my job as an engineer is to make stuff" to "my job as a human is to figure out how to do the things we need to do, better."

So Anna decides she's going to try to make electronics manufacturing better, more efficient, less wasteful. And to do this, she and an engineer she worked with at Apple decide to start a company.
They call it Instrumental. And when it started, they had one big idea. Instead of forcing engineers like Anna to frantically run up and down the assembly line hoping they'd spotted every potential problem, they would put dozens of cameras up and down the line to capture the entire assembly process for every single device.

It's the idea that there is value in this data that today does not get collected. And so the actual core piece is: let's go get all the data.

That's the core innovation here: let's go get all the data. And the data in this case is visual data. What does everything look like as this phone or this watch or whatever is getting put together? Is that right?

That was the first data set that we went after, the image data set, because there isn't one. So we're creating a data set that didn't exist before. And, as they say, a picture is worth a thousand words. When you take a picture, you don't need to know in advance what you're going to be looking for in the picture.
It's a very high-resolution data set, and then you can come back with twenty-twenty hindsight and look at the picture and be like, oh, here's a problem in that picture. And so that was the initial concept: we need to find where these issues are, which means we need to see them. We work in the physical world, so we need to see them.

Remember I was talking about the antennas, where you had to carefully take them apart, and as you took them apart you'd sometimes damage the evidence you were looking for? Well, if you have images of them as they went together, then maybe you don't have to take them apart, or you have additional data about what they looked like as they were going together. But you don't know which ones are going to be failures in advance, so you take pictures of all of them, and then for the ones that ended up being failures after the fact, you have those images. So that was, like, the first idea.

So the basic idea is this.
You take pictures of the whole thing, and you identify, you know, whatever, the serial number on each, say, watch, each device that's going down the line. And then if there's one device that doesn't work, you say, show me all the pictures of that one, and then you say, how is that one different in the pictures from all the ones that do work?

Yes, yes.

But it's not just about figuring out what's wrong once you find a problem, right? Is it also about getting better at finding problems in the first place?

So, discovery. We're able to solve the discovery problem for our customers. We built algorithms that look for anomalies.

And what's an anomaly?

An anomaly is something that's different from the other ones. So if you build a population of units, they should all look the same.

Okay, a bunch of devices, right.

The ones that look different are probably interesting, and so we build algorithms to find the ones that look different.
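The population idea Anna describes, units should all look the same, so flag the ones that don't, can be sketched as a simple statistical check over station photos. To be clear, this is an illustrative sketch only, not Instrumental's actual algorithm; the image shapes, threshold, and demo data are all made up for the example.

```python
import numpy as np

def find_anomalies(images: np.ndarray, z_threshold: float = 4.0) -> list[int]:
    """Flag units whose station photo deviates most from the population.

    images: array of shape (n_units, height, width), grayscale photos taken
    at the same assembly station under fixed lighting (an assumption).
    Returns indices of units that look unusually different.
    """
    imgs = images.astype(np.float64)
    mean = imgs.mean(axis=0)        # per-pixel population average
    std = imgs.std(axis=0) + 1e-6   # per-pixel spread (avoid divide-by-zero)
    # Per-unit anomaly score: mean absolute z-score across all pixels.
    scores = (np.abs(imgs - mean) / std).mean(axis=(1, 2))
    # Flag units whose score sits far above the population's typical score.
    cutoff = scores.mean() + z_threshold * scores.std()
    return [i for i, s in enumerate(scores) if s > cutoff]

# Hypothetical demo: 100 near-identical units, one with a "missing" part.
rng = np.random.default_rng(0)
units = rng.normal(128, 2, size=(100, 32, 32))
units[42, 10:20, 10:20] = 0  # unit 42 is missing a component region
print(find_anomalies(units))
```

The point of the sketch is the workflow, not the math: no one told the algorithm what a "missing part" looks like; unit 42 simply stands out from the population, which is exactly the "one of these things is not like the other" discovery step.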
And that was kind of the first offering that we provided. And so our customers were able to use that combination of the photo record with the algorithms that would automatically highlight the things that were different and interesting, and they were able to use that to find issues.

Basically, one of these things is not like the other.

Yes, exactly. So, like: oh, this one's bent a little bit; this one, the part's supposed to be black and it's white. Is that a problem? Maybe not. Maybe we highlight those things. And then the next piece is: okay, well, what's causing the problems?

Okay, so you've built a system that is better at identifying, at finding, problems. So now, even if a device passes all the tests, your system will say, well, you want to check out this one, it's missing screws, because it just looks a little different.

Missing screws. Okay, so, yeah, like that: missing screws. There's no screw tester, you're right. Devices will actually be missing screws.
I'm so naive that I'm like, well, surely a device wouldn't be missing a screw.

But one hundred percent. It's, like, one of the top ten defects in production: missing or extra parts.

An extra screw, just to make up for the other one that's missing a screw. On average, they have the right number of screws.

Yes, on average they have the right number of screws. So: we need to be able to find issues, we need to be able to help engineers fix them, and then we need to help them make sure they don't come back. Those are kind of the three things that we have to do.

So, okay, those are the three key things Anna's company does to solve this big problem. And to tie those pieces together, she gave me a case study, a case study from a company that was trying to build a webcam. And the company was having trouble with the antenna. The antenna, which, as Anna told me earlier in the interview, is this crucial part that is surprisingly hard to get right.

They always fail. They are very sensitive parts, and all of our connected devices really rely on their antenna performance.
So you don't think about it, but you know it's an important part. And so we had a customer who had failing antennas, and they were able to take that group, that population of units, and actively root-cause it to three different things. And so this is actually the next piece of the puzzle, which is: we need to find root cause. And so in this case, the customer was able to see a couple of different things. The first thing they found is that a group of those failures actually had to do with the alignment of a connector that was connecting the antenna. So the connector, if it was shifted a little bit this way, it was good, and if it was shifted a little bit the other way, it was bad. And this was very subtle. This is not something they had a specification for. This is a discovery.

Okay, so the angle mattered, and they didn't know that in advance. Okay.

Now, this is also maybe too much in the weeds, but there are a lot of reasons: an antenna can fail in a lot of different ways.
So it's not just like, oh, if they're failing, they all have the same root cause. Actually, there could be a group of different causes that cause failures.

So you have two different phones from the same build, both of which have an antenna that doesn't work. It might be a different reason for each phone that the antenna doesn't work, which makes it harder to solve. So you solve one problem, but then some of the phones still don't work. Is that what you're saying?

Yes, yes, that's very common. They also found that the software version that they were using on the tester was correlated to the failures. So most of the failures were from one particular tester. So, in fact, it wasn't a bad antenna; it was the tester.

For some of those units, not all of them. Some of them still had the connector coming out at the wrong angle.

Yeah, yeah. And then the last one: in the same group of failures, there was another group of units where what the image-based algorithms found was a high correlation to the color of the circuit board.
And so color doesn't affect performance, but 325 00:18:02,876 --> 00:18:05,756 Speaker 1: it was a different vendor. One circuit board was like 326 00:18:05,796 --> 00:18:09,316 Speaker 1: a slightly different shade of a color than another, and 327 00:18:09,436 --> 00:18:11,796 Speaker 1: so that means 328 00:18:12,996 --> 00:18:15,476 Speaker 1: one of the vendors that made that circuit 329 00:18:15,476 --> 00:18:17,516 Speaker 1: board was more likely to have failures in it. So 330 00:18:17,596 --> 00:18:20,236 Speaker 1: that gives engineers then three different things they can go 331 00:18:20,356 --> 00:18:22,556 Speaker 1: chase down. They can go chase down a process 332 00:18:22,636 --> 00:18:24,876 Speaker 1: issue around the angle of the connector. They can go 333 00:18:24,996 --> 00:18:28,476 Speaker 1: chase down like the drifting test station and work on 334 00:18:29,436 --> 00:18:32,476 Speaker 1: calibrating the test stations. And then they can go and investigate 335 00:18:32,636 --> 00:18:35,636 Speaker 1: if there's something meaningfully different between these two vendors of 336 00:18:35,836 --> 00:18:38,316 Speaker 1: circuit boards. So they're able to continue building, but they 337 00:18:38,316 --> 00:18:41,236 Speaker 1: have now three specific things they can go do, whereas 338 00:18:41,276 --> 00:18:43,876 Speaker 1: without Instrumental they would kind of be guessing. So, like, 339 00:18:44,316 --> 00:18:46,436 Speaker 1: what's the sort of frontier for you? What's the thing 340 00:18:46,476 --> 00:18:48,916 Speaker 1: you're trying to do that you haven't quite figured out yet? Like, 341 00:18:48,916 --> 00:18:51,956 Speaker 1: what's the next problem to solve?
Yeah, I mean, so 342 00:18:52,036 --> 00:18:54,676 Speaker 1: it goes back to the reason for being for the company, 343 00:18:54,996 --> 00:18:57,876 Speaker 1: the reason for being being that twenty cents of every 344 00:18:57,876 --> 00:19:01,156 Speaker 1: dollar spent in manufacturing is wasted. We haven't solved that problem yet. 345 00:19:01,316 --> 00:19:04,516 Speaker 1: What does that require? We need to change how we 346 00:19:04,556 --> 00:19:07,036 Speaker 1: think about how we build things, how we design things, 347 00:19:07,116 --> 00:19:11,356 Speaker 1: the process of iterating through that development process, the process 348 00:19:11,396 --> 00:19:14,156 Speaker 1: of what happens when it goes into production, what happens 349 00:19:14,196 --> 00:19:16,316 Speaker 1: when you return units, and what do we do with 350 00:19:16,356 --> 00:19:20,796 Speaker 1: that information from your return as a consumer. And today 351 00:19:20,836 --> 00:19:23,836 Speaker 1: we're really not doing that as an industry. We're not gathering 352 00:19:23,996 --> 00:19:27,276 Speaker 1: data in a meaningful way, 353 00:19:27,436 --> 00:19:31,356 Speaker 1: we're not thinking proactively. As an engineer, if I'm designing something, 354 00:19:31,396 --> 00:19:33,996 Speaker 1: I'm thinking about, you know, the next build and production, 355 00:19:34,036 --> 00:19:38,276 Speaker 1: but I'm not necessarily thinking about, like, how 356 00:19:38,316 --> 00:19:40,996 Speaker 1: can I develop this product so that I can capture the 357 00:19:41,036 --> 00:19:44,316 Speaker 1: most data from it the fastest, earlier in the 358 00:19:44,356 --> 00:19:47,636 Speaker 1: process versus later.
Like everybody knows their 359 00:19:47,716 --> 00:19:50,156 Speaker 1: data is valuable and there's stuff in there that could 360 00:19:50,156 --> 00:19:52,876 Speaker 1: be valuable, but they don't know how to value it 361 00:19:53,116 --> 00:19:56,236 Speaker 1: because the problems haven't happened yet. When they're on fire, 362 00:19:56,316 --> 00:20:00,396 Speaker 1: quote unquote, and they are spending a million dollars a 363 00:20:00,436 --> 00:20:04,156 Speaker 1: month on returns, they know exactly what the value is 364 00:20:04,196 --> 00:20:07,356 Speaker 1: in the data that they don't have. Yes. But when 365 00:20:07,396 --> 00:20:10,116 Speaker 1: they haven't started the program yet and they don't know 366 00:20:10,156 --> 00:20:12,836 Speaker 1: what fires they're going to have, they don't know how 367 00:20:12,836 --> 00:20:16,356 Speaker 1: to value having the data and, like, preventing those problems 368 00:20:16,356 --> 00:20:18,516 Speaker 1: in the first place. The vision is that that data 369 00:20:18,596 --> 00:20:21,956 Speaker 1: is enough, and then if you figure out how to 370 00:20:21,996 --> 00:20:24,476 Speaker 1: harvest that data, you can actually build lines that improve 371 00:20:24,516 --> 00:20:27,756 Speaker 1: themselves, and then you eliminate the waste. So this new, 372 00:20:27,796 --> 00:20:30,196 Speaker 1: different way of doing things, like, can you just tell 373 00:20:30,196 --> 00:20:32,876 Speaker 1: me sort of specifically, like, a few details? What would 374 00:20:32,876 --> 00:20:35,036 Speaker 1: it be like, what would it look like, what would 375 00:20:35,076 --> 00:20:38,716 Speaker 1: be happening in this world? Everything gets built cheaper, everything 376 00:20:38,756 --> 00:20:42,996 Speaker 1: gets built with less waste.
And when that happens, we 377 00:20:43,076 --> 00:20:45,716 Speaker 1: have maybe cheaper products, maybe more 378 00:20:45,716 --> 00:20:49,716 Speaker 1: profitable companies, we have less waste going into the world, 379 00:20:49,956 --> 00:20:53,756 Speaker 1: less physical waste, chemicals in rivers, less energy use, less 380 00:20:53,796 --> 00:20:58,636 Speaker 1: human lifetimes wasted. And it's just thinking completely differently, 381 00:20:58,756 --> 00:21:02,916 Speaker 1: like, that manufacturing actually is a machine itself, and that 382 00:21:02,956 --> 00:21:05,356 Speaker 1: we need to optimize that whole process as a machine 383 00:21:05,356 --> 00:21:08,636 Speaker 1: itself versus a means to justify the ends of, like, 384 00:21:08,836 --> 00:21:10,836 Speaker 1: we need to get products out the other 385 00:21:10,916 --> 00:21:14,236 Speaker 1: end of the line. In a minute, the Lightning Round: 386 00:21:15,116 --> 00:21:18,316 Speaker 1: Anna tells us what mechanical engineers know about the world 387 00:21:18,876 --> 00:21:30,356 Speaker 1: and the inefficiencies out in the wild that grind her gears. Now, 388 00:21:30,436 --> 00:21:34,116 Speaker 1: let's get back to What's Your Problem. Great, let's do 389 00:21:34,156 --> 00:21:37,996 Speaker 1: the Lightning Round. Are you ready? What's one piece of 390 00:21:38,036 --> 00:21:41,276 Speaker 1: advice you'd give to someone trying to solve a hard problem? 391 00:21:41,356 --> 00:21:45,796 Speaker 1: I love that question. I love it. I had a 392 00:21:45,836 --> 00:21:47,716 Speaker 1: science teacher in high school who taught me how to solve 393 00:21:47,756 --> 00:21:50,236 Speaker 1: hard problems because I was working on science research. I'm 394 00:21:50,276 --> 00:21:53,916 Speaker 1: a science fair kid, and he taught me to break 395 00:21:53,956 --> 00:21:56,996 Speaker 1: those problems down into manageable pieces.
So like, if you 396 00:21:56,996 --> 00:22:00,836 Speaker 1: can take smaller steps in understanding a problem, 397 00:22:00,916 --> 00:22:04,476 Speaker 1: small steps you can actually do, then it becomes easier to solve. 398 00:22:04,956 --> 00:22:08,276 Speaker 1: So as somebody whose job is sort of to find 399 00:22:08,516 --> 00:22:12,116 Speaker 1: errors and inefficiencies in the world, is there some 400 00:22:12,276 --> 00:22:15,076 Speaker 1: domain, something you encounter in your daily life, that you 401 00:22:15,196 --> 00:22:19,956 Speaker 1: just really want to optimize? Inefficiencies in things that 402 00:22:20,036 --> 00:22:25,276 Speaker 1: cause lines really irk me, like at the airport or 403 00:22:25,276 --> 00:22:27,956 Speaker 1: in like a restaurant, like anything that feels like a 404 00:22:28,036 --> 00:22:31,796 Speaker 1: non-optimal kind of scheduling that causes 405 00:22:31,796 --> 00:22:34,276 Speaker 1: a line of humans. That is something that 406 00:22:34,356 --> 00:22:37,316 Speaker 1: I notice, and it just like really grinds my gears. 407 00:22:37,796 --> 00:22:40,716 Speaker 1: Of all your patents, which one's your favorite? I'm gonna 408 00:22:40,796 --> 00:22:44,196 Speaker 1: go with my first patent, more because of what it means. 409 00:22:44,756 --> 00:22:48,476 Speaker 1: So I share my first patent with the person who is 410 00:22:48,516 --> 00:22:52,756 Speaker 1: now my co-founder, Sam Weiss, and we met in 411 00:22:52,916 --> 00:22:55,276 Speaker 1: two thousand and nine at Apple, and that's actually the 412 00:22:55,316 --> 00:22:57,876 Speaker 1: summer that we invented it.
It was our mutual 413 00:22:57,916 --> 00:23:02,236 Speaker 1: first patent, and we had to build essentially a switch 414 00:23:02,396 --> 00:23:04,636 Speaker 1: very similar to, you know, on the side of an 415 00:23:04,636 --> 00:23:07,436 Speaker 1: iPhone there's like a little ringer switch that you can 416 00:23:08,236 --> 00:23:10,556 Speaker 1: kind of flip up and down. So we had to build 417 00:23:10,556 --> 00:23:14,636 Speaker 1: a switch like that that had a very small form 418 00:23:14,676 --> 00:23:16,756 Speaker 1: factor, like it needed to fit in a weird shape. 419 00:23:17,076 --> 00:23:19,556 Speaker 1: And so that was our first patent. That patent will 420 00:23:19,596 --> 00:23:21,476 Speaker 1: never see the light of day. There will never be 421 00:23:21,516 --> 00:23:24,276 Speaker 1: a switch made in that design, but it 422 00:23:24,356 --> 00:23:29,476 Speaker 1: was cool. That's awesome. What do mechanical engineers know about 423 00:23:29,516 --> 00:23:32,276 Speaker 1: the world that nobody else really gets? That everything is 424 00:23:32,316 --> 00:23:35,956 Speaker 1: imperfect and different from every other thing. Is there a 425 00:23:35,956 --> 00:23:39,356 Speaker 1: particular piece in, say, my iPhone that is the one 426 00:23:39,436 --> 00:23:42,796 Speaker 1: that has caused the most manufacturing problems, the 427 00:23:42,796 --> 00:23:44,756 Speaker 1: most problems on the assembly line? Is there some piece 428 00:23:44,796 --> 00:23:47,116 Speaker 1: that's like, oh, that piece is the one? I never 429 00:23:47,156 --> 00:23:49,676 Speaker 1: worked on phones, so I don't know specifically, but in 430 00:23:49,716 --> 00:23:57,116 Speaker 1: general in products: the antenna, anything with glue, and 431 00:23:57,836 --> 00:23:59,876 Speaker 1: anything having to do with water. So those are the 432 00:23:59,916 --> 00:24:02,516 Speaker 1: three things.
It's like, antennas are hard, displays 433 00:24:02,516 --> 00:24:06,756 Speaker 1: are hard too, but like, so basically everything. Is there 434 00:24:06,836 --> 00:24:10,476 Speaker 1: some trick that mechanical engineers use when something isn't working, 435 00:24:10,516 --> 00:24:13,476 Speaker 1: like a remote control or whatever? I always think about, 436 00:24:13,556 --> 00:24:15,956 Speaker 1: is it a hardware problem or a software problem? If 437 00:24:15,956 --> 00:24:19,676 Speaker 1: it's a software problem, reboot it. If it's a hardware problem, 438 00:24:19,716 --> 00:24:21,876 Speaker 1: I don't know, check the batteries, make sure it's 439 00:24:21,916 --> 00:24:29,716 Speaker 1: plugged in. Anna-Katrina Shedletsky is the founder and 440 00:24:29,876 --> 00:24:34,196 Speaker 1: CEO of Instrumental. I have a request for you this week, 441 00:24:34,356 --> 00:24:37,716 Speaker 1: and it is this: please let me know who you 442 00:24:37,756 --> 00:24:40,396 Speaker 1: want to hear on this show. You can email me 443 00:24:40,556 --> 00:24:44,956 Speaker 1: at problem at Pushkin dot fm. That's problem at Pushkin 444 00:24:45,036 --> 00:24:48,596 Speaker 1: dot fm. Or you can tweet at me at Jacob Goldstein. 445 00:24:49,756 --> 00:24:53,796 Speaker 1: Today's show was edited by Robert Smith, produced by Edith Russolo, 446 00:24:53,956 --> 00:24:57,636 Speaker 1: and engineered by Amanda K. Wong. I'm Jacob Goldstein, and 447 00:24:57,676 --> 00:25:00,036 Speaker 1: I'll be back next week with another episode of What's 448 00:25:00,036 --> 00:25:04,476 Speaker 1: Your Problem.