This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

Ritholtz: This week on the podcast, I have an extra special guest. His name is Benedict Evans, and he is a partner at the famed technology venture capital firm in Silicon Valley, Andreessen Horowitz, perhaps known to the aficionado as a16z. I actually have some swag from Andreessen Horowitz: I have a blue a16z hat which matches my car, and I just leave that in the car when I'm driving around, trying to look cool to a very select group of tech geeks. If you are at all interested in the development of technology, of ecosystems, of autonomous everything, of smartness, of really where the puck is going to be in the future of technology, you will find this to be a fascinating conversation. He absolutely has an encyclopedic knowledge of what's taken place and why, and has tremendous insight into the likely direction this space is going. So, with no further ado, my conversation with Andreessen Horowitz's Benedict Evans.

I'm Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio. My special guest today is Benedict Evans. He is a partner at the legendary venture capital firm Andreessen Horowitz, where he works as an analyst and does a weekly consumer newsletter, blog, what have you, covering everything from mobile platforms to AI to autonomous everything. He is also a recommended follow. He began his career as an equity analyst in investment banking, moving on to strategy and business development at such august firms as Orange and NBC Universal. Benedict Evans, welcome to Bloomberg.

Evans: Thank you for having me.

Ritholtz: Do you prefer Benedict or Ben?

Evans: Benedict is better.

Ritholtz: Okay, for sure. There is an advantage to having an unusual first or last name for SEO purposes. I'm convinced if my name was Barry Smith, no one would know who I am. But we can talk about SEO a little later.
So you graduated Cambridge right in the midst of the dot-com and technology boom, not too long before the bust. What was it like coming out into the banking world in the midst of that era?

Evans: Well, so I actually joined an equity capital markets team, so we were doing European tech IPOs. And for about nine months I thought, this is going to be a fun career, right? This is easy. You just throw it out there and everybody buys it. Here's some random thing, and a sovereign wealth fund puts in an order for three times the float. And then, in about early spring of 2000, everything started going down. And we sort of forget now, when we talk about the dot-com bubble, that, particularly in Europe, there was also a mobile bubble, and there was also a fixed-line broadband fiber bubble. So you kind of had three bubbles that had happened more or less independently, for different reasons, and then merged and combined into one enormous thing. And then the market went down for three years. I would argue you had a semiconductor bubble, you had a software bubble, you had a hardware bubble, you had a storage bubble; it all started as a bandwidth bubble. In the US, you had a fiber and telecom bubble, you had a tech bubble, but you also had a mobile bubble, which, for very different reasons, was happening at the same time, and then there was a fiber bubble. And so they all combined, and we got this enormous kind of explosion and then an enormous pop, and the market went down without stopping, more or less, for three years.

Ritholtz: So were you always the tech guy, or was your background more finance?

Evans: Well, so I did my degree in history, which is really an analytic subject.
So, here is a bunch of information; try and work out something interesting or meaningful or insightful to say about it, something original. And so I went from that into a fairly conventional route into investment banking, and quite quickly decided that ECM was kind of project management, whereas research was probably something I was likely to be better at. So I went into research and covered European mobile stocks for a couple of years.

Ritholtz: And was that in London, or New York and London? Did you spend any time in New York, or did you go straight from London to the Valley?

Evans: No. I mean, I came to New York for work, and I worked for NBC Universal, so I used to come over here, but the first time I actually worked properly in the States was when I went to work for a16z.

Ritholtz: Oh, so London, with really just brief sojourns in New York City.

Evans: Yeah. I mean, I worked for Orange, so I went to Paris, and I worked for NBC Universal, so I came to New York, but my job was based in London.

Ritholtz: And so you're covering telecom stocks, you're covering mobile, you're covering a lot of the technology-related space as an analyst. What was that like in the midst of wild overvaluation and collapsing share prices?

Evans: So it's kind of interesting. There are a couple of different things there. One of them is, it's interesting to look at those companies in hindsight, in that they were the go-go growth, exciting, dynamic, sexy, disruptive companies. Mobile had gone from nothing to, oh my god, everyone on Earth is going to have one of these things, and they really kind of thought of themselves as poets: they connected everybody, and that was it.
And you know, there's obviously still a lot of stuff going on in emerging markets, but, like, everyone got a phone and that was it. And then they went from being amazing, they went from being Google, to being water companies, in like a year or two, in effect. But you look at all the stuff we do on our smartphones now; this was all in concept videos and presentations from the mobile operators: you're going to do all of this stuff, and they were going to do it all. And of course it all happened, but it wasn't done by them.

And so there are a lot of echoes of that. For example, if you look at cars now, you have all this futurology of what's going to happen and what cars will be and how everything will change, and you have the renderings of the glass cars, which are like the renderings of the glass phones in 2000, when there were no phones with color screens. And you think, well, you might get like three quarters of this right, but the last quarter of it is going to be where all the money is. And that's the difference between it being Nokia and Microsoft, and it being companies you'd never heard of, like Google, or had forgotten, like Apple; and no one had ever heard of Google. And so it's interesting to look back and think, how did we think about what was going to happen then, how much of it was right and wrong, and what would you have had to have said then in order to get it right?

Ritholtz: So you're describing a form of historical futurism, because what we've seen over the years is that expectations of the future turn out to be wildly either over-optimistic or pessimistic, but rarely right on the nose. Is that fair?

Evans: I think that's right. I mean, as I said, I think there are several bits to it.
There are several lessons you can learn, not so much from the bubble per se, but from the way people thought about it. There was this whole idea in the early nineties of this thing called the Information Superhighway, and just the name conveys the fact that it was going to be kind of centrally controlled: it would be the cable company and News Corporation and Twentieth Century Fox and the New York Times, and they would kind of get together every six months and decide what you were going to have and how. And of course what we had instead was permissionless innovation and the open Internet, and there was no central authority, and anyone could do what they wanted, at least to some extent. And so it's interesting to compare that with the way people talk about cars now, or the way people talk about what's happening to TV now.

But, as I just said, the fact that you could have guessed like eighty or ninety percent of it and still have missed the important parts is also fascinating. Like, you could have said in 2000, okay, everyone will have one of these things, everyone will have the Internet on their phone, it will be a real operating system and not a feature phone, it will be the open Internet, therefore it will not be the carriers; you would have described today perfectly. But you would still have said, well, then it'll be Microsoft and Nokia that do that.

Ritholtz: What's amazing is there was this AT&T commercial I very vividly remember. I think Tom Selleck did the voiceover, and he talks about what the future will be like, and I think the punchline was, on the beach, "you will." And it turned out to be dead right, except for the fact that AT&T has more or less nothing to do with it.

Evans: Exactly.
One of the ways I describe this when I talk to telcos now is, it's as though a municipal water company looked at the mineral water business and said, you know, come on, we've got brand, we've got water, we're trusted, you wouldn't buy water from a company you don't trust, we should be doing this. And they hire McKinsey and Wolff Olins and they build the whole thing, and like two years later that first pallet of water rolls into Walmart and goes onto the shelves, and they look at it next to the other two hundred brands and they think, hang on a second, something's not quite right here.

Ritholtz: So how did you go from working in London or Paris as a telecom analyst to, I know, I'm going to move six thousand miles away and become an imagineer?

Evans: I just asked them for a job. I don't know why people say it's hard to get a job in venture. You know, you go ask them for a job, they say yes, okay, that's it.

Ritholtz: Yeah, it was easy. How did you first hear of them?

Evans: Well, so there's a process here. As I mentioned, I actually did my degree in history, which is analysis, and so the question was picking out things where you can apply that analysis. And so I left university and became a sell-side analyst, and, as many people will know, that, for various reasons, stopped being a fun thing to do, or even a thing that anyone could do. I think when I was in telecoms, Merrill Lynch had something like fifty telecoms analysts, and the last time I looked, I think they had less than half a dozen. So that industry ceased to be fun. And so I left and went and worked in strategy, first at Orange and then at NBC Universal and Channel 4, again both in London.
So I went from sitting on the outside looking in to sitting on the inside, trying to work out, what do we do, and how do we understand this? And it's kind of interesting, because the questions change: suddenly you've got a spreadsheet that lists all the stuff you've been trying to work out from the public financials, but then you're trying to work out different stuff. The thought process is the same, though. So I did that for a while. I left NBC Universal the week that General Electric's share price fell fifty percent, which is obviously a direct causal relationship.

Ritholtz: Sometimes, you know, things happen and you can just draw a straight line from A to B.

Evans: Exactly.

Ritholtz: I recall seeing the Wall Street Journal piece: Benedict Evans departs NBC Universal.

Evans: Exactly, and then it went up by half, by the same amount, the next week, of course. So I left in the financial crisis, yeah, right around then; an old and boring story of what happened there. And then I worked as a consultant in London, doing a lot of strategy consultancy and producing research reports around the European media and telecom space. So I was writing both about what's happening in fashion magazines and, you know, record retail and DVD retail, and also, what are Google and Facebook doing, and what's happening with smartphones. And so I started a blog and, at more or less the same time, went on Twitter, which was kind of new then, so it was quite easy to get noticed if you were doing stuff. The way I describe what I was writing is, basically, I would write stuff that a sell-side analyst or a senior strategy person or somebody at Apple could write, except that either they couldn't publish it, or they would be writing for a very different audience.
They'd be writing for, you know, a buy-sell-hold audience, or they wouldn't have the analytic background to write it. So a sell-side analyst could write this stuff, but they're writing for a different audience. A senior person at a company knows all of this stuff, but they are not used to writing it, and they can't publish it anyway. Somebody at McKinsey knows all this stuff, but they're not allowed to say it, so that's not going to work. So I was in an interesting kind of niche, a little segment of the Venn diagram: somebody who had an analytic and strategy background, was used to writing about this stuff and explaining it, and could say it in public, at a time when very few other people were doing it. And so, for those reasons, I sort of got noticed. You know, I spent like two or three years writing blog posts and getting like a hundred page views a month, and then I went through a period where I was getting a couple of thousand page views a day; that happened over the course of 2013. And then I kind of picked up my bag and thought, well, what else do I want to do? What do I want to do next? Where could I deploy this? You have those conversations at various stages in your life. And I thought, well, I should go and do this in venture capital. And where's the place to go and do venture capital? The global cluster is San Francisco. And what was the kind of firm where it felt like this kind of innovative approach, explaining things and adding value in public, would work? And a16z was at the top of that list.

Ritholtz: And you reached out to them, as opposed to vice versa?

Evans: So I got a couple of introductions, and I went in and said, well, this is sort of what I do. Is there a place for this in this firm? Does this fit within the firm?
Is it useful? And they kind of said yeah, just like that, more or less.

Ritholtz: So lots of other people, bloggers, should be inundating a16z with their resumes and saying, look, I've been doing this also.

Evans: You know, I don't think you need a resume. It's kind of like, if you want a sales job, you should be able to get the meeting; if you can't get the meeting, you shouldn't have a sales job. And it's kind of the same: you prove that you can do the job by doing it.

Ritholtz: That makes a lot of sense. So the focus that you have today is no longer telecom the way it was, but certainly the concept of mobility as a source of possible changes in technology is a key factor. What do you focus on these days?

Evans: Well, I think talking about mobile now is a bit like talking about PCs ten years ago. Yes, this is the center of everything, but it's happened, and so we're not arguing about iOS versus Android, or whether everyone's going to have one of these things, or apps versus the web; those are not interesting conversations anymore. It's like arguing, is everybody going to get onto the web? Well, yes. Now what? Next question. So one answer is, well, what are the next questions, what are the next kind of mega-trends that are happening? The second is, there's a lot of conversation around what happens with the stuff that we've already built. Ten years ago it was, what do we do with broadband and browsers? Well, let's talk about search, let's talk about social, let's talk about what happens once everyone has a PC. Now: what happens once everyone has a smartphone and everyone is on Facebook? So we have those kinds of conversations about the world we've just built and what happens with that, and then we think, well, what are the next things?
So machine learning, autonomous cars, mixed reality, cryptocurrency: what are the next fundamental trends that will shape the tech industry the way first the PC and then mobile shaped it over the last twenty or thirty years? And then within that, for what I try to do, there's a question of looking for the arguments, or looking for the questions. What are the places where there's an "on the one hand, on the other hand"? Is it going to look like this, or is it going to look more like that? It's not, will there be autonomous cars, yes or no, or will it be in fifteen or twenty years; who knows. It's more like, are there going to be winner-takes-all effects in this? Is it going to look like Android, or is it going to look like ABS, where there's a widget that everyone buys and it doesn't matter? What's going to happen in mixed reality: is this going to be something from every hardware manufacturer, or is it going to be super concentrated? How do we think about machine learning, and how do we try to understand what that might change inside big companies? So there's, what are the topics, and then, how do we work out what the useful questions to talk about might be within that?

Ritholtz: So you're less extrapolating current trends out to some future date, and instead thinking about, well, here's where we are; several steps beyond that, what are going to be the subsequent developments this might lead to? You're really several steps ahead.

Evans: That's right. I mean, that's partly a consequence of the state of the industry. Five years ago, okay, it was a horse race: who's going to win, is it Apple and Google, what's going to happen, is Apple going to survive even though Android has got this open story and this open-source story and so on? So is there going to be a win for Apple?
Is there going to be a win for a third entrant? What will happen to BlackBerry; will BlackBerry cling on in a niche? Is Windows going to be able to break in? So all of that was the...

Ritholtz: I'm sorry, Windows? Yeah, I don't recall that.

Evans: Yeah, it's a painful subject for some people. So it was a horse race.

Ritholtz: Steve Ballmer in particular.

Evans: Yeah, he threw chairs. But that was calling the race, is that the American phrase? You were trying to work out what was going on. Now you're not; we know what happened. And we don't have those kinds of day-by-day tactical questions around autonomous cars, because there aren't any autonomous cars yet. We don't have those kinds of day-by-day questions around mixed reality, because you can't buy a mixed reality headset yet, or mixed reality glasses yet. So the character of the questions changes, because we're at a different point in the S-curve. Basically, when the S-curve is going near vertical, you're trying to work out, oh my god, what's going on, is the rocket ship going to blow up, when is it going to start flattening out? Where we are now is that the old S-curve has flattened out right at the top of the scale, and we're talking about what you build on top of it, and the new S-curves are kind of under the radar, if I can mix my metaphors. One of the ways I describe this, and it's a good Manhattan metaphor, is that you walk past a construction site every day for six months, and there's a bunch of construction workers kind of standing around scratching their backsides, not doing anything, and you think, all these guys are lazy. Then you walk past on Monday morning and they've put fifteen stories of steel frame up, and you think, wow, they were really busy over the weekend.
Ritholtz: You're missing the slow...

Evans: And then there's a period at the end where they're putting the facade on and doing all the fit-out, and again that looks boring. And so mobile is at the stage where you're putting the facade on and doing the fit-out, and the other stuff is still kind of holes in the ground, and you can't really see what's going on. And so that means the character of the questions changes.

Ritholtz: In the early stages of investments, there is no data, there's no discounted cash flow model. You're really dealing with two founders and a PowerPoint presentation and some numbers which are more or less best guesses.

Evans: Yeah. It's an interesting shift from being an equity analyst, because you're right at one end of the risk profile; you know, someone buying T-bills is at the other end. You're at the stage where half of the deals you do will return less than invested capital, and that's the plan, and five percent will produce more than a ten-x return, and that gets you a kind of three-x return over ten years. And so everything you do has to be capable of being amazing, and if it's capable of going from two people and a PowerPoint to being an amazing thing worth hundreds of millions or billions of dollars, it kind of has to be implausible and crazy, and there have to be a bunch of reasons why it might not work. And therefore what you have to do is suspend disbelief and not think, here are all the reasons why this might not work, but: what if this did work, what would it be, and are these the people who can make that happen?

Ritholtz: Which is a very, very different approach than thinking about, all right, will this new widget or this new management team or whatever sell enough to move the earnings-per-share calculus of this publicly traded company.

Evans: Exactly.
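To make the arithmetic Evans sketches here concrete, here is a minimal illustration in Python. Every number in it (fund size, number of checks, outcome multiples) is an assumption chosen only to match the rough shape he describes, not a figure from the conversation:

# Rough sketch of the venture return profile described above: about half the
# deals return less than the capital invested, roughly 5% return more than
# 10x, and the portfolio as a whole still works out to around 3x.
# All figures are illustrative assumptions.
checks = 20            # hypothetical number of equal-sized investments
check_size = 1.0       # in millions of dollars

outcomes = (
    [0.3] * 10         # half return well under invested capital
    + [1.5] * 9        # most of the rest roughly return their money
    + [40.0] * 1       # one outsized (>10x) winner, about 5% of the portfolio
)

invested = checks * check_size
returned = sum(multiple * check_size for multiple in outcomes)
print(f"Invested ${invested:.0f}M, returned ${returned:.1f}M, "
      f"{returned / invested:.2f}x gross")
# With these assumed numbers the single winner carries the fund to ~2.8x,
# which is why every check has to be capable of being that winner.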
I think the key way to think about Silicon Valley is that it's a machine for running experiments, and most of the experiments won't work, and that's the plan. And yes, you could run an experiment that's clearly a terrible idea and was never going to work, and you could mess it up and blow up your lab, and people will look down on you for that. But no one will look down on you for the fact that you ran an experiment and it produced a negative result; that's just, okay, well, we tried. Did you run the experiment right? That's a different question. But you ran the experiment, it didn't work, okay, that's fine. And the ones that do work justify the whole exercise and pay for the whole exercise, and produce the mobile phone, produce Apple and Google and so on.

Ritholtz: So let's zoom in on that a little bit. When you're at this side of the funnel, when you're looking at these really early-stage companies, if half of them are effectively money losers for you, for the firm, as an investment, how do you conceptualize what you're looking at? Is it the founders themselves? Is it the idea? Is it something that, hey, this is just so crazy it might work? What is the thought process like, especially where you sit, where there's an endless stream of people coming to Silicon Valley to pitch ideas to VCs?

Evans: Well, I think most venture capitalists look at this stuff in mostly the same way; you can argue a little bit about emphasis. But there is the question of, what is the market opportunity here, and are these the people who are going to be able to work out, find, discover that market opportunity? Because it's never going to be exactly the original idea; it's going to be something adjacent to that, that you kind of twist around and find.

Ritholtz: Pivot, shall we say?

Evans: Not so much that; more iterate,
I would say. A pivot is more like, okay, we built a whole company on that premise and it didn't work at all, so now let's try something else. That's not really the same thing. It's more, well, we were kind of there, and we moved around, and we found what worked. There's this whole concept in Silicon Valley of product-market fit, and so you have a product and you're trying to work out, well, what would the market be, what would the product be, around this space, until you find something that meshes and takes off. And so the earlier you are, the more, in a sense, you're betting on the ability of the founders to find that. But you're also betting on, is this a great market, and is there some angle or some way that they're going to find in order to turn this into a thing?

Ritholtz: So what I'm hearing from you is future growth prospects, and valuation is...

Evans: Well, so, it depends. But the more progress you make and the more numbers you have, the more you start looking at metrics and the less you start thinking about potential. So at the super, super early stage, the valuations all tend to look the same. As you go further in, you start getting much more specific about, well, how well are they doing, and what do we think this looks like. Then it gets case by case, and, you know, you get up to a company that's got billions of dollars of revenue, and then you're doing DCFs and you're doing multiples like anybody else. But at the super early stage, doing a DCF on two people with a PowerPoint is just an exercise in self-deception. Sure, you could do it. You could say, well, if they manage to sell this thing to two billion people, there will be some value, and try to do a DCF on that. But what does that get me? It doesn't tell me anything. You're
It doesn't tell me anything. You're 494 00:22:48,640 --> 00:22:53,240 Speaker 1: just making Yeah, and the things that really work, um, 495 00:22:53,400 --> 00:22:55,280 Speaker 1: in a sense, it doesn't kind of matter if this 496 00:22:55,400 --> 00:22:57,160 Speaker 1: is you know, if you've got you know, take give 497 00:22:57,160 --> 00:22:59,960 Speaker 1: me a kind of hypothetic hypothetical example. You've got to 498 00:23:00,000 --> 00:23:03,040 Speaker 1: fifty million dollar seed fund and you make a significant 499 00:23:03,040 --> 00:23:05,639 Speaker 1: investment in something that turns out to be worth fifty 500 00:23:05,640 --> 00:23:08,560 Speaker 1: billion dollars. Does it really matter if it's worth forty 501 00:23:08,560 --> 00:23:11,879 Speaker 1: billion or sixty billion. The amount of your assist in 502 00:23:11,920 --> 00:23:13,960 Speaker 1: the start, the amount of the amount that it's returned 503 00:23:14,000 --> 00:23:16,600 Speaker 1: your fund is sign is sufficient that those numbers don't 504 00:23:16,600 --> 00:23:21,280 Speaker 1: really make any difference. So so that that's really fascinating 505 00:23:21,280 --> 00:23:23,959 Speaker 1: and end Mark and reason when when we said then 506 00:23:24,040 --> 00:23:28,440 Speaker 1: had a conversation said something similar. But looking back at 507 00:23:28,440 --> 00:23:31,760 Speaker 1: a perspective of twenty years of doing this, and if 508 00:23:31,760 --> 00:23:33,359 Speaker 1: you go back and if you would have paid double 509 00:23:33,440 --> 00:23:36,600 Speaker 1: for everything, it wouldn't mean any difference whatsoever. Yeah, I 510 00:23:36,600 --> 00:23:39,040 Speaker 1: think there's there's a famous investor whose name I think, 511 00:23:39,359 --> 00:23:41,240 Speaker 1: I think I should think it was somebody in Hollywood, 512 00:23:41,240 --> 00:23:43,160 Speaker 1: of famous producer. He said something, if I had said 513 00:23:43,200 --> 00:23:44,800 Speaker 1: yes to all the ones I said no to, and 514 00:23:44,840 --> 00:23:46,200 Speaker 1: no to all the ones I've said yes to, you, 515 00:23:46,200 --> 00:23:48,359 Speaker 1: I would have come out exactly the same place. So 516 00:23:48,400 --> 00:23:49,880 Speaker 1: there's always, you know, this is kind of the index 517 00:23:50,040 --> 00:23:52,200 Speaker 1: indexing story. There's always you can always kind of push 518 00:23:52,240 --> 00:23:55,200 Speaker 1: these arguments and say, well then we should just nobody 519 00:23:55,240 --> 00:23:57,240 Speaker 1: knows anything, And there's a little bit more to it 520 00:23:57,280 --> 00:24:01,200 Speaker 1: than that. Let's talk a little bit of out technology. 521 00:24:01,760 --> 00:24:04,320 Speaker 1: You you have a quote I really like, you have 522 00:24:04,400 --> 00:24:07,760 Speaker 1: several quotes I really like. Um, And let's start with 523 00:24:08,080 --> 00:24:11,120 Speaker 1: one or two of thesegncy where they go. All social 524 00:24:11,200 --> 00:24:14,080 Speaker 1: apps grow until you need a news feed or news 525 00:24:14,119 --> 00:24:17,960 Speaker 1: news feeds grow until you need an algorithm. All algorithmic 526 00:24:18,000 --> 00:24:20,560 Speaker 1: feeds grow until you get fed up seeing the wrong stuff. 527 00:24:20,920 --> 00:24:25,000 Speaker 1: And leave for a new app with less information overload. 
528 00:24:25,280 --> 00:24:28,960 Speaker 1: So is the nature of all tech Rinse, lather, repeat 529 00:24:29,080 --> 00:24:34,000 Speaker 1: and and something becomes an incumbent becomes successful and the 530 00:24:34,040 --> 00:24:35,960 Speaker 1: new guys are just going to come up and eat 531 00:24:35,960 --> 00:24:38,440 Speaker 1: their launch. So I have notes through a blog post 532 00:24:38,600 --> 00:24:42,920 Speaker 1: in my phone which is called something like um technology determinism. 533 00:24:42,960 --> 00:24:45,119 Speaker 1: And so there are as it might be half a 534 00:24:45,160 --> 00:24:48,520 Speaker 1: dozen things which are just steady processes, and now half 535 00:24:48,560 --> 00:24:51,120 Speaker 1: a dozen things that of cycles. So steady process would 536 00:24:51,119 --> 00:24:53,720 Speaker 1: most obvious one would be More's law. Uh, there was 537 00:24:53,760 --> 00:24:56,920 Speaker 1: a process of technology news from research labs to start 538 00:24:57,000 --> 00:24:59,600 Speaker 1: ups to pick companies. Um. You know, you can imagine 539 00:24:59,600 --> 00:25:01,960 Speaker 1: like kind of half a dozen of those. Um. Then 540 00:25:01,960 --> 00:25:04,439 Speaker 1: there are cycles. And so there are cycles where you 541 00:25:04,480 --> 00:25:07,159 Speaker 1: go from bundling to unbundling, You go from the client 542 00:25:07,280 --> 00:25:10,639 Speaker 1: to the server and back again. You go from the 543 00:25:10,720 --> 00:25:14,760 Speaker 1: public markets. We had conglomerates and then deconglomerization and then 544 00:25:14,800 --> 00:25:19,480 Speaker 1: re conglomerates. Exactly exactly. Colleague, you at Steven Sinofsky who 545 00:25:19,680 --> 00:25:22,400 Speaker 1: used to run office. He says, all products expand until 546 00:25:22,400 --> 00:25:25,919 Speaker 1: they can edit photographs. That's very funny, like word can 547 00:25:26,080 --> 00:25:28,840 Speaker 1: edit photographs, can edit photographs? All products expand until they 548 00:25:28,840 --> 00:25:31,159 Speaker 1: can edit it. That's the endpoint of software. And so 549 00:25:31,200 --> 00:25:33,200 Speaker 1: there were kind of there and it's a comment about, 550 00:25:33,200 --> 00:25:34,800 Speaker 1: you know, feature creep or whatever it is you want 551 00:25:34,800 --> 00:25:36,359 Speaker 1: to say. And so there were these kind of inevitable 552 00:25:36,400 --> 00:25:39,080 Speaker 1: processes or inevitable pieces of logic that kind of flow through. 553 00:25:39,280 --> 00:25:41,560 Speaker 1: Now the one that you were sort of quoted, particularly 554 00:25:41,800 --> 00:25:44,040 Speaker 1: in sort of an observation about I'm actually I wrote 555 00:25:44,160 --> 00:25:45,720 Speaker 1: a blog post about this last week. It's sort of 556 00:25:45,720 --> 00:25:48,440 Speaker 1: a combination of dune Bar's number and zarka Berg's law. 557 00:25:48,840 --> 00:25:50,919 Speaker 1: Dunebar's number is like, you know, like a hundred and 558 00:25:50,920 --> 00:25:53,120 Speaker 1: fifty or two hundred people, well enough that you would 559 00:25:53,160 --> 00:25:56,200 Speaker 1: trend them on Facebook at the way least. And you've 560 00:25:56,240 --> 00:25:58,800 Speaker 1: got these social apps which make it easy to post 561 00:25:58,880 --> 00:26:01,920 Speaker 1: stuff and share stuff. And because it's one too many, 562 00:26:01,960 --> 00:26:03,960 Speaker 1: you're not emailing it to someone or texting it to someone. 
563 00:26:04,000 --> 00:26:05,680 Speaker 1: You can close quite a lot because you don't feel 564 00:26:05,720 --> 00:26:07,400 Speaker 1: like you're kind of imposing on people to do that. 565 00:26:07,960 --> 00:26:11,040 Speaker 1: But you've got two hundred friends and they post five 566 00:26:11,040 --> 00:26:13,680 Speaker 1: things a day, Okay, Now you've got a thousand things 567 00:26:13,680 --> 00:26:16,480 Speaker 1: a day in your news feed, and you can't read notes, 568 00:26:17,240 --> 00:26:19,359 Speaker 1: and so this is the logic that gets Facebook to 569 00:26:19,600 --> 00:26:22,880 Speaker 1: producing what is you know, what's called an algorithmic feed, 570 00:26:22,880 --> 00:26:25,240 Speaker 1: which is just like engineers speak for, let's try and 571 00:26:25,280 --> 00:26:27,000 Speaker 1: work out which of your friends we care about, and 572 00:26:27,040 --> 00:26:28,840 Speaker 1: maybe we should put those at the top. And let's 573 00:26:28,840 --> 00:26:30,600 Speaker 1: work out that you like these kind of things. You 574 00:26:30,640 --> 00:26:32,879 Speaker 1: don't really like new stories from the Guardian. You prefer 575 00:26:32,960 --> 00:26:34,560 Speaker 1: to look at pictures of baby So let's put the 576 00:26:34,560 --> 00:26:36,240 Speaker 1: babies in front of the new stories from the Guardian. 577 00:26:36,520 --> 00:26:37,959 Speaker 1: And you kind of you, you kind of you come 578 00:26:38,000 --> 00:26:41,160 Speaker 1: up and you create that. And so now you've got, 579 00:26:41,200 --> 00:26:43,600 Speaker 1: instead of, so to speak, a random sample, which is 580 00:26:43,640 --> 00:26:45,879 Speaker 1: I open the app, what have people posted in the 581 00:26:45,960 --> 00:26:47,919 Speaker 1: last hour, because I'm not going to scroll past the 582 00:26:47,960 --> 00:26:50,399 Speaker 1: last hour's worth of post So actually it's random, the 583 00:26:50,480 --> 00:26:52,520 Speaker 1: random point fact of being what time did open the app. 584 00:26:52,960 --> 00:26:56,280 Speaker 1: So that's the linear feed. The chronological feed is random. 585 00:26:56,320 --> 00:26:57,639 Speaker 1: So then you said, well, maybe we should put the 586 00:26:57,640 --> 00:27:00,480 Speaker 1: stuff that's important at the beginning. But then you think, okay, 587 00:27:00,480 --> 00:27:03,000 Speaker 1: but it's now you are arguing, well, why isn't the 588 00:27:03,000 --> 00:27:05,000 Speaker 1: Guardian all the New York Times at the top. That 589 00:27:05,000 --> 00:27:06,719 Speaker 1: should be up at the top, because that's important as 590 00:27:06,760 --> 00:27:09,240 Speaker 1: a public benefit to that and you got this wrong. 
591 00:27:09,480 --> 00:27:11,480 Speaker 1: My friend posted this thing I wanted to see, and 592 00:27:11,480 --> 00:27:14,280 Speaker 1: I didn't see it, and so you get that sense of, well, 593 00:27:14,320 --> 00:27:17,000 Speaker 1: maybe this isn't actually working, and you have Russians trying 594 00:27:17,040 --> 00:27:18,719 Speaker 1: to game it, and you have all sorts of problems 595 00:27:18,720 --> 00:27:21,240 Speaker 1: with trying to make that feed work, and so then 596 00:27:21,280 --> 00:27:23,000 Speaker 1: you then you can say, well, actually, what I want 597 00:27:23,040 --> 00:27:24,840 Speaker 1: to do is if I really care about this stuff 598 00:27:24,840 --> 00:27:26,199 Speaker 1: and I want people to see it, I'll send them 599 00:27:26,200 --> 00:27:27,840 Speaker 1: a text message, I'll do it and WhatsApp, I'll do 600 00:27:27,880 --> 00:27:31,880 Speaker 1: it on Facebook Messenger. But then I've got fifteen parallel 601 00:27:31,880 --> 00:27:34,560 Speaker 1: conversations or twenty parallel conversations with people. And then we 602 00:27:34,560 --> 00:27:37,480 Speaker 1: start creating WhatsApp groups where twenty people from school or 603 00:27:37,560 --> 00:27:39,000 Speaker 1: can work and all talk to each other. And then 604 00:27:39,040 --> 00:27:41,000 Speaker 1: everyone is like posting more and more stuff, and you think, 605 00:27:41,320 --> 00:27:43,000 Speaker 1: I really like to have a screen in this app 606 00:27:43,040 --> 00:27:44,959 Speaker 1: that just showed me like the important stuff from all 607 00:27:44,960 --> 00:27:46,920 Speaker 1: of these chats that I would see it, and maybe 608 00:27:47,000 --> 00:27:49,800 Speaker 1: that should be sorted by which are the ones that 609 00:27:49,840 --> 00:27:51,399 Speaker 1: I want to see, And so like you kind of 610 00:27:51,400 --> 00:27:54,679 Speaker 1: create the same process over and over again. Now this 611 00:27:54,760 --> 00:27:56,679 Speaker 1: is an opinion. This may be entirely be wrong. This 612 00:27:56,800 --> 00:27:58,679 Speaker 1: may not be how it works, but you can certainly 613 00:27:58,680 --> 00:28:01,959 Speaker 1: sort of see that problem that if you create tools 614 00:28:01,960 --> 00:28:04,400 Speaker 1: that let everyone you've ever met share anything they've they've 615 00:28:04,400 --> 00:28:06,320 Speaker 1: ever been interested in, then you're not going to be 616 00:28:06,359 --> 00:28:08,560 Speaker 1: able to read it all. Well, this leads me to 617 00:28:09,119 --> 00:28:13,240 Speaker 1: the very related quote of Facebook's engineering effort goes to 618 00:28:13,280 --> 00:28:16,440 Speaker 1: stuffing more noise into your news feed and the other 619 00:28:16,440 --> 00:28:19,280 Speaker 1: fift is working on ways to filter it out, or 620 00:28:19,280 --> 00:28:21,320 Speaker 1: it's another expression at the same point. 
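To make the arithmetic in the exchange above concrete, here is a minimal Python sketch, purely illustrative, of the volume problem and of what an "algorithmic feed" does in the crudest sense. The friend count, posting rate, scoring weights, and names are invented for this example and are not Facebook's actual ranking system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
import random

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime

# Dunbar-scale numbers from the conversation: ~200 friends posting ~5 things
# a day is ~1,000 items a day, far more than anyone will actually read.
FRIENDS = [f"friend_{i}" for i in range(200)]
NOW = datetime(2018, 4, 6, 20, 0)
posts = [
    Post(author=f, text=f"post {n} from {f}",
         posted_at=NOW - timedelta(minutes=random.randint(0, 24 * 60)))
    for f in FRIENDS for n in range(5)
]

def chronological_feed(all_posts, opened_at, window_hours=1):
    """The 'random sample' feed: whatever happened to land in the hour
    before you opened the app, newest first."""
    cutoff = opened_at - timedelta(hours=window_hours)
    recent = [p for p in all_posts if p.posted_at >= cutoff]
    return sorted(recent, key=lambda p: p.posted_at, reverse=True)

def ranked_feed(all_posts, affinity, limit=20):
    """A toy 'algorithmic' feed: score each post by how much you seem to
    care about its author, then show only the top few."""
    def score(p):
        return affinity.get(p.author, 0.1)
    return sorted(all_posts, key=score, reverse=True)[:limit]

# Invented affinities: the friends whose posts you actually interact with.
affinity = {"friend_3": 5.0, "friend_17": 4.0, "friend_42": 3.0}
print(len(posts), "posts today;",
      len(chronological_feed(posts, NOW)), "in the last hour;",
      len(ranked_feed(posts, affinity)), "shown after ranking")
```

The point of the sketch is only the size of the gap: a thousand items a day against the twenty or so anyone will scroll, which is why some ranking, however it is done, becomes unavoidable.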
This is a 621 00:28:21,480 --> 00:28:23,479 Speaker 1: classic joke that I read in a book, 622 00:28:23,480 --> 00:28:26,240 Speaker 1: which is that somebody's dug a hole to 623 00:28:26,359 --> 00:28:28,959 Speaker 1: create the foundation to put up 624 00:28:28,960 --> 00:28:31,000 Speaker 1: a building, and they asked their friend, what should I 625 00:28:31,000 --> 00:28:32,680 Speaker 1: do with all this earth that's come out of the hole, 626 00:28:32,760 --> 00:28:34,399 Speaker 1: and their friend says, well, just dig another hole and 627 00:28:34,440 --> 00:28:38,280 Speaker 1: put it in that. And, well, it's just not gonna 628 00:28:38,280 --> 00:28:40,200 Speaker 1: work like that. You can't do it like that. And 629 00:28:40,280 --> 00:28:42,400 Speaker 1: if you've created a system that lets anybody you've ever 630 00:28:42,440 --> 00:28:44,360 Speaker 1: met send you anything that they feel like sending, then 631 00:28:44,360 --> 00:28:45,840 Speaker 1: you're not going to be able to read it all. And there's 632 00:28:45,840 --> 00:28:48,080 Speaker 1: not like some magic algorithm that's going to make that work. 633 00:28:48,080 --> 00:28:50,440 Speaker 1: You're only ever going to have a sample. So I 634 00:28:50,480 --> 00:28:52,120 Speaker 1: want to talk to you about some of your favorite 635 00:28:52,120 --> 00:28:56,280 Speaker 1: technologies and future projects. Do we want to discuss at 636 00:28:56,400 --> 00:28:59,640 Speaker 1: all what's been going on with Facebook this year, and 637 00:29:00,320 --> 00:29:06,040 Speaker 1: everything from Russian bots to the, um, just changes with 638 00:29:06,120 --> 00:29:10,000 Speaker 1: Cambridge Analytica and scraping? First of all, I want to 639 00:29:10,040 --> 00:29:12,560 Speaker 1: say that, um, Oxford Analytica is actually not nearly as 640 00:29:12,600 --> 00:29:17,400 Speaker 1: good as Cambridge Analytica. Okay, for sure, a little 641 00:29:17,440 --> 00:29:21,840 Speaker 1: alma mater joke. But I'm kind of surprised 642 00:29:21,880 --> 00:29:27,560 Speaker 1: at how shocked people are about this. Full disclosure: I 643 00:29:27,600 --> 00:29:32,280 Speaker 1: have not been a Facebook heavy user ever. I chafed 644 00:29:32,320 --> 00:29:34,760 Speaker 1: at everything they've ever asked me to give them. They 645 00:29:34,760 --> 00:29:36,760 Speaker 1: don't have my real birthday, they don't have my real phone, 646 00:29:36,760 --> 00:29:39,239 Speaker 1: they don't have my real email address, they don't have 647 00:29:39,320 --> 00:29:42,200 Speaker 1: my real anything. So anyone who wants to scrape, other 648 00:29:42,280 --> 00:29:44,600 Speaker 1: than where I went to college, where I went to 649 00:29:44,640 --> 00:29:48,320 Speaker 1: grad school, and a couple of jobs I've worked 650 00:29:48,320 --> 00:29:50,959 Speaker 1: at that are public, they could scrape what they want 651 00:29:51,040 --> 00:29:54,200 Speaker 1: from me. They're getting nothing. But I think this 652 00:29:54,280 --> 00:30:00,360 Speaker 1: whole headache was readily foreseeable by anyone who, forget 653 00:30:00,440 --> 00:30:04,120 Speaker 1: reading Facebook's terms of use, just thought: no, you're not entitled to 654 00:30:04,160 --> 00:30:06,680 Speaker 1: any of that private information, I'm not sharing that with you.
655 00:30:07,040 --> 00:30:09,400 Speaker 1: I'll find what I what I'm interested in my own 656 00:30:09,880 --> 00:30:12,240 Speaker 1: or am I just outside of their demographic I'm an 657 00:30:12,240 --> 00:30:17,400 Speaker 1: old fogy And so I think there's a bunch of 658 00:30:17,720 --> 00:30:22,760 Speaker 1: kind of unresolved feeling about Facebook. Is it private or public? 659 00:30:23,280 --> 00:30:25,280 Speaker 1: If I post stuff in the news feed, will my 660 00:30:25,320 --> 00:30:28,120 Speaker 1: friends see it or not? What does that mean? Do 661 00:30:28,160 --> 00:30:29,840 Speaker 1: I want to share? Do I want to share this 662 00:30:29,920 --> 00:30:33,200 Speaker 1: stuff or not? And like, we sort of understand that 663 00:30:33,280 --> 00:30:35,520 Speaker 1: if you search on Google for something, it'll show you 664 00:30:35,680 --> 00:30:37,240 Speaker 1: or try to show you what you asked for, even 665 00:30:37,280 --> 00:30:39,920 Speaker 1: if it's something you shouldn't have searched for. We don't 666 00:30:40,080 --> 00:30:43,320 Speaker 1: really feel like if I my racist uncle posts that 667 00:30:43,400 --> 00:30:46,239 Speaker 1: story on Facebook, should I see it or not? How 668 00:30:46,280 --> 00:30:48,640 Speaker 1: do we think about that? Well, if you like it 669 00:30:48,840 --> 00:30:51,560 Speaker 1: or reposted. But we also think we also sort of 670 00:30:51,560 --> 00:30:53,640 Speaker 1: think or maybe Facebook shouldn't be showing that at the 671 00:30:53,680 --> 00:30:55,520 Speaker 1: top of the list. Well, well it is my he 672 00:30:55,600 --> 00:30:58,600 Speaker 1: is my uncle, and he did post it. So we 673 00:30:58,760 --> 00:31:01,920 Speaker 1: have a bunch of sort of of I don't think 674 00:31:01,920 --> 00:31:04,960 Speaker 1: we have a clear sentiment about put its other extreme, 675 00:31:05,160 --> 00:31:07,000 Speaker 1: we are confident that are back. We are comfortable with 676 00:31:07,040 --> 00:31:08,680 Speaker 1: the fact that our banks know how much money we have. 677 00:31:09,440 --> 00:31:11,640 Speaker 1: For sure, we're comfortable that our mobile operators and know 678 00:31:11,640 --> 00:31:13,400 Speaker 1: where our phone is and that there's a kind of 679 00:31:13,400 --> 00:31:16,240 Speaker 1: a legal apparatus around that if the police need to know, 680 00:31:16,320 --> 00:31:18,080 Speaker 1: then they can find out. But it's just not available 681 00:31:18,120 --> 00:31:20,360 Speaker 1: to anybody. I think we sort of have that feeling 682 00:31:20,360 --> 00:31:24,240 Speaker 1: around Google mostly. I don't think we have like a 683 00:31:24,280 --> 00:31:28,440 Speaker 1: resolved feeling around Facebook. Um. Now, this particular story is 684 00:31:28,440 --> 00:31:30,600 Speaker 1: sort of fascinating because of how many pieces of there 685 00:31:30,720 --> 00:31:34,640 Speaker 1: there are to that. So Facebook creates this developer platform 686 00:31:34,840 --> 00:31:37,360 Speaker 1: that allows you to install an app that can access 687 00:31:37,360 --> 00:31:40,560 Speaker 1: your information and also access information about your friends. And 688 00:31:40,600 --> 00:31:42,400 Speaker 1: there's a sort of a bunch of reasons why that 689 00:31:42,440 --> 00:31:44,920 Speaker 1: would be useful, like I want to create in a 690 00:31:44,960 --> 00:31:47,120 Speaker 1: calendar entry, or I want to you know, I want 691 00:31:47,160 --> 00:31:49,840 Speaker 1: to see who else is using this app. Um. 
And 692 00:31:50,000 --> 00:31:53,040 Speaker 1: a large part of like the advocacy around the tech 693 00:31:53,080 --> 00:31:55,840 Speaker 1: industry with wall gardens are bad, Facebook is bad because 694 00:31:55,880 --> 00:31:57,680 Speaker 1: it's closed. People being need to be able to get 695 00:31:57,680 --> 00:31:59,480 Speaker 1: their information out. You need to be able other people 696 00:31:59,520 --> 00:32:01,920 Speaker 1: need to be able to use this to innovate. So 697 00:32:02,080 --> 00:32:04,680 Speaker 1: there was this whole sort of ideological argument um that 698 00:32:04,720 --> 00:32:06,840 Speaker 1: Facebook would be evil for not doing this, and Facebook 699 00:32:06,880 --> 00:32:10,360 Speaker 1: needed to do this, So you create this platform. Um, 700 00:32:10,400 --> 00:32:13,080 Speaker 1: it turns out that people are able to exploit that 701 00:32:13,280 --> 00:32:16,080 Speaker 1: and do stuff with it that was not really anticipated. 702 00:32:16,400 --> 00:32:19,719 Speaker 1: You think it's not anticipated because from my perspective, I 703 00:32:19,760 --> 00:32:23,400 Speaker 1: look at it as by design that Okay, so yes 704 00:32:23,440 --> 00:32:25,520 Speaker 1: and no, So let's get let's go to an analogy. 705 00:32:25,600 --> 00:32:30,520 Speaker 1: So do you remember word macro viruses? Um? Sure about? 706 00:32:30,920 --> 00:32:34,840 Speaker 1: So office is supposed to be an open development environment. 707 00:32:35,040 --> 00:32:37,480 Speaker 1: The people suing them before antitrust is suing them from 708 00:32:37,520 --> 00:32:39,760 Speaker 1: making it closed in half for third parties to work with. 709 00:32:40,200 --> 00:32:42,120 Speaker 1: So they create all these API s and they create 710 00:32:42,160 --> 00:32:44,560 Speaker 1: this whole macro language. One of the things on page 711 00:32:44,560 --> 00:32:47,160 Speaker 1: fifteen of the textbook is you can make a macro run. 712 00:32:47,160 --> 00:32:50,160 Speaker 1: When you open the document page forty eight of the textbook. 713 00:32:50,520 --> 00:32:52,280 Speaker 1: You can get it to look at your email addresses. 714 00:32:52,400 --> 00:32:56,160 Speaker 1: Page seventy two. You can get it to send an email. Okay, 715 00:32:56,200 --> 00:32:58,239 Speaker 1: so I get an email word document, I open it, 716 00:32:58,240 --> 00:32:59,920 Speaker 1: it emails a copy of itself to everyone in might 717 00:33:00,080 --> 00:33:04,959 Speaker 1: s book at. Okay, that's not what we expected, but 718 00:33:05,080 --> 00:33:07,360 Speaker 1: it's really that is not its intended but it's not 719 00:33:07,400 --> 00:33:10,920 Speaker 1: its intended purpose exactly. But all of the individual A 720 00:33:11,000 --> 00:33:13,239 Speaker 1: p I s were there, and so you've got this 721 00:33:13,320 --> 00:33:15,840 Speaker 1: period from Microsoft where they're thinking, okay, so are we 722 00:33:15,880 --> 00:33:18,640 Speaker 1: supposed to not have macros that we're supposed to be closed? Now, 723 00:33:18,840 --> 00:33:20,440 Speaker 1: how do we think about this? And they had to 724 00:33:20,440 --> 00:33:22,959 Speaker 1: go through this like hundred and eighty degree turn as 725 00:33:23,000 --> 00:33:25,120 Speaker 1: they went from thinking we should make it as easy 726 00:33:25,160 --> 00:33:28,040 Speaker 1: as possible for anybody to do this to what would 727 00:33:28,040 --> 00:33:32,800 Speaker 1: happen if it's going to moderate my language? 
What would 728 00:33:32,800 --> 00:33:37,080 Speaker 1: happen if a bad person, um, what my seven year 729 00:33:37,080 --> 00:33:40,080 Speaker 1: old would call a dingly-head, decided to read the 730 00:33:40,120 --> 00:33:42,800 Speaker 1: textbook? Because it's not like they found bugs in it. 731 00:33:42,920 --> 00:33:44,680 Speaker 1: They're doing stuff that was in the textbook, that was 732 00:33:44,720 --> 00:33:46,520 Speaker 1: in the manual, but you weren't expecting them to use 733 00:33:46,560 --> 00:33:48,640 Speaker 1: them in those ways. And I think there's a very 734 00:33:48,680 --> 00:33:52,040 Speaker 1: strong parallel there with what Cambridge Analytica was doing, which 735 00:33:52,120 --> 00:33:53,760 Speaker 1: was you were able to install an app. The app 736 00:33:53,800 --> 00:33:55,840 Speaker 1: can get your information, the app can ask for your 737 00:33:55,840 --> 00:33:57,959 Speaker 1: friends' information. If you say yes, it will get your 738 00:33:57,960 --> 00:34:00,600 Speaker 1: friends' information. Yeah, but we didn't expect that people would 739 00:34:00,640 --> 00:34:04,560 Speaker 1: use that to exfiltrate eighty million people's profiles, just as 740 00:34:04,600 --> 00:34:06,400 Speaker 1: we didn't expect someone would make a Word document that 741 00:34:06,400 --> 00:34:08,359 Speaker 1: could email a copy of itself to me. So now, 742 00:34:08,440 --> 00:34:10,479 Speaker 1: so now let me, I'm gonna go toe to toe 743 00:34:10,480 --> 00:34:12,960 Speaker 1: on technology with you, which is clearly a mistake on 744 00:34:13,040 --> 00:34:18,160 Speaker 1: my part. But Microsoft is notorious for having all these weak 745 00:34:18,239 --> 00:34:23,840 Speaker 1: security setups that are easily exploitable; that is such a foreseeable issue. 746 00:34:24,480 --> 00:34:28,200 Speaker 1: Granted, there's a little bit of hindsight bias, so in hindsight, right, 747 00:34:28,400 --> 00:34:35,239 Speaker 1: but you know, Microsoft is home of the exploitable security hole. 748 00:34:36,040 --> 00:34:39,239 Speaker 1: My relationship with Amazon and with Apple is that I 749 00:34:39,360 --> 00:34:42,560 Speaker 1: pay them for stuff and I expect a different level 750 00:34:43,160 --> 00:34:45,640 Speaker 1: of trust and a different level of, hey, I'm already 751 00:34:45,640 --> 00:34:48,520 Speaker 1: giving you money, don't exploit my personal data for other reasons. Well, 752 00:34:48,600 --> 00:34:50,480 Speaker 1: let me give you a hypothetical. Um, you 753 00:34:50,480 --> 00:34:52,839 Speaker 1: can install an app on your smartphone. It can pop 754 00:34:52,880 --> 00:34:55,120 Speaker 1: up a box and ask for your contacts, and there's 755 00:34:55,120 --> 00:34:57,200 Speaker 1: an awful lot of sensible reasons, like, you're playing a game: 756 00:34:57,200 --> 00:34:58,839 Speaker 1: which of your friends are playing the game? 757 00:34:59,200 --> 00:35:01,880 Speaker 1: You installed Instagram: who are your friends, so you can 758 00:35:01,920 --> 00:35:03,759 Speaker 1: follow them? Or you were asked if you want to 759 00:35:03,800 --> 00:35:08,520 Speaker 1: split a fare with somebody. There are basic, logical reasons why 760 00:35:08,560 --> 00:35:10,399 Speaker 1: you wanted to do that. Okay, so that app has 761 00:35:10,440 --> 00:35:15,879 Speaker 1: just downloaded six hundred people's home addresses. Is that a breach?
Well, 762 00:35:15,960 --> 00:35:19,759 Speaker 1: has it downloaded their email address, or their address, or 763 00:35:19,760 --> 00:35:22,520 Speaker 1: has it looked at the home address? And then, 764 00:35:23,320 --> 00:35:25,279 Speaker 1: second-order question, it needs their email address or their 765 00:35:25,280 --> 00:35:27,120 Speaker 1: phone number to work. That's the idea; there is 766 00:35:27,200 --> 00:35:30,319 Speaker 1: no other identifier, so it's got six hundred people's email addresses. Okay, 767 00:35:30,360 --> 00:35:33,080 Speaker 1: is that a breach of privacy? Is that Apple's fault? 768 00:35:33,920 --> 00:35:38,560 Speaker 1: Well, maybe a little bit. I would say it is not. 769 00:35:38,680 --> 00:35:41,200 Speaker 1: But Apple popped up, they said, here is this capability. 770 00:35:41,239 --> 00:35:42,879 Speaker 1: Apple pops up the thing: do you want this app 771 00:35:42,920 --> 00:35:45,760 Speaker 1: to have your address book or not? And so there's a sort 772 00:35:45,840 --> 00:35:48,040 Speaker 1: of a point where it's not kind of black and white. 773 00:35:48,040 --> 00:35:51,160 Speaker 1: It's not like somebody hacked into Google and took their email. 774 00:35:51,719 --> 00:35:54,000 Speaker 1: And so I think Facebook has been sort of, they 775 00:35:54,640 --> 00:35:56,920 Speaker 1: have this record of pushing this thing to the 776 00:35:57,000 --> 00:36:00,680 Speaker 1: kind of the outer envelope, um, for the last fifteen years. 777 00:36:00,760 --> 00:36:03,560 Speaker 1: And I think what's happened here is slightly ironic, because they 778 00:36:03,560 --> 00:36:05,640 Speaker 1: actually closed off all of these APIs like a couple 779 00:36:05,640 --> 00:36:07,600 Speaker 1: of years ago. So sort of what happened is, like, 780 00:36:07,640 --> 00:36:10,400 Speaker 1: the stable door was open. If you were a developer, 781 00:36:10,440 --> 00:36:12,040 Speaker 1: if you were in tech, if you went to the 782 00:36:12,120 --> 00:36:14,200 Speaker 1: developer event, they stood up on stage and said, hey, 783 00:36:14,200 --> 00:36:16,120 Speaker 1: look, we leave our stable doors open so 784 00:36:16,200 --> 00:36:18,759 Speaker 1: you can do all this stuff. Isn't this great? And 785 00:36:18,880 --> 00:36:21,040 Speaker 1: their stable doors were open for like five years, 786 00:36:21,600 --> 00:36:24,040 Speaker 1: and then they said, maybe this isn't a good idea. We're 787 00:36:24,080 --> 00:36:25,239 Speaker 1: going to close the door now. And a bunch of 788 00:36:25,280 --> 00:36:27,480 Speaker 1: people said, oh, evil Facebook, you shouldn't close the doors. 789 00:36:27,520 --> 00:36:29,960 Speaker 1: You should allow free, open access. And now here we 790 00:36:30,000 --> 00:36:31,640 Speaker 1: are in two thousand and eighteen and people go, wow, 791 00:36:31,680 --> 00:36:33,279 Speaker 1: a lot of people went into the stable and stole 792 00:36:33,320 --> 00:36:39,120 Speaker 1: all the horses. What a shock! Evil Facebook! Yeah, but, 793 00:36:40,239 --> 00:36:43,080 Speaker 1: like, we knew the doors were open. So there's a 794 00:36:43,120 --> 00:36:44,680 Speaker 1: bunch of, you can kind of see this, as they 795 00:36:44,719 --> 00:36:46,359 Speaker 1: try and work out how we talk about this, how 796 00:36:46,400 --> 00:36:48,640 Speaker 1: do we think about this stuff. We have been speaking 797 00:36:48,640 --> 00:36:51,919 Speaker 1: with Benedict Evans of Andreessen Horowitz.
If you enjoy 798 00:36:52,000 --> 00:36:55,080 Speaker 1: this conversation, be sure and check out our podcast extras, 799 00:36:55,280 --> 00:36:58,560 Speaker 1: where we keep the tape rolling and continue discussing all 800 00:36:58,680 --> 00:37:04,080 Speaker 1: things technology. We love your comments, feedback and suggestions; write 801 00:37:04,200 --> 00:37:07,480 Speaker 1: to us at MIB podcast at Bloomberg dot net. 802 00:37:08,120 --> 00:37:10,759 Speaker 1: Check out my daily column on Bloomberg View dot com. 803 00:37:10,960 --> 00:37:14,160 Speaker 1: You can follow me on Twitter at Ritholtz. I'm 804 00:37:14,200 --> 00:37:18,520 Speaker 1: Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio. 805 00:37:31,560 --> 00:37:34,239 Speaker 1: Welcome to the podcast, Benedict. Thank you so much for 806 00:37:34,280 --> 00:37:37,800 Speaker 1: doing this. You're talking about stuff that I really 807 00:37:37,840 --> 00:37:45,000 Speaker 1: find endlessly fascinating, including, um, the responsibility of Facebook to 808 00:37:45,840 --> 00:37:49,040 Speaker 1: either have an open platform, and what that risk entails, 809 00:37:49,640 --> 00:37:53,360 Speaker 1: or to control their own APIs and control who accesses 810 00:37:53,520 --> 00:37:58,280 Speaker 1: what from your, um, feed. What nobody is talking about, 811 00:37:58,440 --> 00:38:01,520 Speaker 1: and by the time this airs, Mark 812 00:38:01,640 --> 00:38:07,480 Speaker 1: Zuckerberg will have already, um, done his congressional testimony, but, um, 813 00:38:08,160 --> 00:38:11,680 Speaker 1: there's still a tremendous amount of responsibility on the individual 814 00:38:11,800 --> 00:38:15,920 Speaker 1: user who willingly said, hey, here's a ton of private, 815 00:38:15,960 --> 00:38:19,480 Speaker 1: personal information about me, try not to mess it up. 816 00:38:20,000 --> 00:38:23,160 Speaker 1: I was always too skeptical. I was always too, why 817 00:38:23,200 --> 00:38:26,239 Speaker 1: do you want this information? I'd kind of note it 818 00:38:26,239 --> 00:38:28,839 Speaker 1: from the other end, which is, I think people have felt, 819 00:38:28,920 --> 00:38:31,120 Speaker 1: I think for years and years and years people have 820 00:38:31,160 --> 00:38:33,560 Speaker 1: just sort of presumed that nothing on Facebook was 821 00:38:33,600 --> 00:38:36,680 Speaker 1: private. Right, that's correct, and therefore just sort of 822 00:38:36,680 --> 00:38:38,720 Speaker 1: treated it as a public forum. If you were smart, 823 00:38:38,800 --> 00:38:40,680 Speaker 1: that's what you presumed, or, I should 824 00:38:40,719 --> 00:38:43,279 Speaker 1: say, if you were knowledgeable, that should have been your working assumption. 825 00:38:43,520 --> 00:38:45,399 Speaker 1: I didn't even think beyond that. So, like, I've heard 826 00:38:45,400 --> 00:38:47,840 Speaker 1: people say things like, I don't use Facebook, or, I wouldn't 827 00:38:47,840 --> 00:38:49,560 Speaker 1: say that on Facebook Messenger because I don't want it 828 00:38:49,560 --> 00:38:51,880 Speaker 1: to be public, and of course Facebook Messenger isn't public. 829 00:38:51,880 --> 00:38:55,319 Speaker 1: I mean, technically Facebook can see it, but anybody can screenshot it and 830 00:38:55,320 --> 00:38:58,000 Speaker 1: anybody can. Yeah.
Well that's like people thought it was 831 00:38:58,000 --> 00:38:59,920 Speaker 1: his public and stuff on their on their public profile, 832 00:39:00,080 --> 00:39:02,799 Speaker 1: and I went, I joined Facebook, and I forget when 833 00:39:02,800 --> 00:39:05,399 Speaker 1: it launched in the UK, but everyone's profile was public. 834 00:39:05,440 --> 00:39:06,759 Speaker 1: I mean it was public if you were to if 835 00:39:06,760 --> 00:39:09,440 Speaker 1: to like if you're in the London network, so anyone 836 00:39:09,480 --> 00:39:11,759 Speaker 1: could join the London network, so it was public. And 837 00:39:11,800 --> 00:39:13,719 Speaker 1: then so there was a period of like a year when, 838 00:39:13,880 --> 00:39:16,160 Speaker 1: like everybody I've been at school, at university, we've had 839 00:39:16,160 --> 00:39:17,920 Speaker 1: a profile on Facebook and it was public and I 840 00:39:17,920 --> 00:39:19,279 Speaker 1: could go and look, and then everyone kind of turned 841 00:39:19,280 --> 00:39:22,480 Speaker 1: the privacy things on. But I just and even then 842 00:39:22,560 --> 00:39:25,439 Speaker 1: it's not for I don't feel like you would say 843 00:39:25,480 --> 00:39:29,000 Speaker 1: stuff on Facebook that you expect to be secret. That's 844 00:39:29,040 --> 00:39:30,800 Speaker 1: what I'm getting at, And I don't think many people 845 00:39:30,880 --> 00:39:33,279 Speaker 1: ever actually thought that, which comes again to my point 846 00:39:33,320 --> 00:39:35,759 Speaker 1: about sort of unresolved feelings, you can kind of you 847 00:39:35,800 --> 00:39:39,279 Speaker 1: can go to the extreme and say, um, everything on 848 00:39:39,320 --> 00:39:42,200 Speaker 1: Facebook should have been completely private and everyone should have 849 00:39:42,280 --> 00:39:44,759 Speaker 1: understood that and everyone you know, that's not realistic. But 850 00:39:44,840 --> 00:39:46,840 Speaker 1: I don't think that's actually very realistic. I think it 851 00:39:46,920 --> 00:39:48,719 Speaker 1: was much more kind of nuanced and fuzzy than that. 852 00:39:48,840 --> 00:39:50,560 Speaker 1: And I think you could argue everything in a WhatsApp 853 00:39:50,600 --> 00:39:52,600 Speaker 1: group should be private, but I think that's a very 854 00:39:52,640 --> 00:39:54,919 Speaker 1: different kind of form to posting on your news feed. 855 00:39:55,040 --> 00:39:59,480 Speaker 1: But when I ask people, um, you know, sometimes you 856 00:39:59,480 --> 00:40:02,120 Speaker 1: have to come out the discussion from an oblique angle. 857 00:40:02,719 --> 00:40:04,840 Speaker 1: And I like to use radio as an example. I 858 00:40:04,880 --> 00:40:08,560 Speaker 1: ask people what is radio cell and invariably they say advertising, 859 00:40:08,920 --> 00:40:12,239 Speaker 1: And that's the wrong answer. The advertisers of the buyers. 860 00:40:12,960 --> 00:40:16,560 Speaker 1: Radio sells an audience, and Facebook more or less has 861 00:40:16,600 --> 00:40:19,200 Speaker 1: the same business model. So I find this interesting. There's 862 00:40:19,200 --> 00:40:20,920 Speaker 1: a sort of there's a common meme even kind of 863 00:40:20,920 --> 00:40:23,680 Speaker 1: in the tech industry where people say Facebook sales for information. 864 00:40:24,200 --> 00:40:26,319 Speaker 1: Now they sell and you as a part of an audience. 
Well, 865 00:40:26,320 --> 00:40:28,919 Speaker 1: it's interesting because in in a literal sense, if you're 866 00:40:28,920 --> 00:40:31,120 Speaker 1: an advertising on Facebook, you don't get given a zip 867 00:40:31,120 --> 00:40:34,440 Speaker 1: file of the profile of everybody who saw that. So 868 00:40:34,560 --> 00:40:36,799 Speaker 1: in a literal sense, it's give not true. In a 869 00:40:36,880 --> 00:40:39,080 Speaker 1: kind of a metaphorical sense, well, they're sort of they're 870 00:40:39,080 --> 00:40:41,680 Speaker 1: selling the fact that they know like even like kind 871 00:40:41,680 --> 00:40:43,799 Speaker 1: of a metaphorical sense. That kind of struggles me. I 872 00:40:43,840 --> 00:40:46,759 Speaker 1: struggle with that as a statement, But I think the 873 00:40:46,760 --> 00:40:49,799 Speaker 1: fact that people have those conversations reflects against sort of 874 00:40:50,080 --> 00:40:52,920 Speaker 1: unresolved feelings about how we should think about this stuff. 875 00:40:52,920 --> 00:40:54,799 Speaker 1: It's the same with this idea that's going around now 876 00:40:54,960 --> 00:40:56,800 Speaker 1: that all the Facebook's problems can be blamed on the 877 00:40:56,800 --> 00:40:58,800 Speaker 1: ad model, and you think, well, if they had a 878 00:40:58,840 --> 00:41:01,719 Speaker 1: subscription model, would they not be having to how to 879 00:41:01,760 --> 00:41:03,960 Speaker 1: develop a platform? Would you not have been posting your 880 00:41:03,960 --> 00:41:06,360 Speaker 1: personal stuff to your news feed? Like why would that 881 00:41:06,400 --> 00:41:08,879 Speaker 1: have made any difference? And you would you would still 882 00:41:08,920 --> 00:41:11,480 Speaker 1: have the same opportunity for apps to come in and 883 00:41:11,520 --> 00:41:15,560 Speaker 1: scrape that data before they closed the barn door. Yeah, 884 00:41:15,600 --> 00:41:17,279 Speaker 1: And I mean I was talked about this because somebody, 885 00:41:17,320 --> 00:41:19,640 Speaker 1: I think you say on Twitter's somebody who's had a journalism, 886 00:41:19,800 --> 00:41:21,520 Speaker 1: have had a journalism in school, and they posted like 887 00:41:21,560 --> 00:41:23,600 Speaker 1: an explanation of what came don We've been doing it. 888 00:41:23,840 --> 00:41:27,960 Speaker 1: This is three weeks ago, and it's just it was 889 00:41:28,040 --> 00:41:30,120 Speaker 1: interesting simply that they were. Three weeks on, we're still 890 00:41:30,120 --> 00:41:33,080 Speaker 1: having people are still writing explanations of what happened because 891 00:41:33,120 --> 00:41:35,120 Speaker 1: it's all sort of fuzzy and a pay can no 892 00:41:35,120 --> 00:41:38,719 Speaker 1: one quite understandable. It's complex, and people like simple narratives 893 00:41:38,719 --> 00:41:41,839 Speaker 1: with defineable good guys and bad guys. That makes it 894 00:41:41,880 --> 00:41:45,960 Speaker 1: easier to to do. Nuance sort of gets lost on 895 00:41:45,960 --> 00:41:50,160 Speaker 1: on cable television to the least. All right, let's talk 896 00:41:50,160 --> 00:41:55,200 Speaker 1: about some of your favorite technologies. Autonomy of filling the blank, 897 00:41:55,239 --> 00:42:00,120 Speaker 1: autonomous cars, autonomous uh, whatever, how do we how do 898 00:42:00,160 --> 00:42:04,640 Speaker 1: you see the development of AI and autonomous everything? Have 899 00:42:04,760 --> 00:42:07,520 Speaker 1: you got another hour? I do? 
I don't know if 900 00:42:07,560 --> 00:42:11,520 Speaker 1: we are are, so I think what can we say 901 00:42:11,520 --> 00:42:13,319 Speaker 1: about this? So, first of all, the reason that we're 902 00:42:13,320 --> 00:42:16,480 Speaker 1: talking about autonomous cars now is because of what we 903 00:42:16,560 --> 00:42:19,520 Speaker 1: call AI, which really means this new technology called machine learning, 904 00:42:19,600 --> 00:42:22,359 Speaker 1: or technology that just started working called machine learning, which 905 00:42:22,400 --> 00:42:24,880 Speaker 1: offers the prospect that a bunch of problems around autonomous 906 00:42:24,920 --> 00:42:26,680 Speaker 1: cars might be solvable in a way that they really 907 00:42:26,680 --> 00:42:29,399 Speaker 1: didn't seem to be easily solvable before. So, in that sense, 908 00:42:29,440 --> 00:42:31,600 Speaker 1: you could say autonomy is a spinoff of machine learning 909 00:42:31,600 --> 00:42:33,120 Speaker 1: all the break the advanced. The fact that we're interested 910 00:42:33,120 --> 00:42:34,960 Speaker 1: in autonomy is a spinoff in machine learning, but there's 911 00:42:34,960 --> 00:42:36,960 Speaker 1: altos of other stuff that machine learning does as well 912 00:42:37,400 --> 00:42:40,759 Speaker 1: as we go to autonomy. UM. The sort of the 913 00:42:40,800 --> 00:42:42,719 Speaker 1: way I kind of talk about this is I have 914 00:42:42,719 --> 00:42:44,920 Speaker 1: a slide with a picture of a horseless carriage from 915 00:42:45,840 --> 00:42:48,880 Speaker 1: It's a carriage with no horse, but otherwise it's nothing's changed. 916 00:42:49,200 --> 00:42:51,560 Speaker 1: And I think that's what I hear when people say 917 00:42:51,640 --> 00:42:55,400 Speaker 1: driverless car, that you've taken the steering wheel out maybe, 918 00:42:55,440 --> 00:42:57,319 Speaker 1: but like nothing else has changed and it will still 919 00:42:57,400 --> 00:43:00,560 Speaker 1: drive around like like autonomous cars mean us that drive 920 00:43:00,600 --> 00:43:03,239 Speaker 1: like people, but without making mistakes or making the speed limit. 921 00:43:03,280 --> 00:43:05,520 Speaker 1: And that's that's just a really shortsighted way of thinking 922 00:43:05,520 --> 00:43:07,520 Speaker 1: about this. I think there's kind of two kind of 923 00:43:07,520 --> 00:43:09,560 Speaker 1: building blocks to think about. The first is you can 924 00:43:09,680 --> 00:43:11,799 Speaker 1: if you have when we have a fully autonomous world 925 00:43:11,880 --> 00:43:15,239 Speaker 1: in end decades forty years, depending on what you think 926 00:43:15,239 --> 00:43:18,200 Speaker 1: the s cards will look like. UM. And of coursely 927 00:43:18,280 --> 00:43:20,400 Speaker 1: you have periods where some places will be much go 928 00:43:20,560 --> 00:43:22,320 Speaker 1: much quicker, So like you might you might say Manhattan 929 00:43:22,360 --> 00:43:25,759 Speaker 1: is autonomous only in the week in twenty thirty or something. Um. 
930 00:43:26,320 --> 00:43:30,040 Speaker 1: But at that point there's no accidents, well, certainly much 931 00:43:30,120 --> 00:43:33,279 Speaker 1: less, well, basically no accidents, because all the accidents are 932 00:43:33,280 --> 00:43:36,080 Speaker 1: caused by human error, and you have much less congestion 933 00:43:36,160 --> 00:43:38,280 Speaker 1: because you don't have traffic waves, you don't have accidents, 934 00:43:38,320 --> 00:43:40,600 Speaker 1: which cause a third of congestion or something. Um. You 935 00:43:40,640 --> 00:43:42,920 Speaker 1: can have vehicles on freeways driving a hundred miles an 936 00:43:42,960 --> 00:43:45,839 Speaker 1: hour two feet apart from each other. So you radically change 937 00:43:45,840 --> 00:43:48,120 Speaker 1: what congestion looks like. You radically change what the 938 00:43:48,200 --> 00:43:50,640 Speaker 1: vehicles look like. So you could have a vehicle that 939 00:43:50,840 --> 00:43:52,840 Speaker 1: will never, you know, if you're calling an on 940 00:43:52,880 --> 00:43:55,799 Speaker 1: demand vehicle in Manhattan, it will not go over twenty miles 941 00:43:55,800 --> 00:43:58,480 Speaker 1: an hour; it could be a golf cart. Um. If 942 00:43:58,520 --> 00:44:00,319 Speaker 1: you're going to go to JFK, they would 943 00:44:00,320 --> 00:44:03,000 Speaker 1: send a different vehicle. And so you can radically just 944 00:44:03,080 --> 00:44:04,759 Speaker 1: redesign the vehicle, in the same way that when we 945 00:44:04,760 --> 00:44:06,840 Speaker 1: got rid of the horse, you radically redesigned the vehicle. 946 00:44:07,280 --> 00:44:09,520 Speaker 1: And you can also radically redesign the city, or change 947 00:44:09,520 --> 00:44:11,360 Speaker 1: your assumptions about the city, in the same way that 948 00:44:11,400 --> 00:44:13,239 Speaker 1: we did when we got rid of the horse, um, 949 00:44:13,360 --> 00:44:17,040 Speaker 1: except that you're no longer trying to design a city 950 00:44:17,480 --> 00:44:20,840 Speaker 1: that will constrain the nineteen year old guy in a 951 00:44:20,840 --> 00:44:24,280 Speaker 1: souped up Chevy Camaro. Um, you're designing a city based 952 00:44:24,320 --> 00:44:26,279 Speaker 1: on, you can create rules. You can tell the cars 953 00:44:26,280 --> 00:44:27,719 Speaker 1: where to go and what they can do or not do. 954 00:44:27,960 --> 00:44:29,880 Speaker 1: You can have dynamic real-time road pricing, so that 955 00:44:29,920 --> 00:44:31,600 Speaker 1: you can say, do you want to pay a lot 956 00:44:31,640 --> 00:44:33,440 Speaker 1: to get there in fifteen minutes, or are you willing to 957 00:44:33,440 --> 00:44:35,839 Speaker 1: pay less to get there in thirty minutes? You can 958 00:44:35,880 --> 00:44:38,080 Speaker 1: tell the cars at any given instant which road they 959 00:44:38,120 --> 00:44:40,080 Speaker 1: should be taking and how fast they should be going. 960 00:44:40,560 --> 00:44:42,680 Speaker 1: And so what you have is like a change in 961 00:44:42,680 --> 00:44:45,359 Speaker 1: the structure of a city that has a lot 962 00:44:45,400 --> 00:44:47,560 Speaker 1: in common with the way the city changed as a 963 00:44:47,600 --> 00:44:50,440 Speaker 1: result of the car.
UM kind of the example I 964 00:44:50,480 --> 00:44:53,560 Speaker 1: often give, which works well since we're in Manhattan is 965 00:44:53,600 --> 00:44:57,200 Speaker 1: imagine if you live in Brooklyn and it's November and 966 00:44:57,239 --> 00:44:59,560 Speaker 1: you want to go to that cool new restaurant in Manhattan. 967 00:45:00,000 --> 00:45:01,439 Speaker 1: How are you going to get there? Where you could 968 00:45:01,440 --> 00:45:04,040 Speaker 1: walk ten minutes in the rain to the subway station, 969 00:45:05,360 --> 00:45:07,759 Speaker 1: you could get a cab UK well, presuming you can 970 00:45:07,800 --> 00:45:09,359 Speaker 1: get a cab. Presume you can get a cab O care. 971 00:45:09,400 --> 00:45:12,319 Speaker 1: It's going to cost you what dollars each way you 972 00:45:12,320 --> 00:45:14,640 Speaker 1: could drive, then one of you can't drink on the 973 00:45:14,640 --> 00:45:16,040 Speaker 1: way back, and you're going to have to park, and 974 00:45:16,080 --> 00:45:18,239 Speaker 1: you have to pay for parking in little twenty minutes 975 00:45:18,280 --> 00:45:20,839 Speaker 1: to find a place to park there. Let's not go 976 00:45:20,880 --> 00:45:23,680 Speaker 1: out of that place, okay, Now go to a fully 977 00:45:23,719 --> 00:45:27,440 Speaker 1: autonomous world. You raise your watch, you say, hey, alex 978 00:45:27,520 --> 00:45:29,640 Speaker 1: or I need a car. The nip hod that's around 979 00:45:29,640 --> 00:45:32,880 Speaker 1: the block stops outside your door within thirty seconds, and 980 00:45:33,000 --> 00:45:35,600 Speaker 1: there is no congestion. The pod just takes you there. 981 00:45:35,719 --> 00:45:38,160 Speaker 1: It drops you off outside the door um if it's 982 00:45:38,200 --> 00:45:40,719 Speaker 1: your pod, um, and then it goes and waits for 983 00:45:40,760 --> 00:45:42,680 Speaker 1: you somewhere where parking is cheap, or it waits for 984 00:45:42,680 --> 00:45:44,520 Speaker 1: you somewhere else. If it's an on demand pod, it 985 00:45:44,560 --> 00:45:48,040 Speaker 1: immediately starts driving other people around, so there's no parking, remember, 986 00:45:48,320 --> 00:45:50,319 Speaker 1: and then you can go home when you can drink 987 00:45:50,320 --> 00:45:52,120 Speaker 1: as much as you like. I mean, remember all those 988 00:45:52,160 --> 00:45:56,480 Speaker 1: those photographs of European and East Coast cities from before cars, 989 00:45:56,560 --> 00:45:58,200 Speaker 1: and you think where the streets are all twice as 990 00:45:58,239 --> 00:46:00,759 Speaker 1: wide because you don't have cars hop down both sides 991 00:46:00,800 --> 00:46:03,719 Speaker 1: of the street. What we could go back to, and 992 00:46:03,800 --> 00:46:06,239 Speaker 1: so you have like kind of radical change in how 993 00:46:06,280 --> 00:46:09,520 Speaker 1: we think about what the city is. Therefore, in what 994 00:46:09,600 --> 00:46:11,759 Speaker 1: does a gas station mean? Very obviously, but also what 995 00:46:11,800 --> 00:46:13,560 Speaker 1: does big box we tell mean? Where are you willing 996 00:46:13,560 --> 00:46:17,040 Speaker 1: to shop? What does your commute look like? You know? Today? 997 00:46:17,120 --> 00:46:19,800 Speaker 1: So I live in San Francisco, which calls itself a city, 998 00:46:19,840 --> 00:46:21,959 Speaker 1: and I work in Menlo Park, which is an office park. 
999 00:46:22,120 --> 00:46:25,600 Speaker 1: Forty five minutes drive south if there's not much traffic, Um, 1000 00:46:25,640 --> 00:46:27,439 Speaker 1: I should say an hour, but I'll admit to forty 1001 00:46:27,480 --> 00:46:32,719 Speaker 1: five minutes south. Now, supposing there really is no traffic, well, 1002 00:46:32,760 --> 00:46:35,120 Speaker 1: then I can get there in half an hour. But 1003 00:46:35,360 --> 00:46:38,399 Speaker 1: supposing I'm able to read instead of drive, I might 1004 00:46:38,400 --> 00:46:40,720 Speaker 1: be able to live further away and have it spend 1005 00:46:40,760 --> 00:46:42,400 Speaker 1: longer in the car because I don't need to just 1006 00:46:42,440 --> 00:46:44,719 Speaker 1: sit there staring at the road all day. So where 1007 00:46:44,719 --> 00:46:46,400 Speaker 1: do you live, where do you commute? Where does the 1008 00:46:46,480 --> 00:46:49,759 Speaker 1: retail go? All those sorts of changes that happened, like 1009 00:46:50,000 --> 00:46:51,799 Speaker 1: the stuff that happened as a result of cars, which 1010 00:46:51,800 --> 00:46:53,600 Speaker 1: basically what I'm talking about. Like, there's this great saying 1011 00:46:53,600 --> 00:46:55,720 Speaker 1: that it was easy to predict mass ownership of cars, 1012 00:46:55,880 --> 00:46:57,799 Speaker 1: but hard to put itt Walmart. It's hard to put 1013 00:46:57,800 --> 00:47:01,000 Speaker 1: it drive through a t M s like that kind 1014 00:47:01,040 --> 00:47:04,200 Speaker 1: of stuff. Those second and third auto consequences will flow 1015 00:47:04,200 --> 00:47:07,799 Speaker 1: out of this stuff. Let's talk about smart fill in 1016 00:47:07,840 --> 00:47:12,239 Speaker 1: the blanks, smart homes, smartphone, smart cars, smart speakers. What 1017 00:47:12,440 --> 00:47:17,520 Speaker 1: is it about the the preface smart that suddenly everybody 1018 00:47:17,560 --> 00:47:22,960 Speaker 1: wants to move in that direction? So so software and okay, 1019 00:47:23,000 --> 00:47:24,719 Speaker 1: so three or four blocks to talk about here. The 1020 00:47:24,760 --> 00:47:27,520 Speaker 1: first block is smartphone supply chain. One and a half 1021 00:47:27,520 --> 00:47:30,520 Speaker 1: billion smartphones sold last year. All of those chips are 1022 00:47:30,520 --> 00:47:32,480 Speaker 1: available as like a fire hose of stuff to make 1023 00:47:32,520 --> 00:47:34,680 Speaker 1: stuff with. So before smartphones, if you wanted to put 1024 00:47:34,680 --> 00:47:37,279 Speaker 1: computing into something, you had to use PC components, So 1025 00:47:37,320 --> 00:47:39,959 Speaker 1: an a t M is a PC. But those are big, 1026 00:47:40,000 --> 00:47:42,279 Speaker 1: and they're heavy, and they need mains power, they need 1027 00:47:42,320 --> 00:47:45,080 Speaker 1: power and so on. Smartphone components much cheaper, much smaller, 1028 00:47:45,160 --> 00:47:47,120 Speaker 1: much lighter. So suddenly you can make a connected door lock, 1029 00:47:47,160 --> 00:47:50,759 Speaker 1: and like you can get the parts really easily. UM 1030 00:47:50,840 --> 00:47:53,839 Speaker 1: plus um UBER called as internet plus machine learning, means 1031 00:47:53,880 --> 00:47:56,319 Speaker 1: that like a camera can actually be able to tell 1032 00:47:56,320 --> 00:47:58,400 Speaker 1: if there's something moving or not. So you've got all 1033 00:47:58,480 --> 00:48:01,040 Speaker 1: this stuff. Suddenly the stuff was not working. 
I think 1034 00:48:01,080 --> 00:48:04,520 Speaker 1: the best analogy for this is, like, our grandparents could 1035 00:48:04,560 --> 00:48:06,879 Speaker 1: have told you how many electric motors they owned. There 1036 00:48:06,920 --> 00:48:09,000 Speaker 1: was one in the car, they had a vacuum cleaner, 1037 00:48:09,040 --> 00:48:10,879 Speaker 1: there was one in the fridge. They owned maybe five 1038 00:48:10,920 --> 00:48:13,480 Speaker 1: electric motors in total. Like, there was only one electric 1039 00:48:13,480 --> 00:48:15,800 Speaker 1: motor in the car, the starter motor; that was it. Today, 1040 00:48:15,880 --> 00:48:17,480 Speaker 1: who has a clue how many electric motors are in 1041 00:48:17,480 --> 00:48:19,480 Speaker 1: your car? There's like twenty. And if you told your 1042 00:48:19,520 --> 00:48:21,640 Speaker 1: grandfather that you can press a button to adjust your 1043 00:48:21,640 --> 00:48:23,319 Speaker 1: wing mirror, he'd hit you on the back of the head. 1044 00:48:23,440 --> 00:48:25,560 Speaker 1: But that's just how it worked, because that's just how 1045 00:48:25,600 --> 00:48:27,879 Speaker 1: the technology got deployed. The same thing in your home. 1046 00:48:28,120 --> 00:48:29,560 Speaker 1: There was a period when everything was going to have 1047 00:48:29,600 --> 00:48:32,520 Speaker 1: a DC motor. Now, like, you have 1048 00:48:32,600 --> 00:48:34,840 Speaker 1: a microwave, you maybe have a toaster, 1049 00:48:34,920 --> 00:48:37,160 Speaker 1: you may have a kettle, you have a blender. Nobody 1050 00:48:37,200 --> 00:48:38,880 Speaker 1: has all of those things. Everyone in East Asia has 1051 00:48:38,880 --> 00:48:41,040 Speaker 1: a rice cooker. Everyone in the UK has a kettle. 1052 00:48:41,080 --> 00:48:42,920 Speaker 1: In America, you maybe don't have a kettle, but you 1053 00:48:42,960 --> 00:48:46,200 Speaker 1: have a coffee machine. But nobody has an electric carving knife. 1054 00:48:47,080 --> 00:48:49,000 Speaker 1: And so what happens is, like, there are those underlying 1055 00:48:49,000 --> 00:48:51,480 Speaker 1: components as cheap commodities, and we're in this period of trying 1056 00:48:51,480 --> 00:48:52,960 Speaker 1: to work out what you should do with them and 1057 00:48:53,000 --> 00:48:55,359 Speaker 1: how they should all get plugged together, and should they 1058 00:48:55,400 --> 00:48:56,960 Speaker 1: all be the same system or not, should they all 1059 00:48:56,960 --> 00:48:59,680 Speaker 1: talk to Alexa or not, everything. And then you've got, 1060 00:48:59,680 --> 00:49:02,640 Speaker 1: like, the industrial logic. So, like, somebody on the Samsung board 1061 00:49:02,680 --> 00:49:04,800 Speaker 1: sat down and said, everything we sell must have the 1062 00:49:04,840 --> 00:49:07,759 Speaker 1: Samsung voice assistant, because then people will be more likely 1063 00:49:07,800 --> 00:49:09,680 Speaker 1: to buy the Samsung fridge that talks to the Samsung 1064 00:49:09,719 --> 00:49:12,280 Speaker 1: dishwasher that talks to the Samsung whatever. Um, 1065 00:49:12,280 --> 00:49:14,399 Speaker 1: how's it working out? Well?
Then, you know, so it's great, 1066 00:49:14,440 --> 00:49:16,160 Speaker 1: you see this at CES, because you can see, like, 1067 00:49:16,200 --> 00:49:17,960 Speaker 1: the fridge people thought this is a fantastic, I mean, 1068 00:49:18,000 --> 00:49:20,160 Speaker 1: metaphorically speaking, the fridge people thought this is a fantastic 1069 00:49:20,239 --> 00:49:23,760 Speaker 1: idea; the dishwasher people are like, god damn it. Okay, 1070 00:49:23,960 --> 00:49:25,920 Speaker 1: we'll put the voice assistant in the dishwasher, but 1071 00:49:27,760 --> 00:49:30,120 Speaker 1: (a) that, and (b) they want to sell it to 1072 00:49:30,160 --> 00:49:32,640 Speaker 1: people who also own an LG fridge or a 1073 00:49:32,719 --> 00:49:35,160 Speaker 1: Sub-Zero fridge. So they've got the Samsung voice assistant, 1074 00:49:35,320 --> 00:49:37,880 Speaker 1: and they've got HomeKit, and they've got Alexa in 1075 00:49:37,920 --> 00:49:39,600 Speaker 1: it, because they only need to sell dishwashers. They 1076 00:49:39,640 --> 00:49:42,200 Speaker 1: don't care about the group strategy. It's like when every Sony 1077 00:49:42,200 --> 00:49:44,319 Speaker 1: product had a Memory Stick, and 1078 00:49:44,400 --> 00:49:46,560 Speaker 1: like the Sony group had said everything's gonna have 1079 00:49:46,600 --> 00:49:48,400 Speaker 1: a Memory Stick, and some bits of Sony thought this 1080 00:49:48,440 --> 00:49:51,000 Speaker 1: is great, and some bits were like, god damn it, um, 1081 00:49:51,040 --> 00:49:52,160 Speaker 1: what are we gonna do? What are we going to 1082 00:49:52,239 --> 00:49:54,880 Speaker 1: do with this? And as with the electric carving knife 1083 00:49:54,960 --> 00:49:57,400 Speaker 1: versus the blender, like, some of this stuff will make sense, 1084 00:49:57,440 --> 00:49:59,520 Speaker 1: and some of it won't. It would be quite nice 1085 00:49:59,560 --> 00:50:01,960 Speaker 1: to be able to say to my oven, um, okay, 1086 00:50:02,000 --> 00:50:05,279 Speaker 1: preheat the oven to three hundred degrees, as opposed to 1087 00:50:05,360 --> 00:50:08,080 Speaker 1: having to learn the interface and walk over and touch it. 1088 00:50:08,560 --> 00:50:10,759 Speaker 1: In my apartment, I have an infrared sensor in 1089 00:50:10,760 --> 00:50:12,640 Speaker 1: the bathroom. I walk into the bathroom, the light comes on. 1090 00:50:13,360 --> 00:50:16,200 Speaker 1: It's fantastic. My parents hated it, because they get up at 1091 00:50:16,200 --> 00:50:18,440 Speaker 1: four o'clock in the morning, and then the lights come on. 1092 00:50:18,480 --> 00:50:21,239 Speaker 1: I think you can program that. No, because it's not anything smart; it's 1093 00:50:21,280 --> 00:50:23,839 Speaker 1: just literally a dumb little sensor from forty 1094 00:50:23,880 --> 00:50:26,480 Speaker 1: years ago. So here's where the smart world starts 1095 00:50:26,520 --> 00:50:29,600 Speaker 1: to get more interesting. So I have a long, twisty 1096 00:50:29,719 --> 00:50:32,719 Speaker 1: driveway, because our house is set off the main road, 1097 00:50:33,280 --> 00:50:37,120 Speaker 1: and we have lights along it. And the first phase 1098 00:50:37,320 --> 00:50:39,560 Speaker 1: was having a switch that put the lights on and 1099 00:50:39,600 --> 00:50:41,920 Speaker 1: put the lights off. And then the next phase was 1100 00:50:41,960 --> 00:50:45,040 Speaker 1: having a timer that has the lights going on at 1101 00:50:45,040 --> 00:50:48,680 Speaker 1: seven pm and off at eleven pm.
But the 1102 00:50:48,719 --> 00:50:51,880 Speaker 1: problem with that is seven pm is light at some times of 1103 00:50:51,880 --> 00:50:56,160 Speaker 1: the year and dark at others. I thought, no, no, nothing like 1104 00:50:56,200 --> 00:51:00,839 Speaker 1: that; it's a contemporary house. But the new switch, which is 1105 00:51:00,880 --> 00:51:06,360 Speaker 1: literally going in this weekend, is built into the lights, 1106 00:51:06,360 --> 00:51:10,520 Speaker 1: and you can set your latitude, and that lets 1107 00:51:10,840 --> 00:51:14,320 Speaker 1: whatever you set, say the lights going on thirty minutes 1108 00:51:14,360 --> 00:51:17,960 Speaker 1: after sunset, change throughout the year regardless, and you 1109 00:51:18,000 --> 00:51:21,120 Speaker 1: don't have to deal with it. So that's a sort 1110 00:51:21,160 --> 00:51:24,120 Speaker 1: of smart application of software. Like, you know, 1111 00:51:24,160 --> 00:51:26,920 Speaker 1: your phone changes its clock when 1112 00:51:27,000 --> 00:51:30,319 Speaker 1: the time zone changes, automatically, thanks to the satellites. So I 1113 00:51:30,320 --> 00:51:32,000 Speaker 1: think there's a kind of question of where 1114 00:51:32,000 --> 00:51:33,880 Speaker 1: the complexity will sit. And this is like the electric 1115 00:51:33,840 --> 00:51:35,600 Speaker 1: carving knife; there will be stuff that just doesn't make 1116 00:51:35,640 --> 00:51:38,440 Speaker 1: sense. The electric carving knife has been 1117 00:51:38,480 --> 00:51:40,520 Speaker 1: around for decades and they never 1118 00:51:40,560 --> 00:51:42,759 Speaker 1: really sold that well. Some people have one; like, I 1119 00:51:42,800 --> 00:51:44,360 Speaker 1: can show you where it took a chunk out of 1120 00:51:44,360 --> 00:51:49,239 Speaker 1: my finger. The other extreme, the 1121 00:51:49,239 --> 00:51:51,200 Speaker 1: other way to think about this, is the old saying 1122 00:51:51,239 --> 00:51:53,680 Speaker 1: that basically a computer should never ask you a question 1123 00:51:53,680 --> 00:51:55,640 Speaker 1: that it ought to be able to work out for itself. 1124 00:51:56,640 --> 00:51:59,080 Speaker 1: And that used to mean, like, you plug the printer in, 1125 00:51:59,120 --> 00:52:00,840 Speaker 1: the computer should know what the printer is. It shouldn't 1126 00:52:00,880 --> 00:52:03,279 Speaker 1: ask you what printer it is. Um, then it means 1127 00:52:03,360 --> 00:52:05,000 Speaker 1: your phone doesn't ask where you are when you call 1128 00:52:05,040 --> 00:52:07,640 Speaker 1: a car, because it's got GPS and it knows. Then 1129 00:52:07,680 --> 00:52:09,479 Speaker 1: it means, like, the light should know what time 1130 00:52:09,520 --> 00:52:11,239 Speaker 1: it is, so the light should just go on. Right, 1131 00:52:11,400 --> 00:52:14,200 Speaker 1: that's right. And the tension point in 1132 00:52:14,320 --> 00:52:16,840 Speaker 1: all of those is, is it actually more hassle to 1133 00:52:16,880 --> 00:52:19,680 Speaker 1: configure the thing to do that? Sometimes it is, 1134 00:52:19,760 --> 00:52:22,279 Speaker 1: sometimes it isn't.
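A rough Python sketch of the kind of calculation a latitude-aware light switch like the one just described would do: given a latitude and a date, approximate sunset with the standard sunrise equation and add a fixed offset. The latitude, offset, and function names are illustrative assumptions, not any particular product's firmware, and the result is local solar time (it ignores longitude, time zones, and daylight saving).

```python
import math
from datetime import date

def sunset_solar_hour(lat_deg, day):
    """Approximate sunset (hours after solar midnight, local solar time)
    using the standard sunrise equation. Rough, but it captures the
    seasonal swing that a fixed 7 pm timer misses."""
    n = day.timetuple().tm_yday
    # Approximate solar declination for day-of-year n, in degrees.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (n + 10)))
    cos_h = -math.tan(math.radians(lat_deg)) * math.tan(math.radians(decl))
    cos_h = max(-1.0, min(1.0, cos_h))      # clamp for polar day / night
    hour_angle = math.degrees(math.acos(cos_h))
    return 12.0 + hour_angle / 15.0

def lights_on_hour(lat_deg, day, offset_minutes=30):
    """Driveway lights come on a fixed offset after sunset, every day,
    whatever the season, once the latitude has been set."""
    return sunset_solar_hour(lat_deg, day) + offset_minutes / 60.0

if __name__ == "__main__":
    latitude = 40.8   # an assumed latitude, roughly the New York area
    for d in (date(2018, 6, 21), date(2018, 12, 21)):
        print(d, "lights on at about",
              round(lights_on_hour(latitude, d), 2), "hours, local solar time")
```

Run on the solstices, the same "thirty minutes after sunset" setting lands around 8 pm in June and around 5 pm in December, which is exactly the behavior a fixed timer cannot give you.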
So my beef about Siri, 1135 00:52:22,480 --> 00:52:26,760 Speaker 1: which was the leading voice app to begin with, before 1136 00:52:27,560 --> 00:52:31,200 Speaker 1: Alexa began to eat its lunch, is asking you questions 1137 00:52:31,200 --> 00:52:34,439 Speaker 1: that it should, from the context, be able to figure out. 1138 00:52:34,760 --> 00:52:38,600 Speaker 1: This is just pinging a database, pinging a location, um, 1139 00:52:39,160 --> 00:52:42,520 Speaker 1: software pinging something it has access to, all that stuff. 1140 00:52:42,719 --> 00:52:45,760 Speaker 1: So this is a different sense of the word smart. 1141 00:52:46,280 --> 00:52:48,920 Speaker 1: So let's talk about voice for a minute. Okay. So 1142 00:52:50,000 --> 00:52:52,359 Speaker 1: we have this thing called machine learning. So, okay, the right 1143 00:52:52,400 --> 00:52:54,840 Speaker 1: way of putting this: when you talk to a computer, 1144 00:52:55,360 --> 00:52:58,239 Speaker 1: there are three things that are going on. Step one is 1145 00:52:58,320 --> 00:53:01,040 Speaker 1: it is taking the audio waveform and turning that 1146 00:53:01,120 --> 00:53:04,120 Speaker 1: into text. So it's just transcribing it and turning it 1147 00:53:04,160 --> 00:53:06,719 Speaker 1: into words, which I think is basic math. And, 1148 00:53:07,000 --> 00:53:09,480 Speaker 1: well, no. And this is something where it 1149 00:53:09,640 --> 00:53:11,879 Speaker 1: used to work okay, like three quarters of the time; 1150 00:53:12,400 --> 00:53:16,440 Speaker 1: you remember using dictation apps like Dragon NaturallySpeaking, 1151 00:53:17,120 --> 00:53:20,200 Speaker 1: it sort of worked three quarters of the time or more, and 1152 00:53:20,200 --> 00:53:22,359 Speaker 1: it would get better by like half a point every year. 1153 00:53:22,800 --> 00:53:25,280 Speaker 1: Machine learning comes along and it goes from working 1154 00:53:25,280 --> 00:53:29,120 Speaker 1: three quarters of the time to working nearly all 1155 00:53:29,160 --> 00:53:30,600 Speaker 1: of the time. And so machine learning gives you this 1156 00:53:30,680 --> 00:53:34,000 Speaker 1: radical change in that. Then there's a second piece, 1157 00:53:34,040 --> 00:53:36,239 Speaker 1: which is you need to go from the text to 1158 00:53:36,640 --> 00:53:39,360 Speaker 1: a structured query. You need to actually work out what 1159 00:53:39,440 --> 00:53:41,360 Speaker 1: the verb and the noun is and what it is asking, 1160 00:53:41,400 --> 00:53:45,160 Speaker 1: which is a completely different computation. Um. And so when 1161 00:53:45,200 --> 00:53:47,400 Speaker 1: you talk to your computer, there's two very, almost 1162 00:53:47,480 --> 00:53:50,520 Speaker 1: unrelated things going on. This is called 1163 00:53:50,600 --> 00:53:55,759 Speaker 1: natural language processing. Um. Machine learning also made that work way, way, 1164 00:53:55,800 --> 00:53:59,640 Speaker 1: way better. Um. Then you have the third problem, 1165 00:53:59,760 --> 00:54:02,440 Speaker 1: which is, okay, I've created this structured query, do I 1166 00:54:02,480 --> 00:54:06,759 Speaker 1: have anything to give it to? And you should have... Well, 1167 00:54:07,960 --> 00:54:11,000 Speaker 1: so this is the problem, which is what you've actually 1168 00:54:11,040 --> 00:54:13,799 Speaker 1: got here is an IVR. You've got a voice tree: 1169 00:54:13,960 --> 00:54:16,880 Speaker 1: press one for this, press two for that.
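A minimal sketch of the three steps just described, plus the hand-built handlers the conversation turns to next: transcribe the audio, turn the text into a structured query, then dispatch to something a person wrote by hand. The intents, phrases, and handlers are invented for illustration, the transcription step is stubbed out rather than a real model, and none of this is how Siri or Alexa is actually implemented.

```python
def transcribe(audio_bytes):
    """Step one: audio waveform to text. Stubbed out here; in reality this
    is the machine-learning model that went from roughly three-quarters
    accuracy to nearly always right."""
    return "set a timer for ten minutes"

# Step two: text to a structured query. Real systems use NLP models; this
# toy version just matches keywords to a handful of invented intents.
INTENT_PHRASES = {
    "timer": ["timer", "remind me in"],
    "weather": ["weather", "rain", "umbrella"],
    "play_music": ["play", "put on"],
}

def parse_intent(text):
    for intent, phrases in INTENT_PHRASES.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None

# Step three: somebody has to sit down and write each of these by hand,
# one afternoon at a time (the cricket module, the hurling module, ...).
HANDLERS = {
    "timer": lambda text: "Timer set.",
    "weather": lambda text: "It looks clear today.",
    "play_music": lambda text: "Playing it (hopefully not a weird cover).",
}

def assistant(audio_bytes):
    text = transcribe(audio_bytes)
    intent = parse_intent(text)
    if intent is None or intent not in HANDLERS:
        # The gap: anything nobody has hand-built yet falls through here.
        return "Sorry, I can't help with that yet."
    return HANDLERS[intent](text)

print(assistant(b"fake audio"))  # prints "Timer set."
```

The machine learning lives in the first two functions; the dictionary of handlers is the part that, as discussed below, has to be written one entry at a time and therefore does not scale.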
Machine learning 1170 00:54:16,920 --> 00:54:18,759 Speaker 1: means you can now... and we've all kind of 1171 00:54:18,800 --> 00:54:20,799 Speaker 1: experienced this: say you phone the airline and they say, 1172 00:54:20,880 --> 00:54:22,319 Speaker 1: tell us what you want to ask about, and there's 1173 00:54:22,320 --> 00:54:25,200 Speaker 1: only fifteen things you could ask, so they can recognize 1174 00:54:25,200 --> 00:54:27,080 Speaker 1: the text, and then they can work out what 1175 00:54:27,120 --> 00:54:28,759 Speaker 1: you're asking about, and they can route you to the 1176 00:54:28,840 --> 00:54:32,440 Speaker 1: right number inside that. They're pretty awful also. But what 1177 00:54:32,560 --> 00:54:34,560 Speaker 1: we don't have... Machine learning gives us a way of 1178 00:54:34,600 --> 00:54:38,719 Speaker 1: automating the transcription and getting the natural language processing to work. 1179 00:54:39,239 --> 00:54:41,600 Speaker 1: But you might ask it any one of three or 1180 00:54:41,600 --> 00:54:45,600 Speaker 1: four million things. And so effectively, what I mean when 1181 00:54:45,640 --> 00:54:48,040 Speaker 1: I say a structured query is you can fill in 1182 00:54:48,040 --> 00:54:50,719 Speaker 1: a dialog box by talking to the computer, and the 1183 00:54:50,719 --> 00:54:53,920 Speaker 1: computer can work out which dialog box you were asking for. 1184 00:54:54,840 --> 00:54:58,440 Speaker 1: But somebody has to have made the dialog box by hand. Okay, 1185 00:54:58,520 --> 00:55:00,960 Speaker 1: so, you know, if you're a phone company, 1186 00:55:01,080 --> 00:55:04,160 Speaker 1: or if you're a phone manufacturer, and you have a 1187 00:55:04,200 --> 00:55:07,680 Speaker 1: few million or maybe even a few billion queries 1188 00:55:07,800 --> 00:55:10,839 Speaker 1: to use as your frame of reference, then you can 1189 00:55:10,920 --> 00:55:15,160 Speaker 1: make the top fifty categories, or the top thousand categories. Well, 1190 00:55:15,400 --> 00:55:17,920 Speaker 1: but if you're gonna ask something rarer, the curve 1191 00:55:18,000 --> 00:55:21,799 Speaker 1: gets really, really steep really quickly, and you don't know 1192 00:55:21,840 --> 00:55:26,120 Speaker 1: what you can ask. So you can 1193 00:55:26,160 --> 00:55:28,239 Speaker 1: get the top five things, and you can say you 1194 00:55:28,239 --> 00:55:30,640 Speaker 1: can ask for weather, you can ask for a timer, you 1195 00:55:30,680 --> 00:55:32,960 Speaker 1: can ask for a unit conversion, you can ask for 1196 00:55:32,960 --> 00:55:35,239 Speaker 1: the time in a different city, you can ask it 1197 00:55:35,280 --> 00:55:37,160 Speaker 1: to play music and you'll get the music right roughly 1198 00:55:37,200 --> 00:55:38,919 Speaker 1: half of the time. This is, you know, you ask 1199 00:55:38,960 --> 00:55:40,920 Speaker 1: it to play an album and it plays you like 1200 00:55:41,000 --> 00:55:44,520 Speaker 1: a weird cover of it because it doesn't understand. Um. 1201 00:55:44,560 --> 00:55:47,920 Speaker 1: But then I was like, what are the thousand things 1202 00:55:48,000 --> 00:55:51,399 Speaker 1: you might ask your assistant to do? And how many 1203 00:55:51,400 --> 00:55:55,040 Speaker 1: of those get really complex really quickly? Like, can you 1204 00:55:55,080 --> 00:55:59,000 Speaker 1: rebook my meeting this afternoon?
Okay, that's not a simple query, 1205 00:55:59,280 --> 00:56:01,560 Speaker 1: that's like five hundred things that you need to know 1206 00:56:02,719 --> 00:56:05,239 Speaker 1: to do that. And so the problem is, as I said, what 1207 00:56:05,280 --> 00:56:07,120 Speaker 1: you have with a voice assistant is you have an 1208 00:56:07,120 --> 00:56:09,759 Speaker 1: IVR that will always understand what you asked. Machine learning 1209 00:56:09,800 --> 00:56:11,960 Speaker 1: means it will always correctly route you to the right number. 1210 00:56:12,480 --> 00:56:14,400 Speaker 1: What it doesn't have is a way to 1211 00:56:14,440 --> 00:56:17,600 Speaker 1: automate what that person at the 1212 00:56:17,680 --> 00:56:20,200 Speaker 1: number will do. And so you have to create those 1213 00:56:20,200 --> 00:56:22,360 Speaker 1: dialog boxes one by one. So, to give you an example: 1214 00:56:22,880 --> 00:56:25,600 Speaker 1: Siri learns how to do cricket. It didn't learn it; 1215 00:56:25,760 --> 00:56:28,840 Speaker 1: somebody at Apple sat down and wrote the cricket module. Okay, 1216 00:56:28,880 --> 00:56:32,960 Speaker 1: now Siri can do hurling. Okay, well, somebody 1217 00:56:33,040 --> 00:56:35,279 Speaker 1: at Apple had to sit down and write that. And 1218 00:56:35,320 --> 00:56:38,799 Speaker 1: then Siri can do call me an Uber. Right, well, 1219 00:56:38,840 --> 00:56:40,840 Speaker 1: someone at Uber had to write that. And then it 1220 00:56:40,880 --> 00:56:44,560 Speaker 1: can do, um, tell me whether my flight is delayed. Right, 1221 00:56:44,640 --> 00:56:47,440 Speaker 1: someone had to write that. And every single one of those, 1222 00:56:47,719 --> 00:56:49,920 Speaker 1: some human being has to sit down and spend an 1223 00:56:49,960 --> 00:56:52,400 Speaker 1: afternoon writing. And the problem with this is it 1224 00:56:52,440 --> 00:56:55,440 Speaker 1: doesn't scale. And in a sense, if you could do that, 1225 00:56:55,480 --> 00:56:58,000 Speaker 1: you'd have made HAL 9000, and HAL 9000 1226 00:56:58,120 --> 00:57:00,359 Speaker 1: is not the aggregate of, like, someone sitting 1227 00:57:00,400 --> 00:57:02,520 Speaker 1: and writing those one at a time. That's that other 1228 00:57:02,600 --> 00:57:05,160 Speaker 1: thing; that's machine learning and big data and a bunch 1229 00:57:05,160 --> 00:57:07,200 Speaker 1: of things. So that's the gap. You know, we have 1230 00:57:07,239 --> 00:57:09,440 Speaker 1: a way to get the transcription and the NLP to 1231 00:57:09,560 --> 00:57:11,480 Speaker 1: work completely accurately. What we don't have is a 1232 00:57:11,520 --> 00:57:14,479 Speaker 1: way to automatically make a system that could 1233 00:57:14,480 --> 00:57:18,280 Speaker 1: answer any question you could possibly ask, without a human being 1234 00:57:18,320 --> 00:57:20,480 Speaker 1: having to think of what all of those would be. So 1235 00:57:20,680 --> 00:57:24,040 Speaker 1: let me answer this 1236 00:57:24,480 --> 00:57:30,040 Speaker 1: question another way. So imagine in the nineteenth century someone 1237 00:57:30,040 --> 00:57:33,320 Speaker 1: tries to make a mechanical horse. I mean, 1238 00:57:33,320 --> 00:57:35,840 Speaker 1: you've seen the patent drawings. There's no law of 1239 00:57:35,840 --> 00:57:38,640 Speaker 1: physics that says you can't make a mechanical horse.
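The "somebody has to sit down and write each one" problem described above can be sketched as a registry of hand-written handlers: every new capability is another entry a human has to add, and anything outside the registry falls straight through. This is an illustrative sketch under those assumptions, not how Siri or Alexa are actually structured internally.

```python
# A hypothetical skills registry: each entry exists only because some person
# spent an afternoon writing it. Nothing here generalizes to the next request.
SKILLS = {}

def skill(name):
    def register(fn):
        SKILLS[name] = fn
        return fn
    return register

@skill("cricket_score")
def cricket_score(query=None):       # hand-written by someone who cared about cricket
    return "England are 214 for 3."  # placeholder response

@skill("call_ride")
def call_ride(query=None):           # hand-written by the ride-hailing company
    return "Requesting a car to your location."

@skill("flight_status")
def flight_status(query=None):       # hand-written by someone at the airline
    return "Your flight is on time."

def answer(intent, query=None):
    handler = SKILLS.get(intent)
    if handler is None:
        # The long tail: hurling, rebooking a meeting, and a few million other
        # requests that nobody has written a module for yet.
        return "Sorry, I can't help with that."
    return handler(query)

print(answer("cricket_score"))
print(answer("hurling_score"))  # falls through until a human writes it
```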
It's 1240 00:57:38,680 --> 00:57:41,200 Speaker 1: just that the degree of complexity required to get that 1241 00:57:41,240 --> 00:57:43,520 Speaker 1: to work was impossible in the nineteenth century, and arguably 1242 00:57:43,520 --> 00:57:45,480 Speaker 1: it's impossible now. I mean, Boston Dynamics are now getting 1243 00:57:45,520 --> 00:57:47,000 Speaker 1: pretty close, but that's like a hundred 1244 00:57:47,000 --> 00:57:49,760 Speaker 1: and fifty years later. Um, and so what 1245 00:57:49,800 --> 00:57:51,360 Speaker 1: you could actually do is you could make a bicycle 1246 00:57:51,400 --> 00:57:52,960 Speaker 1: and a steam engine, and a bicycle is a lot 1247 00:57:52,960 --> 00:57:54,720 Speaker 1: simpler than a horse, and it can't do everything that 1248 00:57:54,720 --> 00:57:55,920 Speaker 1: a horse can do, but it turns out that was 1249 00:57:55,960 --> 00:57:58,520 Speaker 1: kind of useful anyway. It's the same thing with AI. 1250 00:57:58,640 --> 00:58:00,880 Speaker 1: What people tried to do with AI until a couple 1251 00:58:00,880 --> 00:58:02,760 Speaker 1: of years ago was they would try and write rules. 1252 00:58:03,520 --> 00:58:05,800 Speaker 1: So if you wanted to recognize a cat, what you 1253 00:58:05,880 --> 00:58:08,200 Speaker 1: do is you do edge detection, and you do texture 1254 00:58:08,240 --> 00:58:10,479 Speaker 1: analysis, and you try and make something that looks 1255 00:58:10,480 --> 00:58:12,600 Speaker 1: for two eyes in roughly the right place relative to 1256 00:58:12,640 --> 00:58:14,600 Speaker 1: two ears, and you try and make something that tries 1257 00:58:14,640 --> 00:58:16,960 Speaker 1: to look for legs, and hopefully you try and 1258 00:58:17,000 --> 00:58:18,160 Speaker 1: work out how the hell are we going to tell 1259 00:58:18,160 --> 00:58:20,160 Speaker 1: the difference between a cat and a dog. And it 1260 00:58:20,200 --> 00:58:23,240 Speaker 1: would sort of work, like, three quarters of the time; 1261 00:58:23,520 --> 00:58:25,400 Speaker 1: it would never really work, for the same reason that 1262 00:58:25,440 --> 00:58:28,840 Speaker 1: the mechanical horse would sort of work but never actually work. 1263 00:58:29,320 --> 00:58:31,120 Speaker 1: And then what machine learning does is say, no, no, no, 1264 00:58:31,120 --> 00:58:33,840 Speaker 1: no, no. You give it a million pictures labeled cat and 1265 00:58:33,840 --> 00:58:35,680 Speaker 1: a million pictures labeled dog, and you let the 1266 00:58:35,720 --> 00:58:38,520 Speaker 1: machine write the rules. That's what machine learning means. The 1267 00:58:38,600 --> 00:58:41,760 Speaker 1: machine generates a million IF statements that will allow it to 1268 00:58:41,800 --> 00:58:44,120 Speaker 1: calculate the difference between a cat picture and a dog 1269 00:58:44,160 --> 00:58:50,000 Speaker 1: picture with ninety two point seven percent accuracy. And that 1270 00:58:50,320 --> 00:58:52,840 Speaker 1: gives us a breakthrough on a whole class of problem. 1271 00:58:52,960 --> 00:58:56,880 Speaker 1: It solves image recognition, it solves speech, it solves language, a 1272 00:58:56,920 --> 00:58:59,920 Speaker 1: whole kind of level of pattern recognition. What it doesn't 1273 00:59:00,040 --> 00:59:02,920 Speaker 1: do is give you a human ten-year-old. It 1274 00:59:02,960 --> 00:59:06,200 Speaker 1: doesn't give you general intelligence.
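The contrast between hand-written rules and learned ones can be shown in a few lines. Here a simple scikit-learn classifier is fit on labeled examples and writes its own decision rule; randomly generated feature vectors stand in for real cat and dog images, so the accuracy number is meaningless except as an illustration of the technique.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for image features: nobody writes rules about eyes or ears here,
# the two classes just have slightly different statistics.
cats = rng.normal(loc=0.0, scale=1.0, size=(1000, 20))
dogs = rng.normal(loc=0.7, scale=1.0, size=(1000, 20))
X = np.vstack([cats, dogs])
y = np.array([0] * 1000 + [1] * 1000)  # 0 = cat, 1 = dog

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Let the machine write the rules": fit a model on the labeled examples.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# It does exactly one thing, score cat-versus-dog; it has no idea what a cat is.
print("held-out accuracy:", round(model.score(X_test, y_test), 3))
```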
Basically, what you've built is 1275 00:59:06,240 --> 00:59:10,920 Speaker 1: you've built an enormous spreadsheet with a million IF statements 1276 00:59:11,080 --> 00:59:13,920 Speaker 1: meshed together and linked together. And you can give this 1277 00:59:14,000 --> 00:59:16,800 Speaker 1: spreadsheet a picture, and 1278 00:59:16,800 --> 00:59:18,280 Speaker 1: it will tell you whether there's a cat in it or not, 1279 00:59:18,600 --> 00:59:21,840 Speaker 1: and that's all it will do. So another way of 1280 00:59:21,880 --> 00:59:23,760 Speaker 1: thinking about this is, like, we have this fantasy of 1281 00:59:23,800 --> 00:59:25,720 Speaker 1: domestic robots that will walk around your home and do 1282 00:59:25,760 --> 00:59:28,400 Speaker 1: the housework. We have domestic robots; it's called a washing machine. 1283 00:59:29,160 --> 00:59:31,400 Speaker 1: That's what machine learning gives you. It gives you a washing machine. 1284 00:59:32,040 --> 00:59:35,120 Speaker 1: It gives you a washing machine that can recognize family pictures. 1285 00:59:35,160 --> 00:59:37,280 Speaker 1: It gives you a washing machine that can tell you, is 1286 00:59:37,280 --> 00:59:39,600 Speaker 1: there strange behavior happening on this network? It gives you a 1287 00:59:39,600 --> 00:59:42,240 Speaker 1: washing machine that can recognize handwriting. It gives you a washing 1288 00:59:42,240 --> 00:59:44,520 Speaker 1: machine that can do X. But it can only do 1289 00:59:44,600 --> 00:59:46,800 Speaker 1: that one thing. It can't do anything else, and it 1290 00:59:46,840 --> 00:59:49,960 Speaker 1: can't clean, it can't wash your dishes. And so that's 1291 00:59:49,960 --> 00:59:52,200 Speaker 1: what machine learning gives you: it gives you this step change 1292 00:59:52,200 --> 00:59:54,560 Speaker 1: in capability. So, I'm kind of, 1293 00:59:54,600 --> 00:59:56,280 Speaker 1: I'm sort of giving you a stack of metaphors here. 1294 00:59:56,280 --> 00:59:57,680 Speaker 1: I think a really good way to think about what 1295 00:59:57,720 --> 01:00:00,680 Speaker 1: machine learning gives us is to think about relational databases. 1296 01:00:01,400 --> 01:00:04,280 Speaker 1: Relational databases gave you this amazing step change 1297 01:00:04,280 --> 01:00:06,919 Speaker 1: in what you can do with computers. Suddenly you can say, 1298 01:00:06,960 --> 01:00:10,560 Speaker 1: show me X by Y. Bloomberg would not exist 1299 01:00:10,600 --> 01:00:13,720 Speaker 1: without this, SAP would not exist without this, just-in-time 1300 01:00:13,720 --> 01:00:16,600 Speaker 1: supply chains would not exist without this. It totally transformed our world. 1301 01:00:17,200 --> 01:00:22,000 Speaker 1: It's behind everything you use now, but it's not AI. It is? 1302 01:00:22,040 --> 01:00:24,400 Speaker 1: Well, if that's AI, then everything is AI. 1303 01:00:25,360 --> 01:00:27,560 Speaker 1: It doesn't get you HAL 9000. Let me jump 1304 01:00:27,600 --> 01:00:31,120 Speaker 1: into some of my favorite questions. Tell us the most 1305 01:00:31,200 --> 01:00:34,800 Speaker 1: important thing that people who work with you don't know 1306 01:00:34,880 --> 01:00:39,120 Speaker 1: about you. I hate personal questions. Is that true? I 1307 01:00:39,160 --> 01:00:40,720 Speaker 1: don't know. I kind of look at this, 1308 01:00:40,720 --> 01:00:42,680 Speaker 1: this email, and I'm puzzled by these kinds of questions.
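Returning to the "show me X by Y" point above, that step change is easy to see with the relational database that ships in Python's standard library; the table and the numbers here are made up purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("EMEA", "terminal", 120.0),
        ("EMEA", "data", 80.0),
        ("Americas", "terminal", 200.0),
        ("Americas", "data", 50.0),
    ],
)

# "Show me revenue by region": the kind of question relational databases made
# routine, the way machine learning now handles a different class of problem
# (recognition rather than retrieval).
for region, total in conn.execute(
    "SELECT region, SUM(revenue) FROM orders GROUP BY region"
):
    print(region, total)
```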
1309 01:00:42,680 --> 01:00:44,360 Speaker 1: I don't know, what is it that the people who work 1310 01:00:44,400 --> 01:00:46,000 Speaker 1: with me have no idea about? They probably don't know 1311 01:00:46,080 --> 01:00:49,160 Speaker 1: I have a dog. There you go. Well, because sometimes it's 1312 01:00:49,200 --> 01:00:53,120 Speaker 1: revealing; some people have told us deep dark secrets from 1313 01:00:53,320 --> 01:00:57,480 Speaker 1: decades ago. And there is a joke that the 1314 01:00:57,480 --> 01:01:00,040 Speaker 1: most secure form of encryption is anything you say in 1315 01:01:00,080 --> 01:01:02,440 Speaker 1: the second hour of a podcast. Here we are. I 1316 01:01:02,480 --> 01:01:06,040 Speaker 1: can say it now, and it's you, me, and a 1317 01:01:06,080 --> 01:01:08,120 Speaker 1: handful of friends that are hearing it; nobody else is 1318 01:01:08,160 --> 01:01:10,840 Speaker 1: gonna hear. Um, tell us about your early mentors who 1319 01:01:10,880 --> 01:01:14,960 Speaker 1: affected the way you looked at technology and venture investing. 1320 01:01:15,840 --> 01:01:18,560 Speaker 1: I would say it's not really about how I look 1321 01:01:18,600 --> 01:01:21,919 Speaker 1: at technology. It's more sort of mental processes and ways 1322 01:01:21,960 --> 01:01:24,840 Speaker 1: of thinking about things and ways to try and answer questions. 1323 01:01:25,400 --> 01:01:27,760 Speaker 1: And so the stuff I did at university, you know, 1324 01:01:27,800 --> 01:01:30,400 Speaker 1: people talking to me about medieval history, and it's not, 1325 01:01:30,640 --> 01:01:32,919 Speaker 1: you know, this happened and this didn't happen. 1326 01:01:33,360 --> 01:01:36,120 Speaker 1: It's more, okay, how do you actually understand what we're 1327 01:01:36,120 --> 01:01:38,400 Speaker 1: trying to think about here? So, like, I remember, like, 1328 01:01:38,440 --> 01:01:41,160 Speaker 1: my first or second... you know, the first. The 1329 01:01:41,200 --> 01:01:42,680 Speaker 1: way Cambridge history works is you 1330 01:01:42,960 --> 01:01:44,480 Speaker 1: get given an essay title and a reading list, 1331 01:01:44,520 --> 01:01:45,960 Speaker 1: and you come back in a week with your essay, 1332 01:01:46,120 --> 01:01:49,600 Speaker 1: and that's it. You don't have to go to lectures. And 1333 01:01:49,760 --> 01:01:51,440 Speaker 1: so, I think my first essay was the one that, 1334 01:01:51,480 --> 01:01:53,480 Speaker 1: of course, everyone screws up, and it was, was 1335 01:01:53,560 --> 01:01:56,280 Speaker 1: the King of England more or less powerful after 1336 01:01:56,440 --> 01:01:59,560 Speaker 1: 1066? And, you know, I gave the long answer, 1337 01:01:59,560 --> 01:02:01,360 Speaker 1: and my professor said, look, you have to think about 1338 01:02:01,400 --> 01:02:03,080 Speaker 1: what does it actually mean to be powerful? 1339 01:02:03,120 --> 01:02:04,760 Speaker 1: And power is the ability to get people to do 1340 01:02:04,840 --> 01:02:06,880 Speaker 1: something they don't want to do. And so you can 1341 01:02:06,880 --> 01:02:09,400 Speaker 1: give all this other stuff about all the sophistication of 1342 01:02:09,480 --> 01:02:11,800 Speaker 1: the Anglo-Saxon monarchy, and they had their own much 1343 01:02:11,800 --> 01:02:15,560 Speaker 1: more sophisticated coinage and blah blah blah.
No, that was 1344 01:02:15,600 --> 01:02:17,640 Speaker 1: two years later. But, like, is he 1345 01:02:17,680 --> 01:02:20,120 Speaker 1: actually more powerful? William the Conqueror can tell people to 1346 01:02:20,160 --> 01:02:21,320 Speaker 1: do stuff and they have to do it, and the 1347 01:02:21,400 --> 01:02:23,800 Speaker 1: Saxon king, the king before the conquest, the King 1348 01:02:23,840 --> 01:02:26,200 Speaker 1: of England, like, no one really paid any attention to 1349 01:02:26,240 --> 01:02:28,440 Speaker 1: what he said, apart from the complicated coinage. 1350 01:02:28,440 --> 01:02:31,320 Speaker 1: So my point is, it's not so much that 1351 01:02:31,440 --> 01:02:34,360 Speaker 1: someone told me this is how to think about technology. 1352 01:02:34,920 --> 01:02:38,120 Speaker 1: It's more how to sort of think systematically and try 1353 01:02:38,160 --> 01:02:40,600 Speaker 1: and break up problems and try and look for pieces 1354 01:02:40,640 --> 01:02:43,200 Speaker 1: and components and try and look at 1355 01:02:43,240 --> 01:02:46,760 Speaker 1: things in those kinds of ways, rather than anything personal. Tell us 1356 01:02:46,760 --> 01:02:50,720 Speaker 1: about some of your favorite books, be they fiction, nonfiction, technology, 1357 01:02:50,840 --> 01:02:53,800 Speaker 1: or what have you. Um, I don't know. I'm sort 1358 01:02:53,840 --> 01:02:56,600 Speaker 1: of... I'm always reading and I'm always buying books, and 1359 01:02:56,600 --> 01:02:59,240 Speaker 1: those two processes are completely independent. I read almost 1360 01:02:59,240 --> 01:03:02,479 Speaker 1: no business books or technology books, partly because I feel 1361 01:03:02,480 --> 01:03:04,800 Speaker 1: that all business books are basically a short magazine article 1362 01:03:04,840 --> 01:03:08,880 Speaker 1: padded out to... There are some authors in particular I 1363 01:03:08,880 --> 01:03:11,880 Speaker 1: could name. Well, that's their entire oeuvre. Well, if you 1364 01:03:11,960 --> 01:03:14,000 Speaker 1: feel like, if you've written fifteen business books on the 1365 01:03:14,040 --> 01:03:16,160 Speaker 1: same theme, then, like, you're either a very bad writer 1366 01:03:16,280 --> 01:03:18,600 Speaker 1: or a very good writer, depending. Or a very good marketer. 1367 01:03:18,920 --> 01:03:22,000 Speaker 1: Well, that too, perhaps. No, I think I read 1368 01:03:22,040 --> 01:03:24,360 Speaker 1: things that intrigue me and interest me. The last thing 1369 01:03:24,400 --> 01:03:27,880 Speaker 1: I read was the Iliad, which I'd never read before. Wait, 1370 01:03:27,920 --> 01:03:31,120 Speaker 1: the history major never read the Iliad? Well, so there's 1371 01:03:31,120 --> 01:03:33,200 Speaker 1: this great line from Italo Calvino where he says 1372 01:03:33,280 --> 01:03:35,280 Speaker 1: something like, the number of essential works is 1373 01:03:35,320 --> 01:03:37,240 Speaker 1: so great that nobody, no matter how much they're reading, 1374 01:03:37,280 --> 01:03:40,280 Speaker 1: has read more than a fraction. Um, you know, literature 1375 01:03:40,280 --> 01:03:42,680 Speaker 1: is a sample; you can't read all the books. There 1376 01:03:42,720 --> 01:03:44,160 Speaker 1: was a period when you could have seen all the movies. 1377 01:03:44,160 --> 01:03:45,560 Speaker 1: Now you can't see all the movies, you can't listen 1378 01:03:45,600 --> 01:03:47,960 Speaker 1: to all the music.
You just have to take samples 1379 01:03:48,040 --> 01:03:50,280 Speaker 1: and do things that stimulate you or make you 1380 01:03:50,320 --> 01:03:53,200 Speaker 1: think in interesting ways, particularly things that aren't about what 1381 01:03:53,240 --> 01:03:57,000 Speaker 1: you work on. So give us another example, something that's 1382 01:03:57,160 --> 01:04:01,000 Speaker 1: newer than two thousand years old. Um, I don't know. 1383 01:04:01,800 --> 01:04:04,320 Speaker 1: By the way, this is everybody's favorite question, the one 1384 01:04:04,600 --> 01:04:07,840 Speaker 1: I get... I read books and then I can't remember 1385 01:04:07,880 --> 01:04:09,400 Speaker 1: what the last book I 1386 01:04:09,480 --> 01:04:11,920 Speaker 1: read was. Um, what stands out to you, what have you 1387 01:04:12,000 --> 01:04:14,520 Speaker 1: read recently? Really, the Iliad really stood out for me. 1388 01:04:14,600 --> 01:04:15,920 Speaker 1: Then, a couple of weeks ago, 1389 01:04:15,960 --> 01:04:20,280 Speaker 1: I read this fascinating book about medieval manuscripts, called Meetings 1390 01:04:20,360 --> 01:04:23,600 Speaker 1: with Remarkable Manuscripts or something like that, by a college librarian 1391 01:04:23,600 --> 01:04:25,880 Speaker 1: in Cambridge. And it's one of these sort of new 1392 01:04:26,040 --> 01:04:27,920 Speaker 1: kind of genre of books which kind of look like 1393 01:04:27,920 --> 01:04:30,240 Speaker 1: a big hardback nonfiction book, but they're in color 1394 01:04:30,280 --> 01:04:31,720 Speaker 1: all the way through, which is partly to do with 1395 01:04:31,760 --> 01:04:34,240 Speaker 1: the development of printing technology; there are loads 1396 01:04:34,240 --> 01:04:36,080 Speaker 1: of books now that have color illustrations all the way through 1397 01:04:36,080 --> 01:04:37,960 Speaker 1: but aren't art books. It's kind of interesting, 1398 01:04:38,120 --> 01:04:40,000 Speaker 1: and it's just sort of about how books evolved. So 1399 01:04:40,000 --> 01:04:44,360 Speaker 1: there's this kind of fascinating observation that books, or codices, used 1400 01:04:44,400 --> 01:04:46,800 Speaker 1: to be square because you made them out of papyrus. 1401 01:04:46,800 --> 01:04:48,800 Speaker 1: So you would lay your sheets, you know, you lay 1402 01:04:48,840 --> 01:04:52,160 Speaker 1: the strips of plant kind of crossways like plywood, 1403 01:04:52,200 --> 01:04:53,920 Speaker 1: and you'd make them square shaped because you had the 1404 01:04:54,520 --> 01:04:56,080 Speaker 1: strips all the same length, so that produced a 1405 01:04:56,120 --> 01:05:00,720 Speaker 1: square-shaped piece of papyrus. And then you moved to, um, 1406 01:05:00,720 --> 01:05:03,640 Speaker 1: what's it called, animal skin? What's the word? I've forgotten 1407 01:05:03,640 --> 01:05:06,439 Speaker 1: the word. My mind has gone blank now too. Um, 1408 01:05:06,480 --> 01:05:10,240 Speaker 1: parchment. You moved to parchment, from animal skins. And there's this 1409 01:05:10,280 --> 01:05:13,840 Speaker 1: great line: animals tend to be oblong, so you get 1410 01:05:13,880 --> 01:05:16,560 Speaker 1: oblong pieces, and then you fold them up, and that 1411 01:05:16,600 --> 01:05:19,080 Speaker 1: produces an oblong book. And so when you go from 1412 01:05:19,120 --> 01:05:22,040 Speaker 1: papyrus to parchment, you go from square books to oblong books.
1413 01:05:22,400 --> 01:05:24,400 Speaker 1: And here we are, all these years later, and books are 1414 01:05:24,400 --> 01:05:27,800 Speaker 1: still oblong, and that's just the residual of the earlier 1415 01:05:27,840 --> 01:05:31,440 Speaker 1: technology of parchment. Yeah, exactly. I think those sorts of... 1416 01:05:31,520 --> 01:05:33,600 Speaker 1: I mean, coming back to, you know, why is this interesting? 1417 01:05:33,600 --> 01:05:36,720 Speaker 1: I do think those sorts of how did stuff happen, 1418 01:05:36,840 --> 01:05:39,160 Speaker 1: and what's the thread of causation, and 1419 01:05:39,200 --> 01:05:42,280 Speaker 1: how things can be completely random, are sort of interesting, 1420 01:05:42,280 --> 01:05:44,680 Speaker 1: and how does this stuff change how we 1421 01:05:44,720 --> 01:05:46,600 Speaker 1: think about it. So on the flight here, I read 1422 01:05:46,640 --> 01:05:48,920 Speaker 1: a book about kind of lighting tech, the evolution of 1423 01:05:48,960 --> 01:05:51,240 Speaker 1: lighting technology in the eighteenth and nineteenth century, as we 1424 01:05:51,280 --> 01:05:53,360 Speaker 1: go from candles, to kind of lamps that 1425 01:05:53,440 --> 01:05:55,360 Speaker 1: hold the candle and produce five times more light, 1426 01:05:55,440 --> 01:05:58,160 Speaker 1: to gas lights, to, 1427 01:05:58,560 --> 01:06:01,520 Speaker 1: um, early electric lights, which weren't very bright, and 1428 01:06:01,520 --> 01:06:03,920 Speaker 1: then you use tungsten filaments. And again, so there's the evolution 1429 01:06:03,960 --> 01:06:06,400 Speaker 1: of light. There's this observation that stuck in my mind, and 1430 01:06:06,400 --> 01:06:08,200 Speaker 1: I now need to go and check, which is 1431 01:06:08,240 --> 01:06:12,640 Speaker 1: an assertion that seventeenth-century Dutch pictures have windows 1432 01:06:12,680 --> 01:06:16,240 Speaker 1: but no curtains, because why would you keep the light out? 1433 01:06:16,400 --> 01:06:18,040 Speaker 1: Everybody wants the light, and you want the light in. 1434 01:06:18,160 --> 01:06:19,600 Speaker 1: And of course, as we all sort of know, light 1435 01:06:19,680 --> 01:06:22,320 Speaker 1: was really expensive and used to be a luxury, and 1436 01:06:22,320 --> 01:06:24,240 Speaker 1: now it's not. But I just thought that was a 1437 01:06:24,280 --> 01:06:27,440 Speaker 1: really revealing way of thinking about how your whole sense 1438 01:06:27,480 --> 01:06:30,200 Speaker 1: of the world changes when lighting is cheap. Why would 1439 01:06:30,240 --> 01:06:33,560 Speaker 1: you have curtains? That's quite fascinating. The book How 1440 01:06:33,600 --> 01:06:36,720 Speaker 1: We Got to Now covers six inventions, one of 1441 01:06:36,760 --> 01:06:40,240 Speaker 1: which is lighting, and the process of watching the price 1442 01:06:40,400 --> 01:06:45,320 Speaker 1: plummet over the centuries is really fascinating. 1443 01:06:45,640 --> 01:06:47,600 Speaker 1: All right. The interesting thing that comes out of that 1444 01:06:47,720 --> 01:06:50,600 Speaker 1: is that when light was expensive, the status symbol was 1445 01:06:50,680 --> 01:06:53,320 Speaker 1: to stay up all night. So rich people would get 1446 01:06:53,360 --> 01:06:55,280 Speaker 1: up at lunchtime and go to bed at five 1447 01:06:55,280 --> 01:06:57,200 Speaker 1: in the morning because they could afford to, because they 1448 01:06:57,200 --> 01:07:01,080 Speaker 1: could afford candles.
Everybody else gets up with the rooster. Right. Um, 1449 01:07:01,120 --> 01:07:02,280 Speaker 1: what do you do for fun? What do you do 1450 01:07:02,360 --> 01:07:06,760 Speaker 1: to relax outside of the office in San Francisco? Um, 1451 01:07:06,880 --> 01:07:10,240 Speaker 1: so I have a seven-year-old and a dog, 1452 01:07:10,760 --> 01:07:15,000 Speaker 1: so my time is not my own. Understood completely. What 1453 01:07:15,120 --> 01:07:17,560 Speaker 1: sort of advice would you give to a millennial or 1454 01:07:17,640 --> 01:07:21,920 Speaker 1: someone who just graduated college that was interested either in 1455 01:07:21,920 --> 01:07:26,640 Speaker 1: a career in technology or in venture capital? So I 1456 01:07:27,240 --> 01:07:29,200 Speaker 1: sort of struggle to give advice, partly because my 1457 01:07:29,280 --> 01:07:32,400 Speaker 1: career has been a series of random lurches and kind 1458 01:07:32,400 --> 01:07:36,120 Speaker 1: of companies going out of business. I won't apply for 1459 01:07:36,120 --> 01:07:38,880 Speaker 1: a job there, then. Don't worry. I think the 1460 01:07:38,920 --> 01:07:42,680 Speaker 1: only thing I've sort of observed is, there are, um, 1461 01:07:42,720 --> 01:07:44,919 Speaker 1: there are the kinds of things that you are good 1462 01:07:44,960 --> 01:07:47,320 Speaker 1: at doing and the kinds of mental processes that you 1463 01:07:47,360 --> 01:07:49,880 Speaker 1: are good at. Are you good at people or not? 1464 01:07:50,000 --> 01:07:53,040 Speaker 1: Are you good at process and project management and making 1465 01:07:53,040 --> 01:07:55,520 Speaker 1: sure everything gets done right? Or is that not what 1466 01:07:55,600 --> 01:07:58,320 Speaker 1: interests you? Are you good at trying to kind 1467 01:07:58,360 --> 01:08:01,240 Speaker 1: of explain and empathize with people? Are you good 1468 01:08:01,240 --> 01:08:04,480 Speaker 1: at trying to explain and discuss and argue with people 1469 01:08:04,600 --> 01:08:08,160 Speaker 1: and persuade people? Um, what are the things that you're 1470 01:08:08,200 --> 01:08:10,400 Speaker 1: good at doing? And those are not necessarily what 1471 01:08:10,440 --> 01:08:14,080 Speaker 1: your degree was in, and they aren't necessarily specific to jobs. 1472 01:08:14,200 --> 01:08:17,000 Speaker 1: So, like, I have contemporaries who went off and became 1473 01:08:17,200 --> 01:08:19,519 Speaker 1: what in England we call a barrister, which is basically a 1474 01:08:19,600 --> 01:08:21,760 Speaker 1: lawyer who only goes to court. In England, 1475 01:08:21,800 --> 01:08:23,639 Speaker 1: law is split: solicitors do the paperwork and barristers 1476 01:08:23,680 --> 01:08:25,960 Speaker 1: get called to the bar. Exactly, that's 1477 01:08:25,960 --> 01:08:28,880 Speaker 1: effectively it, and it's a completely different profession. Um, and 1478 01:08:28,920 --> 01:08:31,360 Speaker 1: I have contemporaries who are barristers, and, you know, I 1479 01:08:31,360 --> 01:08:33,360 Speaker 1: could have done that.
I could have gone and argued 1480 01:08:33,360 --> 01:08:34,559 Speaker 1: for a living, and, you know, you kind of 1481 01:08:34,600 --> 01:08:35,720 Speaker 1: go and look at the problem and you 1482 01:08:35,760 --> 01:08:37,240 Speaker 1: pull it apart and you work out what story to 1483 01:08:37,280 --> 01:08:39,160 Speaker 1: tell and how to fit it to the law and 1484 01:08:39,160 --> 01:08:41,920 Speaker 1: try and persuade the judge and the jury. Um, so, 1485 01:08:42,080 --> 01:08:44,240 Speaker 1: in a sense, I have those skill sets. On the 1486 01:08:44,280 --> 01:08:46,840 Speaker 1: other hand, I look at these contemporaries and it says, 1487 01:08:46,960 --> 01:08:51,080 Speaker 1: you know, um, Charlie is probably the leading 1488 01:08:51,120 --> 01:08:54,479 Speaker 1: up-and-coming advocate in complex maritime insurance disputes, and 1489 01:08:54,520 --> 01:08:57,840 Speaker 1: I think, well, I hope he's paid a lot for that. 1490 01:08:59,000 --> 01:09:02,040 Speaker 1: But his interest is in the argument and the explaining 1491 01:09:02,040 --> 01:09:05,720 Speaker 1: and the discussion. The fact that it's insurance is kind 1492 01:09:05,720 --> 01:09:08,439 Speaker 1: of neither here nor there, really. Um, so I think 1493 01:09:08,439 --> 01:09:10,080 Speaker 1: you have to just kind of work out, well, what 1494 01:09:10,280 --> 01:09:12,720 Speaker 1: are the things that fascinate you, and 1495 01:09:12,800 --> 01:09:15,720 Speaker 1: what are the kinds of mental processes that fascinate you? 1496 01:09:16,160 --> 01:09:18,479 Speaker 1: And our final question: what is it that you 1497 01:09:18,520 --> 01:09:21,240 Speaker 1: know about the world of technology that you wish you 1498 01:09:21,320 --> 01:09:26,320 Speaker 1: knew twenty years ago? Other than, like, buy Apple or something? No, no, 1499 01:09:26,439 --> 01:09:29,360 Speaker 1: nothing like that. That's, you know... anyone could look 1500 01:09:29,400 --> 01:09:32,360 Speaker 1: at prices twenty years ago and say, hey, buy Amazon, 1501 01:09:32,560 --> 01:09:36,639 Speaker 1: buy Apple. But generally, in terms of thinking in terms 1502 01:09:36,680 --> 01:09:39,560 Speaker 1: of broad processes, what would have helped you early in 1503 01:09:39,640 --> 01:09:41,760 Speaker 1: your career had you figured it out sooner? So that 1504 01:09:41,920 --> 01:09:46,160 Speaker 1: actually might be a good way of me framing my 1505 01:09:46,400 --> 01:09:50,280 Speaker 1: technology determinism piece, because it's a sort of... I mean, 1506 01:09:50,479 --> 01:09:52,519 Speaker 1: the other thing people talk about a lot in 1507 01:09:52,600 --> 01:09:55,800 Speaker 1: venture capital is pattern recognition: this sort of looks like 1508 01:09:55,880 --> 01:09:58,280 Speaker 1: things that work, this looks like things that 1509 01:09:58,360 --> 01:10:01,000 Speaker 1: never work, and why; and the things that never work, 1510 01:10:01,439 --> 01:10:03,320 Speaker 1: eventually they do work, and that's where there's a 1511 01:10:03,320 --> 01:10:05,519 Speaker 1: hundred-billion-dollar outcome. So that's also an interesting kind of 1512 01:10:05,520 --> 01:10:09,320 Speaker 1: fruitful conversation. But it's sort of understanding these patterns 1513 01:10:09,479 --> 01:10:11,680 Speaker 1: of, okay, you have to think, okay, how is this 1514 01:10:11,720 --> 01:10:15,040 Speaker 1: going to fit within this ecosystem?
What is 1515 01:10:15,040 --> 01:10:18,000 Speaker 1: the timing for this? It's those sorts of mechanics of 1516 01:10:18,080 --> 01:10:23,200 Speaker 1: the process, um, that are interesting. I think that's quite fascinating. 1517 01:10:23,840 --> 01:10:26,760 Speaker 1: We have been speaking with Benedict Evans of Andreessen 1518 01:10:26,760 --> 01:10:29,800 Speaker 1: Horowitz. If you enjoyed this conversation, be sure and 1519 01:10:29,840 --> 01:10:31,559 Speaker 1: look up an inch or down an inch on Apple 1520 01:10:31,640 --> 01:10:37,000 Speaker 1: iTunes or SoundCloud, Overcast, Bloomberg dot com, wherever your finer 1521 01:10:37,080 --> 01:10:39,400 Speaker 1: podcasts are sold, and you can see any of the 1522 01:10:39,439 --> 01:10:42,479 Speaker 1: other two hundred such conversations we've had over the past 1523 01:10:43,040 --> 01:10:46,920 Speaker 1: four years. We love your comments, feedback, and suggestions; write 1524 01:10:47,000 --> 01:10:49,920 Speaker 1: to us at MIB podcast at Bloomberg dot net. 1525 01:10:50,439 --> 01:10:52,320 Speaker 1: I would be remiss if I did not thank the 1526 01:10:52,360 --> 01:10:56,240 Speaker 1: crack staff that helps us put this podcast together each week. 1527 01:10:56,560 --> 01:10:59,599 Speaker 1: Taylor Riggs is our booker slash producer. Medina Parwana 1528 01:11:00,080 --> 01:11:03,640 Speaker 1: is our audio engineer slash producer. Mike Batnik is our 1529 01:11:03,680 --> 01:11:07,599 Speaker 1: head of research. I'm Barry Ritholtz. You're listening to Masters 1530 01:11:07,640 --> 01:11:09,679 Speaker 1: in Business on Bloomberg Radio.