1 00:00:02,240 --> 00:00:06,800 Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio. 2 00:00:09,960 --> 00:00:12,399 Speaker 1: This week on the podcast, I have a special guest. 3 00:00:12,600 --> 00:00:18,840 Speaker 1: His name is Anil Dash. He is a leading technologist, entrepreneur, 4 00:00:18,960 --> 00:00:23,080 Speaker 1: founder in the world of micro publishing and blogging. 5 00:00:23,680 --> 00:00:27,720 Speaker 1: I know his work from back when Six Apart was 6 00:00:27,840 --> 00:00:32,480 Speaker 1: an early pioneer in the world of blogging. They put 7 00:00:32,479 --> 00:00:39,640 Speaker 1: out TypePad, which was really the first robust WYSIWYG, 8 00:00:39,680 --> 00:00:42,919 Speaker 1: what you see is what you get, sort of blogging platform. 9 00:00:43,000 --> 00:00:47,320 Speaker 1: Before that, you had companies like Yahoo GeoCities. My 10 00:00:47,440 --> 00:00:50,919 Speaker 1: first blog was on GeoCities, and I say this, 11 00:00:51,000 --> 00:00:53,319 Speaker 1: it's really not a joke, it's true; I say 12 00:00:53,360 --> 00:00:55,400 Speaker 1: it as a joke, but it's true. I would take 13 00:00:55,440 --> 00:00:58,760 Speaker 1: fifteen, twenty minutes to write something and then two hours 14 00:00:58,800 --> 00:01:01,080 Speaker 1: to code it in HTML in order for it 15 00:01:01,160 --> 00:01:04,200 Speaker 1: to work on GeoCities. You were literally 16 00:01:04,880 --> 00:01:09,880 Speaker 1: coding everything. Every indent, every bold, every underline, every 17 00:01:09,920 --> 00:01:14,760 Speaker 1: paragraph had an HTML code around it. So they reduced 18 00:01:15,160 --> 00:01:19,600 Speaker 1: that process of micro publishing to something that 19 00:01:19,640 --> 00:01:23,720 Speaker 1: very much was like working on a document 20 00:01:23,840 --> 00:01:27,920 Speaker 1: in Microsoft Word.
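(Editor's illustration, not from the broadcast: a minimal sketch of what "coding every bold and every paragraph by hand" meant on a GeoCities-era page, versus what a WYSIWYG button did for you. The function names are invented for the sketch.)

```python
# Hypothetical sketch of the GeoCities-era workflow described above:
# every paragraph and every bold word had to be wrapped in HTML by hand.
# A WYSIWYG editor like TypePad generated this same markup from a button click.

def bold(text: str) -> str:
    """What clicking the 'B' toolbar button produced behind the scenes."""
    return f"<b>{text}</b>"

def hand_coded_page(title: str, paragraphs: list[str]) -> str:
    """The boilerplate a mid-90s author typed around every single post."""
    body = "\n".join(f"<p>{p}</p>" for p in paragraphs)
    return f"<html><head><title>{title}</title></head>\n<body>\n{body}\n</body></html>"

page = hand_coded_page("The Big Picture",
                       ["Markets were up today.", bold("That surprised everyone.")])
print(page)
```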
If you wanted to format it, you 21 00:01:28,040 --> 00:01:31,520 Speaker 1: just clicked the button that said here's your underline, here's 22 00:01:31,520 --> 00:01:34,240 Speaker 1: your italics, here's your footnote, here's your whatever. And it was 23 00:01:34,440 --> 00:01:39,840 Speaker 1: absolutely game changing. It went from a ten 24 00:01:39,959 --> 00:01:44,600 Speaker 1: to one ratio of coding to writing to hundreds 25 00:01:44,600 --> 00:01:46,760 Speaker 1: to one in the opposite direction. You would take a 26 00:01:46,760 --> 00:01:49,760 Speaker 1: half hour to write something and maybe thirty seconds to 27 00:01:49,840 --> 00:01:53,720 Speaker 1: format it. It was easy as pie. My first blog 28 00:01:53,800 --> 00:01:56,520 Speaker 1: was bigpicture dot typepad dot com. It's still 29 00:01:56,640 --> 00:01:59,000 Speaker 1: up there, by the way. Not only did they create 30 00:01:59,080 --> 00:02:02,600 Speaker 1: the software for this, but they also created the hosting 31 00:02:02,680 --> 00:02:06,559 Speaker 1: for it. And I find Six Apart was a game 32 00:02:06,640 --> 00:02:09,800 Speaker 1: changer for the world of publishing. I found this to 33 00:02:09,840 --> 00:02:15,840 Speaker 1: be an absolutely fascinating conversation. If you're into blogging, technology, entrepreneurship, 34 00:02:15,960 --> 00:02:20,440 Speaker 1: venture capital, et cetera, you're gonna find this conversation fascinating. So, 35 00:02:20,520 --> 00:02:27,720 Speaker 1: with no further ado, my conversation with Anil Dash. 36 00:02:28,080 --> 00:02:31,200 Speaker 1: My special guest today is Anil Dash. He was 37 00:02:31,240 --> 00:02:35,760 Speaker 1: an advisor to the Obama White House's Office of Digital Strategy. 38 00:02:36,200 --> 00:02:40,240 Speaker 1: He is a technical advisor to Vox Media and advises the 39 00:02:40,280 --> 00:02:43,760 Speaker 1: company Medium on its publishing.
He is also on the 40 00:02:43,880 --> 00:02:49,000 Speaker 1: National Advisory Council of DonorsChoose. His day job is 41 00:02:49,120 --> 00:02:53,919 Speaker 1: CEO of Fog Creek Software, which has created such products 42 00:02:53,919 --> 00:02:56,760 Speaker 1: as Stack Overflow and Trello. Were they involved in 43 00:02:56,800 --> 00:03:00,000 Speaker 1: Makerbase, or is that something separate? That was a startup I 44 00:03:00,040 --> 00:03:02,240 Speaker 1: was doing right before this. He also writes a 45 00:03:02,240 --> 00:03:06,080 Speaker 1: monthly opinion column on the impact of technology on society 46 00:03:06,200 --> 00:03:09,839 Speaker 1: for Wired magazine. I know him as employee number one 47 00:03:10,040 --> 00:03:15,000 Speaker 1: at Six Apart, behind the seminal blogging software TypePad. 48 00:03:15,560 --> 00:03:17,760 Speaker 1: He also serves on the board of the Data and 49 00:03:17,800 --> 00:03:22,600 Speaker 1: Society Research Institute. Anil Dash, welcome to Bloomberg. Thanks so 50 00:03:22,680 --> 00:03:25,520 Speaker 1: much for having me here. So I mentioned TypePad; 51 00:03:25,560 --> 00:03:28,640 Speaker 1: I really have to jump right into that. I know 52 00:03:28,720 --> 00:03:32,120 Speaker 1: who you are because I actually was a beta tester 53 00:03:32,240 --> 00:03:34,280 Speaker 1: for TypePad, and that's where I rolled out... Thank 54 00:03:34,280 --> 00:03:36,480 Speaker 1: you for being one. That's where I rolled out the 55 00:03:36,520 --> 00:03:39,560 Speaker 1: Big Picture blog. Tell us about your role with the 56 00:03:39,600 --> 00:03:44,280 Speaker 1: company. Sure. So I started blogging back when there 57 00:03:44,320 --> 00:03:46,560 Speaker 1: were a couple dozen bloggers on the internet, and I thought... 58 00:03:46,840 --> 00:03:49,480 Speaker 1: What were you using to blog? I was manually doing it.
59 00:03:49,520 --> 00:03:52,240 Speaker 1: For the people who use Windows who remember an 60 00:03:52,240 --> 00:03:56,200 Speaker 1: app called Notepad: I was manually writing HTML, the 61 00:03:56,240 --> 00:04:00,160 Speaker 1: language of webpages, in Notepad and saving it onto a 62 00:04:00,200 --> 00:04:03,560 Speaker 1: web server by myself, which was the cutting edge of 63 00:04:03,560 --> 00:04:06,320 Speaker 1: technology back then. So I was using Yahoo 64 00:04:06,400 --> 00:04:09,040 Speaker 1: GeoCities, which you're telling me is more advanced. I 65 00:04:09,080 --> 00:04:12,600 Speaker 1: was just cutting and pasting content and then having to 66 00:04:12,640 --> 00:04:14,800 Speaker 1: wrap the HTML code around it. Right. So that was the 67 00:04:14,840 --> 00:04:16,640 Speaker 1: state of the art. And then in late ninety nine 68 00:04:16,720 --> 00:04:19,560 Speaker 1: a couple of the first blogging tools came along, and 69 00:04:19,600 --> 00:04:21,919 Speaker 1: so I started using those, like Blogger, 70 00:04:22,000 --> 00:04:24,480 Speaker 1: which Google later bought, and other things like that. And 71 00:04:24,600 --> 00:04:27,799 Speaker 1: friends shortly thereafter built this tool called Movable 72 00:04:27,800 --> 00:04:31,120 Speaker 1: Type, and it was the first sort of serious blogging platform. 73 00:04:31,160 --> 00:04:34,359 Speaker 1: Almost immediately people used it to build Gawker, Huffington Post. 74 00:04:34,480 --> 00:04:37,400 Speaker 1: Who was behind Movable Type? They were a then husband 75 00:04:37,400 --> 00:04:41,120 Speaker 1: and wife couple, Ben Trott and Mena Trott, 76 00:04:41,440 --> 00:04:43,839 Speaker 1: and really were sort of visionaries about the idea that 77 00:04:43,839 --> 00:04:45,640 Speaker 1: we should take this medium seriously. It was going to 78 00:04:45,720 --> 00:04:48,200 Speaker 1: be something big.
It wasn't just, you know, people 79 00:04:48,839 --> 00:04:51,000 Speaker 1: sharing their feelings in a journal online; this was 80 00:04:51,000 --> 00:04:53,200 Speaker 1: going to be where media was headed. And this is 81 00:04:53,279 --> 00:04:57,359 Speaker 1: pre Facebook, pre Twitter. Oh yeah, this is 82 00:04:57,400 --> 00:05:00,320 Speaker 1: way earlier. Yeah, before Friendster, if you even remember that one, 83 00:05:00,360 --> 00:05:03,799 Speaker 1: so way, way back there. And consumer internet was dead, 84 00:05:04,320 --> 00:05:06,000 Speaker 1: deader than dead, in two thousand one. I mean, you 85 00:05:06,040 --> 00:05:10,039 Speaker 1: couldn't, you know. We did a round that 86 00:05:10,120 --> 00:05:13,080 Speaker 1: was six hundred grand that we raised, and people were like, wow, 87 00:05:13,080 --> 00:05:14,520 Speaker 1: that's all the money in the world. You know, I 88 00:05:14,560 --> 00:05:16,400 Speaker 1: was here. I mean, it's like the catering 89 00:05:16,440 --> 00:05:19,320 Speaker 1: budget for a launch party for one of these startups today. 90 00:05:19,760 --> 00:05:22,120 Speaker 1: And so we did that, and what was 91 00:05:22,160 --> 00:05:25,279 Speaker 1: interesting was people almost immediately recognized this is important, this 92 00:05:25,320 --> 00:05:28,520 Speaker 1: is something new in media, there's something really powerful being enabled here. 93 00:05:28,520 --> 00:05:29,840 Speaker 1: And we thought it's still not easy enough; it was 94 00:05:29,880 --> 00:05:31,480 Speaker 1: still, you had to have a lot of technical knowledge. 95 00:05:32,000 --> 00:05:34,080 Speaker 1: And that was where TypePad came from. You know, I 96 00:05:34,120 --> 00:05:37,480 Speaker 1: think it's the predecessor to today.
You see the WordPresses 97 00:05:37,480 --> 00:05:40,120 Speaker 1: and Mediums and Tumblrs of the world, and all of 98 00:05:40,160 --> 00:05:41,720 Speaker 1: them, I think, would say, you know, TypePad is 99 00:05:41,720 --> 00:05:44,160 Speaker 1: a direct antecedent to that. And it was the first, 100 00:05:44,240 --> 00:05:46,120 Speaker 1: as far as I can recall, the first WYSIWYG 101 00:05:46,160 --> 00:05:49,080 Speaker 1: sort of blogging software. What you see is what 102 00:05:49,160 --> 00:05:51,920 Speaker 1: you get. It was like operating within a WordPerfect 103 00:05:52,040 --> 00:05:54,159 Speaker 1: or a Word doc. Since it was 104 00:05:54,200 --> 00:05:56,480 Speaker 1: seventeen years ago, I could say WordPerfect. Exactly, like 105 00:05:56,560 --> 00:05:59,080 Speaker 1: anyone knows what that is. For the old timers. But I think, 106 00:05:59,200 --> 00:06:01,200 Speaker 1: you know, the other thing really interesting was we were 107 00:06:01,960 --> 00:06:04,680 Speaker 1: experimenting with a lot of things that 108 00:06:04,680 --> 00:06:06,560 Speaker 1: today we take for granted in the consumer web. So just the 109 00:06:06,600 --> 00:06:08,960 Speaker 1: fact that we had a consumer pay model, those five 110 00:06:09,000 --> 00:06:10,400 Speaker 1: bucks a month, ten bucks a month. I mean, now 111 00:06:10,440 --> 00:06:12,960 Speaker 1: you know, everybody's paying for Spotify and Netflix and it's nothing. Right, 112 00:06:13,000 --> 00:06:15,919 Speaker 1: nobody was really thinking in terms of a recurring revenue stream, 113 00:06:15,920 --> 00:06:18,279 Speaker 1: a monthly, yeah, hit to a credit card. Yeah. The 114 00:06:18,320 --> 00:06:20,400 Speaker 1: idea that we were even gonna ask people to 115 00:06:20,400 --> 00:06:22,320 Speaker 1: do that was radical.
I mean, this stuff sounds 116 00:06:22,480 --> 00:06:24,800 Speaker 1: ridiculous because we all do it now, but, you know, 117 00:06:25,120 --> 00:06:28,160 Speaker 1: this was sixteen years ago, fifteen years ago; people 118 00:06:28,200 --> 00:06:30,719 Speaker 1: were really, really taken aback by that. The other thing 119 00:06:30,760 --> 00:06:34,840 Speaker 1: that I think in retrospect was a bigger deal, a huge deal, 120 00:06:34,880 --> 00:06:36,440 Speaker 1: and we sort of just took it for granted, was 121 00:06:36,880 --> 00:06:39,400 Speaker 1: we were probably the first consumer product to use Amazon 122 00:06:39,440 --> 00:06:42,800 Speaker 1: Web Services on the back end. So we had 123 00:06:42,839 --> 00:06:44,960 Speaker 1: some features that would let you list books you were 124 00:06:45,000 --> 00:06:46,560 Speaker 1: reading and music you were listening to, and they would 125 00:06:46,600 --> 00:06:49,000 Speaker 1: talk to Amazon and link up, you know, buy my 126 00:06:49,040 --> 00:06:51,719 Speaker 1: book, or buy this, back then, CD. 127 00:06:52,440 --> 00:06:56,000 Speaker 1: The interesting thing about that is some of us 128 00:06:56,000 --> 00:06:58,800 Speaker 1: still buy CDs. Just every once in a great while, 129 00:06:58,920 --> 00:07:02,480 Speaker 1: I will, say, for collecting purposes. That's right. And 130 00:07:02,480 --> 00:07:04,440 Speaker 1: so the interesting thing was that 131 00:07:04,839 --> 00:07:11,240 Speaker 1: we had this app talking to web services from Amazon 132 00:07:11,280 --> 00:07:13,760 Speaker 1: years and years before, I think, that became commonplace. 133 00:07:13,800 --> 00:07:15,640 Speaker 1: And then the third thing that TypePad did, 134 00:07:15,720 --> 00:07:19,320 Speaker 1: I think nobody really remembers. So, you know, 135 00:07:19,360 --> 00:07:21,600 Speaker 1: when the iPhone launched, you didn't have an App Store.
You know, 136 00:07:21,640 --> 00:07:23,800 Speaker 1: Steve Jobs was like, you've gotta use web apps; he was against it originally, 137 00:07:23,840 --> 00:07:26,240 Speaker 1: and then a year later they said, okay, 138 00:07:26,240 --> 00:07:29,120 Speaker 1: we're gonna introduce the App Store. At that initial rollout, 139 00:07:29,280 --> 00:07:31,239 Speaker 1: on stage with Jobs, we had our head of product 140 00:07:31,280 --> 00:07:34,400 Speaker 1: for TypePad showing TypePad taking photos on your phone, 141 00:07:34,760 --> 00:07:36,920 Speaker 1: uploading them to the web from your iPhone, you know, 142 00:07:36,960 --> 00:07:39,000 Speaker 1: in real time. This was, you know, 143 00:07:39,040 --> 00:07:42,440 Speaker 1: years before Instagram took off. At the very first 144 00:07:42,440 --> 00:07:44,320 Speaker 1: initial launch of the App Store, TypePad was one 145 00:07:44,360 --> 00:07:46,600 Speaker 1: of the first apps they ever featured. And TypePad 146 00:07:46,720 --> 00:07:49,440 Speaker 1: is still around, still supported, still doing this. They 147 00:07:50,080 --> 00:07:51,920 Speaker 1: changed ownership a bunch of times. I literally 148 00:07:51,920 --> 00:07:54,160 Speaker 1: haven't even followed, because I've been so busy with other stuff 149 00:07:54,160 --> 00:07:55,520 Speaker 1: for the last several years. But I mean, I 150 00:07:55,560 --> 00:07:58,120 Speaker 1: know the old Big Picture, which I did not take 151 00:07:58,160 --> 00:08:00,840 Speaker 1: down when I moved to WordPress in oh eight. 152 00:08:00,840 --> 00:08:03,280 Speaker 1: I picked September oh eight as the perfect time, in the 153 00:08:03,280 --> 00:08:05,800 Speaker 1: middle of the financial crisis. It was like six months 154 00:08:05,800 --> 00:08:09,920 Speaker 1: in advance, and then the world blew up.
I kept 155 00:08:09,960 --> 00:08:12,280 Speaker 1: both that, and I had a fun one, Essays and 156 00:08:12,360 --> 00:08:16,640 Speaker 1: Effluvia, for just random effluvia. And they're still 157 00:08:16,720 --> 00:08:19,040 Speaker 1: up and running. At one point in time 158 00:08:19,040 --> 00:08:21,400 Speaker 1: it was fifty percent of the traffic of the WordPress one, 159 00:08:21,760 --> 00:08:24,400 Speaker 1: and then it gradually, you know, as people moved 160 00:08:24,480 --> 00:08:28,200 Speaker 1: their bookmarks and RSS readers, which no longer exist. Yeah, there's 161 00:08:28,200 --> 00:08:29,520 Speaker 1: a lot. I mean, there's a lot of technology that's 162 00:08:29,520 --> 00:08:31,280 Speaker 1: sort of come and gone, and I think some 163 00:08:31,320 --> 00:08:32,959 Speaker 1: of these things will come back around again. But it's 164 00:08:32,960 --> 00:08:35,800 Speaker 1: been an interesting thing to watch. A lot of 165 00:08:35,800 --> 00:08:38,400 Speaker 1: these ideas we thought were really radical around, oh, you 166 00:08:38,480 --> 00:08:40,760 Speaker 1: might take a picture on the camera on your phone 167 00:08:40,880 --> 00:08:43,480 Speaker 1: and then share it with people instantly, and you might store 168 00:08:43,480 --> 00:08:46,320 Speaker 1: that on Amazon Web Services, and you might use a 169 00:08:46,360 --> 00:08:49,960 Speaker 1: service that's got a consumer subscription model as the way 170 00:08:50,000 --> 00:08:52,760 Speaker 1: of accessing it. All those things, you know, we 171 00:08:52,760 --> 00:08:55,320 Speaker 1: were first on some of them, 172 00:08:55,360 --> 00:08:57,240 Speaker 1: but we weren't, you know, the only ones doing them.
173 00:08:57,280 --> 00:09:00,439 Speaker 1: I think we certainly popularized them, and I think 174 00:09:00,480 --> 00:09:02,320 Speaker 1: we bet that that was the way the Internet was headed, 175 00:09:02,559 --> 00:09:04,600 Speaker 1: and some of those things were just as important 176 00:09:04,600 --> 00:09:06,280 Speaker 1: as the idea that blogs were going to be media 177 00:09:06,360 --> 00:09:08,200 Speaker 1: that mattered. Here's a quote of yours that I like: 178 00:09:08,320 --> 00:09:11,720 Speaker 1: any form of electronic communication will first be dismissed as 179 00:09:11,720 --> 00:09:15,720 Speaker 1: trivial and worthless until it produces a profound result, after 180 00:09:15,760 --> 00:09:18,680 Speaker 1: which it will be described as obvious and boring. Yeah, 181 00:09:18,840 --> 00:09:22,160 Speaker 1: that's been my experience, for sure. You've been blogging since when? 182 00:09:24,320 --> 00:09:27,400 Speaker 1: Was this the plan all along, to keep it up? You know, 183 00:09:27,440 --> 00:09:29,960 Speaker 1: it was really one of those things: I was underemployed, not 184 00:09:30,000 --> 00:09:32,000 Speaker 1: real happy with my place in life, and thought, I'm 185 00:09:32,000 --> 00:09:33,880 Speaker 1: gonna have a creative outlet to put stuff out there. 186 00:09:33,920 --> 00:09:36,320 Speaker 1: And I can't sing, and I can't, you know, 187 00:09:36,360 --> 00:09:38,679 Speaker 1: I can't do a radio show, so let me do this. 188 00:09:38,920 --> 00:09:41,960 Speaker 1: And it was an interesting thing to sort 189 00:09:42,000 --> 00:09:43,959 Speaker 1: of just have this outlet, and then realize this media 190 00:09:44,080 --> 00:09:46,120 Speaker 1: mattered enough and that I had been early. You know, 191 00:09:46,160 --> 00:09:48,760 Speaker 1: I thought I was late, starting in ninety nine. That's funny, 192 00:09:48,800 --> 00:09:51,800 Speaker 1: because when I started on TypePad.
It was oh 193 00:09:51,800 --> 00:09:55,480 Speaker 1: three, I even remember, July, and it was 194 00:09:55,600 --> 00:10:01,640 Speaker 1: essentially a handful of college economics professors; nobody 195 00:10:01,679 --> 00:10:05,360 Speaker 1: really writing about markets, nobody really writing about investing. And 196 00:10:05,400 --> 00:10:08,120 Speaker 1: my experience was similar to yours, in 197 00:10:08,160 --> 00:10:10,760 Speaker 1: that I'm not thrilled with what I'm doing; I'm kinda, 198 00:10:11,360 --> 00:10:13,800 Speaker 1: you know, looking for a creative outlet. But it was 199 00:10:13,880 --> 00:10:16,520 Speaker 1: clear that the mainstream media, they'd take some young kid 200 00:10:16,559 --> 00:10:19,160 Speaker 1: out of school and throw them at it. Nobody wants to 201 00:10:19,160 --> 00:10:22,400 Speaker 1: cover the Fed, nobody wants to do economics back then, 202 00:10:22,720 --> 00:10:29,239 Speaker 1: and they were frequently, if not wrong, then lacking context. 203 00:10:29,640 --> 00:10:31,440 Speaker 1: They weren't deep enough in it. Right. And 204 00:10:31,480 --> 00:10:32,840 Speaker 1: I think that was the thing, is that the 205 00:10:32,880 --> 00:10:34,360 Speaker 1: people writing the blogs were the people who were like, 206 00:10:34,400 --> 00:10:36,440 Speaker 1: I'm obsessing over this whether there's an audience or not. 207 00:10:36,600 --> 00:10:39,320 Speaker 1: Nobody cares; you know, my view was, who 208 00:10:39,320 --> 00:10:41,280 Speaker 1: cares if there's an audience? There was a whole run 209 00:10:41,320 --> 00:10:45,400 Speaker 1: of expert amateurs. Is that a real thing? 210 00:10:45,520 --> 00:10:47,120 Speaker 1: I don't know, but I know exactly what you mean.
211 00:10:47,120 --> 00:10:49,800 Speaker 1: And I think that summer, too, was a particularly interesting time, 212 00:10:49,800 --> 00:10:52,120 Speaker 1: the summer of oh three, because we had folks coming up 213 00:10:52,160 --> 00:10:56,200 Speaker 1: who were covering aspects of economics that people weren't covering. 214 00:10:56,240 --> 00:10:58,199 Speaker 1: We brought some of the very first, well, really, I 215 00:10:58,200 --> 00:11:00,200 Speaker 1: think all of the first VCs that started blogging, and they came 216 00:11:00,200 --> 00:11:02,160 Speaker 1: out on TypePad. We had Fred Wilson, we 217 00:11:02,200 --> 00:11:05,520 Speaker 1: had, yeah, David Hornick from August Capital, a bunch of 218 00:11:05,520 --> 00:11:09,560 Speaker 1: these folks, Joi Ito, who's now heading up the MIT Media Lab, 219 00:11:10,120 --> 00:11:13,280 Speaker 1: and, you know, all these folks, Reid Hoffman, 220 00:11:13,800 --> 00:11:18,480 Speaker 1: you know, who now has a podcast. Exactly, the CEO of 221 00:11:18,480 --> 00:11:21,960 Speaker 1: LinkedIn. And so all these people, 222 00:11:22,160 --> 00:11:23,920 Speaker 1: you know, I knew them as investors, and I was 223 00:11:23,960 --> 00:11:27,040 Speaker 1: not at all fluent in VC. You know, I 224 00:11:27,120 --> 00:11:29,640 Speaker 1: knew tech, but at that time you could actually build 225 00:11:29,640 --> 00:11:34,000 Speaker 1: an entire career writing software and never cross paths with VCs. Really? Yeah, 226 00:11:34,120 --> 00:11:35,880 Speaker 1: you know, it was pretty common. Because 227 00:11:35,880 --> 00:11:37,840 Speaker 1: it was a lot of big companies and mom-and-pop 228 00:11:37,840 --> 00:11:39,800 Speaker 1: shops that were building software and apps, and they would 229 00:11:39,800 --> 00:11:42,560 Speaker 1: sort of grow organically out of that.
And 230 00:11:42,600 --> 00:11:44,560 Speaker 1: if you weren't in the Valley in particular, you know 231 00:11:44,600 --> 00:11:46,360 Speaker 1: (I was in New York), you could easily, you could 232 00:11:46,360 --> 00:11:48,560 Speaker 1: just never cross paths. And so to read these blogs 233 00:11:48,600 --> 00:11:50,720 Speaker 1: was like opening up this whole new world. And now 234 00:11:50,720 --> 00:11:53,400 Speaker 1: I understand, like, you know, to some degree, it 235 00:11:53,520 --> 00:11:57,080 Speaker 1: was content marketing; they were marketing their firms by 236 00:11:57,360 --> 00:12:01,520 Speaker 1: being better storytellers. I think that's what it's evolved to, right? 237 00:12:01,600 --> 00:12:03,839 Speaker 1: I think that's what it's evolved to, the same thing 238 00:12:03,880 --> 00:12:06,040 Speaker 1: with the freemium model. Here was a whole bunch of 239 00:12:06,040 --> 00:12:07,760 Speaker 1: free content, and by the way, if you want to 240 00:12:08,240 --> 00:12:11,200 Speaker 1: work with us, you can. But that's not, I don't 241 00:12:11,200 --> 00:12:13,920 Speaker 1: think that was anyone's intent. It truly wasn't my intention: 242 00:12:14,080 --> 00:12:16,800 Speaker 1: here's what I'll do, I'm gonna blog in obscurity for 243 00:12:16,880 --> 00:12:20,720 Speaker 1: twelve years and then have a radio show. Well, it's 244 00:12:20,720 --> 00:12:22,520 Speaker 1: so interesting, because I had that, where...
You know, one 245 00:12:22,520 --> 00:12:24,960 Speaker 1: of the people I connected with back then was 246 00:12:25,040 --> 00:12:27,880 Speaker 1: Joel Spolsky, who was writing a blog called Joel on Software, 247 00:12:28,240 --> 00:12:31,240 Speaker 1: and he was really the first person to write about 248 00:12:31,320 --> 00:12:35,160 Speaker 1: sort of coding culture, technical culture. And it was one 249 00:12:35,200 --> 00:12:36,920 Speaker 1: of those things, those of us who were coding for 250 00:12:36,920 --> 00:12:38,800 Speaker 1: a living, which is what I did back then, I 251 00:12:38,920 --> 00:12:41,080 Speaker 1: knew these things, but had never seen them addressed as, 252 00:12:41,120 --> 00:12:42,920 Speaker 1: like, you know, people with a 253 00:12:42,960 --> 00:12:46,800 Speaker 1: place in society, and he was taking programmers very seriously. 254 00:12:46,960 --> 00:12:48,400 Speaker 1: Part of the reason he was blogging then was he was 255 00:12:48,440 --> 00:12:50,720 Speaker 1: explaining the ideas behind the new company he had just created, 256 00:12:50,720 --> 00:12:54,200 Speaker 1: which was Fog Creek Software. And so seventeen years later, 257 00:12:54,480 --> 00:12:57,200 Speaker 1: well, sixteen years from that point, 258 00:12:57,360 --> 00:13:00,360 Speaker 1: I took over as CEO of that company. It 259 00:13:00,520 --> 00:13:03,160 Speaker 1: felt like I knew everything about how this company ran 260 00:13:03,240 --> 00:13:05,080 Speaker 1: and what its values were, and what its purpose was 261 00:13:05,120 --> 00:13:07,959 Speaker 1: in the world, because I had from day one followed 262 00:13:08,000 --> 00:13:10,200 Speaker 1: along with the storytelling. And you think he was not 263 00:13:10,320 --> 00:13:13,000 Speaker 1: content marketing? He was not. He didn't start writing this stuff.
264 00:13:13,000 --> 00:13:14,319 Speaker 1: He was literally, like, trying to solve problems, like, 265 00:13:14,320 --> 00:13:16,240 Speaker 1: how do you deal with real estate in Manhattan? How 266 00:13:16,280 --> 00:13:18,000 Speaker 1: do you keep programmers happy in a 267 00:13:18,040 --> 00:13:20,240 Speaker 1: world where, you know, back then, hard as it is 268 00:13:20,240 --> 00:13:23,040 Speaker 1: to believe, programmers were treated sort of like dirt? They 269 00:13:23,040 --> 00:13:24,400 Speaker 1: were like, this guy in the back, we're going to 270 00:13:24,440 --> 00:13:25,920 Speaker 1: get him a computer and lock him in the dark 271 00:13:25,960 --> 00:13:30,760 Speaker 1: and... Exactly right, and throw caffeine at him. And 272 00:13:30,840 --> 00:13:32,840 Speaker 1: now, you know, you think, well, I mean, there are 273 00:13:32,840 --> 00:13:35,680 Speaker 1: probably no more spoiled workers in the world in terms 274 00:13:35,720 --> 00:13:38,240 Speaker 1: of, like, the free massages and the trays of candy. 275 00:13:38,280 --> 00:13:42,320 Speaker 1: And you mentioned that Google lifted the idea of all 276 00:13:42,360 --> 00:13:44,880 Speaker 1: the free food. Yeah, and this is one of 277 00:13:44,920 --> 00:13:47,160 Speaker 1: those things where, like, things become urban legend, and so 278 00:13:47,200 --> 00:13:49,280 Speaker 1: it's hard to separate fact from fiction. But from what I've 279 00:13:49,280 --> 00:13:52,400 Speaker 1: heard, one thing that is, you know, definitely true 280 00:13:52,800 --> 00:13:55,000 Speaker 1: is Fog Creek was one of the first companies to 281 00:13:55,040 --> 00:13:57,360 Speaker 1: do the, like, let's really spoil our coders and our 282 00:13:57,400 --> 00:14:00,600 Speaker 1: technical workers. And it was everything from great free lunches 283 00:14:00,840 --> 00:14:03,760 Speaker 1: to...
One of the things I still spend a 284 00:14:03,760 --> 00:14:06,840 Speaker 1: lot of time working on: completely end-to-end healthcare. 285 00:14:07,160 --> 00:14:10,120 Speaker 1: Nobody on our team pays anything for healthcare. In fact, 286 00:14:10,120 --> 00:14:11,320 Speaker 1: we have people who have been at the company a 287 00:14:11,320 --> 00:14:13,200 Speaker 1: long time, or they came right out of school, and they 288 00:14:13,200 --> 00:14:15,600 Speaker 1: didn't know what in-network, out-of-network means because 289 00:14:15,600 --> 00:14:17,360 Speaker 1: they never had to deal with it. And 290 00:14:17,360 --> 00:14:18,920 Speaker 1: I was like, that's great, that's what we want 291 00:14:19,000 --> 00:14:21,160 Speaker 1: people to feel. And then one of those things was 292 00:14:21,320 --> 00:14:24,720 Speaker 1: really great free catered lunches. In retrospect, I think some 293 00:14:24,720 --> 00:14:26,840 Speaker 1: of this was, the company was always, like, downtown Manhattan, 294 00:14:26,960 --> 00:14:29,560 Speaker 1: and especially back then, you know, just post nine eleven, 295 00:14:29,840 --> 00:14:32,400 Speaker 1: there weren't, like, good restaurants there, you know. 296 00:14:32,440 --> 00:14:34,520 Speaker 1: And also, especially if you're working late, like, 297 00:14:34,800 --> 00:14:37,280 Speaker 1: after five p.m., that neighborhood would 298 00:14:37,280 --> 00:14:38,840 Speaker 1: just be dead. So how do you get food 299 00:14:38,880 --> 00:14:41,200 Speaker 1: down there? So they had caterers, right, and they had 300 00:14:41,200 --> 00:14:45,440 Speaker 1: caterers come in. How many employees? So now the company is 301 00:14:45,600 --> 00:14:48,440 Speaker 1: about forty. It's still pretty small.
It's kind 302 00:14:48,440 --> 00:14:50,680 Speaker 1: of going up and down in size because, over 303 00:14:50,680 --> 00:14:52,720 Speaker 1: the years, like, they co-created Stack Overflow and they 304 00:14:52,720 --> 00:14:55,280 Speaker 1: created Trello, and as those products grow, they get pretty 305 00:14:55,280 --> 00:14:57,560 Speaker 1: big, and then they would spin them out. You know, 306 00:14:57,600 --> 00:14:59,400 Speaker 1: Trello spun out a couple of years ago and sold 307 00:14:59,400 --> 00:15:03,320 Speaker 1: to Atlassian back in January for... Let me 308 00:15:03,320 --> 00:15:05,280 Speaker 1: reel you back into the blogosphere and talk 309 00:15:05,560 --> 00:15:09,560 Speaker 1: about one of my favorite subjects, which is 310 00:15:09,880 --> 00:15:13,200 Speaker 1: a blog post of yours called Don't Read the 311 00:15:13,200 --> 00:15:17,960 Speaker 1: Comments. Now, I wrote Bailout Nation on the blog. I 312 00:15:17,960 --> 00:15:20,280 Speaker 1: would put up a few hundred words and I would 313 00:15:20,280 --> 00:15:22,240 Speaker 1: get feedback, have you seen this story, and look 314 00:15:22,280 --> 00:15:27,400 Speaker 1: at this link, and the readers, the community, was astonishing. 315 00:15:27,960 --> 00:15:30,600 Speaker 1: And then, tragedy of the comments: it just gets overrun 316 00:15:30,680 --> 00:15:34,480 Speaker 1: with the spammers and trolls, and it became so time 317 00:15:34,520 --> 00:15:38,480 Speaker 1: consuming to stay on top of comments that finally I 318 00:15:38,520 --> 00:15:41,160 Speaker 1: had to just grit my teeth and rip the Band-Aid 319 00:15:41,160 --> 00:15:43,680 Speaker 1: off and close comments, and I was not happy 320 00:15:43,720 --> 00:15:46,960 Speaker 1: about doing it, but it just became this giant time suck. 321 00:15:47,560 --> 00:15:49,440 Speaker 1: What is the problem?
And by the way, that's true 322 00:15:49,440 --> 00:15:52,120 Speaker 1: of my columns at the Washington Post and Bloomberg as well. 323 00:15:52,480 --> 00:15:55,600 Speaker 1: It's like, if you want to say something, go someplace else. 324 00:15:55,640 --> 00:15:57,720 Speaker 1: Here's a link to Twitter, here's a link to Facebook, 325 00:15:57,920 --> 00:16:01,880 Speaker 1: where hopefully you aren't doing this anonymously, and you could 326 00:16:01,880 --> 00:16:04,760 Speaker 1: be called out, because eventually I could subpoena Twitter and 327 00:16:04,760 --> 00:16:07,440 Speaker 1: find out who you are if you say something. Although 328 00:16:07,440 --> 00:16:09,960 Speaker 1: Twitter has done a good job of protecting people's anonymity. 329 00:16:10,080 --> 00:16:13,280 Speaker 1: They've done well at protecting anonymity, yes, but 330 00:16:13,280 --> 00:16:15,720 Speaker 1: not at abuse at all. Right. In fact, 331 00:16:16,040 --> 00:16:19,560 Speaker 1: my argument as to why Twitter's stock price 332 00:16:19,760 --> 00:16:21,960 Speaker 1: has been in the crapper all this time is they 333 00:16:22,000 --> 00:16:25,440 Speaker 1: have not created an encouraging community. Totally agree. So there are 334 00:16:25,440 --> 00:16:26,840 Speaker 1: a couple things I would sort of break out here. 335 00:16:27,200 --> 00:16:30,360 Speaker 1: You know, the first is, I got to watch 336 00:16:30,400 --> 00:16:33,000 Speaker 1: people creating the first comment systems on the internet. You know, 337 00:16:33,000 --> 00:16:34,720 Speaker 1: there was a time when you couldn't actually comment on 338 00:16:34,760 --> 00:16:39,040 Speaker 1: a web page. And, interestingly, the challenge 339 00:16:39,080 --> 00:16:40,680 Speaker 1: then was whether people would type in this box 340 00:16:40,720 --> 00:16:42,760 Speaker 1: at all.
Will they even understand that you can 341 00:16:42,840 --> 00:16:44,360 Speaker 1: leave a comment and it will show up on the page, 342 00:16:44,360 --> 00:16:46,280 Speaker 1: and that's a thing you can do? So they were 343 00:16:46,760 --> 00:16:49,520 Speaker 1: hyper-optimized for, at any cost, 344 00:16:49,520 --> 00:16:51,960 Speaker 1: we don't want to put any barriers to somebody 345 00:16:51,960 --> 00:16:54,800 Speaker 1: typing in this box and leaving a comment. Of 346 00:16:54,840 --> 00:16:57,840 Speaker 1: course, that pretty quickly became: well, we made it really, 347 00:16:57,840 --> 00:16:59,600 Speaker 1: really easy, and now there are no barriers and there's no 348 00:16:59,640 --> 00:17:03,400 Speaker 1: standard. And, you know, I wish we had understood at 349 00:17:03,440 --> 00:17:04,920 Speaker 1: the time, and I will say this, this is something 350 00:17:05,000 --> 00:17:08,520 Speaker 1: I personally didn't get for a long time: you 351 00:17:08,800 --> 00:17:11,280 Speaker 1: make a real community by introducing a set of rules 352 00:17:11,280 --> 00:17:13,359 Speaker 1: that everybody understands. This is true in the physical world. 353 00:17:13,600 --> 00:17:15,199 Speaker 1: We get it very intuitively. If you go to a 354 00:17:15,200 --> 00:17:17,439 Speaker 1: park and it's got the, you know, you can't be 355 00:17:17,440 --> 00:17:19,479 Speaker 1: throwing your ball around here because you're gonna hit these 356 00:17:19,480 --> 00:17:21,440 Speaker 1: little kids that are over here, or there's a dog running, 357 00:17:21,440 --> 00:17:23,200 Speaker 1: you gotta keep your dog inside the fence. Whatever the 358 00:17:23,280 --> 00:17:25,760 Speaker 1: rules are, it makes the place work for everybody, and 359 00:17:25,800 --> 00:17:27,280 Speaker 1: it doesn't have to be onerous. It doesn't have to 360 00:17:27,280 --> 00:17:29,439 Speaker 1: be a burden.
We didn't do any of that in 361 00:17:29,440 --> 00:17:32,479 Speaker 1: creating these online systems. And in fact, a lot of 362 00:17:32,520 --> 00:17:34,639 Speaker 1: the people, and this is something it wasn't as much me, 363 00:17:34,720 --> 00:17:37,160 Speaker 1: but people that I saw who were creating other tools 364 00:17:37,680 --> 00:17:40,480 Speaker 1: that made the systems for commenting and feedback online were 365 00:17:41,080 --> 00:17:43,480 Speaker 1: total zealots about the fact there shouldn't be any rules. 366 00:17:44,000 --> 00:17:46,639 Speaker 1: It should be Wild West all the time. And you 367 00:17:46,680 --> 00:17:48,720 Speaker 1: know, Wild West is basically, well, if you have the Wild 368 00:17:48,720 --> 00:17:51,200 Speaker 1: West and there's no cops, guess who runs the show? 369 00:17:52,359 --> 00:17:55,840 Speaker 1: The outlaws. And in fact, I 370 00:17:56,320 --> 00:18:00,000 Speaker 1: went through a two-step process. The first step was saying, 371 00:18:00,680 --> 00:18:03,439 Speaker 1: all right, we're gonna do moderated comments, and in 372 00:18:03,560 --> 00:18:06,080 Speaker 1: order to do that, I'm gonna set up some rules which I'm 373 00:18:06,119 --> 00:18:09,280 Speaker 1: not going to share, so you're not gonna know, you 374 00:18:09,320 --> 00:18:12,480 Speaker 1: won't be able to game the system, just 375 00:18:12,480 --> 00:18:14,720 Speaker 1: to get past step one. And then step two 376 00:18:14,800 --> 00:18:18,640 Speaker 1: is no ad hominem attacks, no false, no fake 377 00:18:18,720 --> 00:18:21,919 Speaker 1: news. It's amazing that one simple sentence of 378 00:18:21,960 --> 00:18:28,480 Speaker 1: nonsense can undercut a deeply researched, intelligently debated conversation. 379 00:18:29,160 --> 00:18:31,960 Speaker 1: And I wanted back and forth.
I wanted debate, 380 00:18:32,200 --> 00:18:35,960 Speaker 1: but what I didn't want was just people derailing things 381 00:18:36,000 --> 00:18:39,919 Speaker 1: with nonsense. So we've had Trump talk about fake news. 382 00:18:40,440 --> 00:18:43,880 Speaker 1: Fake news, some of which is confirmation bias and tribalism, 383 00:18:44,280 --> 00:18:48,920 Speaker 1: and some of which is just people purposefully trying to disrupt. Yeah. Yeah. 384 00:18:48,960 --> 00:18:52,200 Speaker 1: And there are a couple of guides online which I've 385 00:18:52,240 --> 00:18:55,679 Speaker 1: posted on the blog; it's the 386 00:18:55,680 --> 00:18:59,800 Speaker 1: guide to disrupting social forums. They're actually out there, 387 00:18:59,840 --> 00:19:02,400 Speaker 1: and they're out of the CIA black book on how 388 00:19:02,440 --> 00:19:06,560 Speaker 1: to go disrupt societies. Right, it's misinformation as a tactic. 389 00:19:06,920 --> 00:19:09,200 Speaker 1: And so there's an interesting thing here where 390 00:19:09,200 --> 00:19:12,399 Speaker 1: we've had this escalation. So first we had, you 391 00:19:12,440 --> 00:19:14,600 Speaker 1: know, the Internet always had ordinary trolls, right, just 392 00:19:14,640 --> 00:19:17,280 Speaker 1: people saying, like, because I can hide behind 393 00:19:17,359 --> 00:19:21,760 Speaker 1: my anonymity or relative anonymity, I can be transgressive in 394 00:19:21,760 --> 00:19:23,959 Speaker 1: this way where I'm just being the person who's, like, 395 00:19:24,080 --> 00:19:26,800 Speaker 1: you know, ruining things for people, but in a 396 00:19:26,840 --> 00:19:29,479 Speaker 1: way that I think they think is entertaining. And 397 00:19:29,520 --> 00:19:31,639 Speaker 1: that's sort of, like, it's annoying, but it's not a 398 00:19:31,640 --> 00:19:33,720 Speaker 1: big deal, and if you can ban them, that's fine.
399 00:19:34,800 --> 00:19:38,760 Speaker 1: The evolution of that into organized communities that are trying 400 00:19:38,760 --> 00:19:41,919 Speaker 1: to undermine other communities, or undermine individual people who are 401 00:19:41,920 --> 00:19:47,240 Speaker 1: trying to communicate, by misinformation, by personal attacks, by threats, 402 00:19:47,240 --> 00:19:50,119 Speaker 1: by all these sorts of tactics. I think people 403 00:19:50,320 --> 00:19:53,080 Speaker 1: don't understand that these have gotten very organized and very structured. 404 00:19:53,119 --> 00:19:54,840 Speaker 1: As you said, there are guides to how to do it, 405 00:19:55,240 --> 00:19:56,879 Speaker 1: and so people say, well, why don't you just ignore 406 00:19:56,880 --> 00:20:00,240 Speaker 1: it, you know? Yeah, yeah, totally. 407 00:20:00,280 --> 00:20:02,280 Speaker 1: I mean, whether it's, you know, Gamergate 408 00:20:02,320 --> 00:20:04,479 Speaker 1: or the men's rights activists, like, there 409 00:20:04,480 --> 00:20:06,280 Speaker 1: are a lot of communities that were doing this online. 410 00:20:06,320 --> 00:20:08,920 Speaker 1: And the interesting thing about this is, one, they 411 00:20:09,040 --> 00:20:11,280 Speaker 1: learned from each other. They've been evolving this over ten 412 00:20:11,359 --> 00:20:14,199 Speaker 1: or fifteen years. They are very organized, but they 413 00:20:14,320 --> 00:20:16,679 Speaker 1: organized in places that people aren't watching. So they're happening 414 00:20:16,720 --> 00:20:20,040 Speaker 1: on, you know, some obscure chan. I mean, there's the 415 00:20:20,040 --> 00:20:21,800 Speaker 1: ones that people know, but there's also just private channels. 416 00:20:22,160 --> 00:20:24,040 Speaker 1: You know, they can text each other like anybody else. 417 00:20:24,040 --> 00:20:26,080 Speaker 1: Like, they're not, you know... they're very technically literate.
418 00:20:26,760 --> 00:20:29,399 Speaker 1: And so what happens is people who are not sophisticated about 419 00:20:29,400 --> 00:20:30,960 Speaker 1: these things think, well, okay, look, if this person is 420 00:20:31,000 --> 00:20:32,960 Speaker 1: being a jerk to you online, why don't you just not 421 00:20:33,000 --> 00:20:34,800 Speaker 1: go on Twitter? Or why don't you just, you know, 422 00:20:34,880 --> 00:20:36,719 Speaker 1: not worry about it, delete the comment, that kind of thing. 423 00:20:36,720 --> 00:20:38,800 Speaker 1: And it's like, well, the difference is, when you have 424 00:20:38,840 --> 00:20:43,480 Speaker 1: people organizing to try to undermine a specific community, just 425 00:20:43,560 --> 00:20:47,280 Speaker 1: ignoring it doesn't make it go away. And the 426 00:20:47,280 --> 00:20:49,479 Speaker 1: social cost of, for example, saying just get off 427 00:20:49,480 --> 00:20:51,440 Speaker 1: of social media, or don't go on Twitter, don't go wherever, 428 00:20:51,720 --> 00:20:53,800 Speaker 1: it's like, well, you know, if I work in media, 429 00:20:53,840 --> 00:20:55,720 Speaker 1: I work in publishing, I work in tech, it's a 430 00:20:55,720 --> 00:20:57,720 Speaker 1: business tool. I need that. I need to have a 431 00:20:57,720 --> 00:21:01,480 Speaker 1: presence there. Translate the digital version of that into meatspace. 432 00:21:01,520 --> 00:21:03,919 Speaker 1: It's sort of like saying don't go to the town square, 433 00:21:04,520 --> 00:21:06,680 Speaker 1: don't go to the school, don't go to the theater, 434 00:21:06,960 --> 00:21:09,520 Speaker 1: and it's ridiculous. Exactly right. Then you take, like, 435 00:21:09,600 --> 00:21:11,520 Speaker 1: you know, YouTube comments. I think they're getting a little better, 436 00:21:11,560 --> 00:21:14,240 Speaker 1: but notoriously terrible.
And if you said, I'm gonna go 437 00:21:14,280 --> 00:21:17,560 Speaker 1: to Google's lobby in Mountain View and I'm going to 438 00:21:17,640 --> 00:21:20,040 Speaker 1: start shouting epithets at people, they're not gonna be like, 439 00:21:20,040 --> 00:21:22,080 Speaker 1: have a seat, have some of our free lunch. They're 440 00:21:22,080 --> 00:21:24,400 Speaker 1: gonna be like, no, you gotta stop doing that, we're 441 00:21:24,400 --> 00:21:27,560 Speaker 1: gonna kick you out. My explanation of it was, 442 00:21:27,600 --> 00:21:29,399 Speaker 1: if you have a cocktail party, A, you get to 443 00:21:29,440 --> 00:21:32,720 Speaker 1: invite who comes, and B, if someone starts flipping over tables, 444 00:21:32,720 --> 00:21:35,240 Speaker 1: you get to throw them out. 445 00:21:35,280 --> 00:21:38,280 Speaker 1: The thing that makes me incensed about this is that 446 00:21:38,359 --> 00:21:42,159 Speaker 1: they scream First Amendment. And my response is always, nothing 447 00:21:42,280 --> 00:21:45,520 Speaker 1: is stopping you from going and starting your own blog 448 00:21:46,160 --> 00:21:48,760 Speaker 1: and laboriously building it, other than the fact that you're 449 00:21:48,880 --> 00:21:52,399 Speaker 1: lazy and untalented. Other than that, yeah, 450 00:21:52,400 --> 00:21:54,360 Speaker 1: feel free. They don't want to do that. They're 451 00:21:54,400 --> 00:21:57,360 Speaker 1: not trying to create something, they're trying to destroy something. Exactly. 452 00:21:57,480 --> 00:21:59,600 Speaker 1: And I think there's a really 453 00:21:59,600 --> 00:22:03,720 Speaker 1: tough line where, you know, you can express yourself 454 00:22:03,760 --> 00:22:07,359 Speaker 1: however you want, but you're not entitled to destroy my platform 455 00:22:07,440 --> 00:22:10,239 Speaker 1: and my community and my back and forth.
Yeah. And 456 00:22:10,240 --> 00:22:13,200 Speaker 1: it's an interesting thing, because we, 457 00:22:14,119 --> 00:22:16,960 Speaker 1: I think especially technical people, have this desire to say, 458 00:22:16,960 --> 00:22:19,359 Speaker 1: I want one set of rules that applies to everybody, 459 00:22:19,480 --> 00:22:20,960 Speaker 1: and I want this one set of, you know, this 460 00:22:21,040 --> 00:22:23,280 Speaker 1: behavior is always bad if you do this. If-then. 461 00:22:23,359 --> 00:22:27,240 Speaker 1: Exactly, exactly. Binary logic. That stuff 462 00:22:27,000 --> 00:22:29,800 Speaker 1: doesn't always work, and it ignores the realities 463 00:22:29,840 --> 00:22:32,919 Speaker 1: of societies, which is that people have different positions, roles, power; 464 00:22:33,320 --> 00:22:35,320 Speaker 1: all these things shape the dynamics. And the example I 465 00:22:35,320 --> 00:22:38,240 Speaker 1: always give is the sort of, like, the classic, you know, 466 00:22:38,359 --> 00:22:40,800 Speaker 1: one of the worst things that people do to threaten 467 00:22:40,880 --> 00:22:42,840 Speaker 1: people who are on the margins, who are vulnerable, or 468 00:22:43,000 --> 00:22:45,720 Speaker 1: ideas that are unpopular. You know, we're gonna 469 00:22:45,760 --> 00:22:48,119 Speaker 1: expose you, we're gonna, you know what they 470 00:22:48,160 --> 00:22:49,720 Speaker 1: call doxxing, we're going to publish your home address. 471 00:22:49,760 --> 00:22:50,840 Speaker 1: I mean, I've had this happen to me. You know, 472 00:22:51,080 --> 00:22:53,880 Speaker 1: the Gamergate people published my home address. Don't you have 473 00:22:53,960 --> 00:22:57,040 Speaker 1: your phone number on your site? I always thought that was insane, 474 00:22:57,359 --> 00:22:59,200 Speaker 1: you know.
So there's an interesting thing where part of 475 00:22:59,240 --> 00:23:01,040 Speaker 1: it is about reclaiming, like, I'm going to control 476 00:23:01,119 --> 00:23:04,239 Speaker 1: this stuff and what's out there. Some of it 477 00:23:04,320 --> 00:23:07,800 Speaker 1: is, I want to show people of good intent: 478 00:23:07,880 --> 00:23:11,159 Speaker 1: I am accessible, I'm not trying to hide, I'm 479 00:23:11,160 --> 00:23:13,520 Speaker 1: not trying to, you know, wall myself off, 480 00:23:13,560 --> 00:23:15,159 Speaker 1: because I do think most people are good and do 481 00:23:15,200 --> 00:23:17,560 Speaker 1: want to engage with ideas and want to be thoughtful. 482 00:23:18,000 --> 00:23:20,640 Speaker 1: And so that's just signaling, like, I'm open 483 00:23:20,680 --> 00:23:24,280 Speaker 1: to that. Then there's this other part, which is, 484 00:23:24,280 --> 00:23:26,879 Speaker 1: there's a behavior where, like, if somebody is legitimately 485 00:23:26,920 --> 00:23:30,199 Speaker 1: vulnerable and they are keeping themselves anonymous or pseudonymous in 486 00:23:30,280 --> 00:23:33,240 Speaker 1: order to keep themselves safe, and you out them, 487 00:23:33,280 --> 00:23:36,800 Speaker 1: that's a real danger. On the other hand, if you say, 488 00:23:37,000 --> 00:23:40,320 Speaker 1: the person who did this transgressive thing, this dangerous thing, 489 00:23:40,359 --> 00:23:43,639 Speaker 1: who posted this threat, I'm gonna identify them, that is 490 00:23:43,640 --> 00:23:46,720 Speaker 1: actually protecting people. Right. So the exact same activity, I'm 491 00:23:46,720 --> 00:23:49,120 Speaker 1: gonna identify this person who's trying to hide, can be 492 00:23:49,160 --> 00:23:51,320 Speaker 1: both very good or very bad.
It can be very 493 00:23:51,400 --> 00:23:53,680 Speaker 1: much a positive that helps society, or can be very 494 00:23:53,680 --> 00:23:57,560 Speaker 1: negative because you're making somebody vulnerable. And so the logic 495 00:23:58,040 --> 00:24:00,520 Speaker 1: that I think a lot of programmers tend towards, 496 00:24:00,600 --> 00:24:02,240 Speaker 1: like, if you 497 00:24:02,280 --> 00:24:04,520 Speaker 1: break this rule, you're bad, it 498 00:24:04,560 --> 00:24:08,240 Speaker 1: ignores power, it ignores the dynamics of society. And that's why 499 00:24:08,320 --> 00:24:11,840 Speaker 1: we keep bumping into what seem like really obvious 500 00:24:11,960 --> 00:24:13,920 Speaker 1: screw-ups in tech, where we're like, how come we can't 501 00:24:13,920 --> 00:24:15,639 Speaker 1: make a civil place? It's like, you're not going to 502 00:24:15,720 --> 00:24:18,240 Speaker 1: be able to regulate human behavior with a single binary 503 00:24:18,240 --> 00:24:22,120 Speaker 1: set of rules. Let's talk a little bit about diversity 504 00:24:22,240 --> 00:24:26,080 Speaker 1: and inclusion. This is all over the news. You were way, 505 00:24:26,119 --> 00:24:28,719 Speaker 1: way ahead of the curve on this. We have the 506 00:24:28,760 --> 00:24:32,520 Speaker 1: CEO of Uber being forced out. We have a number 507 00:24:32,560 --> 00:24:36,680 Speaker 1: of VCs having to step down, either from the boards 508 00:24:36,680 --> 00:24:39,960 Speaker 1: that they serve on or from their own companies. 509 00:24:40,000 --> 00:24:45,119 Speaker 1: It looks like the bro culture in Silicon Valley is 510 00:24:45,800 --> 00:24:49,119 Speaker 1: coming to an end. Tell us a little bit 511 00:24:49,160 --> 00:24:53,320 Speaker 1: about your perspective on diversity and inclusion in the world 512 00:24:53,320 --> 00:24:56,120 Speaker 1: of technology.
So there's definitely a moment of reckoning going 513 00:24:56,160 --> 00:24:58,680 Speaker 1: on right now. And for me, ten years ago, I 514 00:24:58,720 --> 00:25:00,520 Speaker 1: had been in Silicon Valley a couple 515 00:25:00,560 --> 00:25:02,600 Speaker 1: of years. Even though I'd been in tech twenty years, 516 00:25:02,600 --> 00:25:05,400 Speaker 1: I'd only spent a couple of years there. Because you were really 517 00:25:06,359 --> 00:25:08,320 Speaker 1: in New York. Yeah, yeah. So when did you... 518 00:25:08,359 --> 00:25:09,960 Speaker 1: were you here when they were doing Six 519 00:25:09,960 --> 00:25:12,160 Speaker 1: Apart and the company was growing? And I went out 520 00:25:12,359 --> 00:25:15,159 Speaker 1: to San Francisco in two thousand four and I 521 00:25:15,160 --> 00:25:19,000 Speaker 1: was there until two thousand seven. And really, one of 522 00:25:19,040 --> 00:25:21,800 Speaker 1: the drivers for me of sort of being done with 523 00:25:21,880 --> 00:25:24,199 Speaker 1: being in Silicon Valley and being in San Francisco was 524 00:25:24,840 --> 00:25:28,520 Speaker 1: it felt so homogeneous. And yeah, you know, 525 00:25:28,720 --> 00:25:30,160 Speaker 1: I think I'd been spoiled. I'd been in New York. 526 00:25:30,200 --> 00:25:34,080 Speaker 1: I had worked in media and publishing, where there are 527 00:25:34,119 --> 00:25:35,840 Speaker 1: so many different kinds of people, and, you know, I had people, 528 00:25:36,080 --> 00:25:38,320 Speaker 1: there were friends that were outside of tech, 529 00:25:38,359 --> 00:25:40,040 Speaker 1: in other industries; they could 530 00:25:40,040 --> 00:25:41,840 Speaker 1: be in fashion, they could be in media, they could be 531 00:25:41,920 --> 00:25:44,520 Speaker 1: in finance, whatever it was. And you just get 532 00:25:44,520 --> 00:25:47,440 Speaker 1: a different view of the world.
And it was astonishing 533 00:25:47,440 --> 00:25:48,960 Speaker 1: to me. It's like, with all the problems that, you know, 534 00:25:49,040 --> 00:25:52,040 Speaker 1: say, the entertainment industry had, it was still much more 535 00:25:52,080 --> 00:25:56,000 Speaker 1: inclusive and much more diverse. And so to go into 536 00:25:56,200 --> 00:25:59,560 Speaker 1: conference rooms, boardrooms, meeting rooms in Silicon Valley and look 537 00:25:59,600 --> 00:26:02,120 Speaker 1: around and be like, essentially, this is just a bunch 538 00:26:02,119 --> 00:26:05,960 Speaker 1: of white and Asian guys, and that doesn't count as diversity, 539 00:26:06,160 --> 00:26:08,080 Speaker 1: you know. Well, and it's interesting, because it's 540 00:26:08,080 --> 00:26:12,399 Speaker 1: particularly complicated being Asian American, where we're overrepresented; you know, 541 00:26:12,480 --> 00:26:14,520 Speaker 1: like, we over-index, we're two percent of the population. 542 00:26:14,560 --> 00:26:18,119 Speaker 1: And yet you look at just Indian American, you know, 543 00:26:18,359 --> 00:26:20,920 Speaker 1: people in tech: the CEO of Google, CEO of Microsoft, 544 00:26:21,000 --> 00:26:23,719 Speaker 1: CEO of Adobe, these are all Indian guys. It's like, 545 00:26:23,720 --> 00:26:27,000 Speaker 1: we're doing fine. And yet, you know, like I said, 546 00:26:27,000 --> 00:26:30,680 Speaker 1: being in media, publishing, entertainment, I was like, I know 547 00:26:31,000 --> 00:26:33,080 Speaker 1: that there are black and Latino folks in California. You 548 00:26:33,119 --> 00:26:35,520 Speaker 1: can't tell me they're not there. And yet you go 549 00:26:35,560 --> 00:26:38,000 Speaker 1: to these offices and it wasn't just technical staff; it 550 00:26:38,119 --> 00:26:42,720 Speaker 1: was the legal staff, the marketing staff, you know, everything, 551 00:26:42,840 --> 00:26:45,720 Speaker 1: every one of these roles.
The representation was way out 552 00:26:45,720 --> 00:26:48,520 Speaker 1: of whack, to a point where you couldn't ignore it anymore. 553 00:26:48,680 --> 00:26:51,040 Speaker 1: The number that was in one of your columns: 554 00:26:51,560 --> 00:26:58,920 Speaker 1: California's Hispanic population is... the average percentage of Hispanic employees 555 00:26:59,000 --> 00:27:01,879 Speaker 1: at tech companies headquartered there is less than five percent. 556 00:27:02,480 --> 00:27:05,960 Speaker 1: Google is three. And if you look at the industry 557 00:27:06,040 --> 00:27:09,119 Speaker 1: average for women, only a third of employees are 558 00:27:09,200 --> 00:27:12,520 Speaker 1: women, when women are more than half of the population. Right. 559 00:27:12,640 --> 00:27:14,720 Speaker 1: And so there's just this proportionality where, like, 560 00:27:14,840 --> 00:27:16,960 Speaker 1: at a certain point you can say, okay, it's not 561 00:27:17,160 --> 00:27:20,840 Speaker 1: an exact match, of course not. But if 562 00:27:20,840 --> 00:27:23,480 Speaker 1: you said, well, we looked at the 563 00:27:23,520 --> 00:27:25,840 Speaker 1: population and we could only find three percent of our 564 00:27:25,840 --> 00:27:28,159 Speaker 1: staff that could meet our requirements, I don't buy it, I 565 00:27:28,240 --> 00:27:30,040 Speaker 1: just don't buy it. The math doesn't add up. 566 00:27:30,080 --> 00:27:32,479 Speaker 1: Now, the pushback to that is, hey, you know, 567 00:27:32,720 --> 00:27:37,960 Speaker 1: you mentioned India: huge technical training in engineering, mathematics, science, 568 00:27:38,080 --> 00:27:42,080 Speaker 1: the Indian Institutes of Technology, go down the list, just tremendous. 569 00:27:42,119 --> 00:27:44,359 Speaker 1: And my other favorite stat is half of the 570 00:27:44,560 --> 00:27:48,680 Speaker 1: C-suite in Silicon Valley are immigrants.
So there's clearly some 571 00:27:48,800 --> 00:27:52,359 Speaker 1: recognition of meritocracy, at least in theory; that's the ideal. 572 00:27:52,400 --> 00:27:54,360 Speaker 1: And I mean, I could not be more pro-immigrant. 573 00:27:54,440 --> 00:27:56,919 Speaker 1: My parents are immigrants. They're here because, you know, 574 00:27:57,080 --> 00:27:59,920 Speaker 1: they were willing to do the work and get educated, 575 00:28:00,000 --> 00:28:02,920 Speaker 1: and, you know, I'm incredibly proud to be 576 00:28:03,320 --> 00:28:05,560 Speaker 1: of that descent and in that tradition. 577 00:28:06,119 --> 00:28:08,840 Speaker 1: That being said, you look at the cost of paying 578 00:28:08,880 --> 00:28:10,960 Speaker 1: for an H-1B, paying for the lawyers 579 00:28:10,960 --> 00:28:12,560 Speaker 1: to bring somebody over here, paying for all these 580 00:28:12,560 --> 00:28:14,960 Speaker 1: things to bring those workers over here; it's a huge 581 00:28:15,000 --> 00:28:18,520 Speaker 1: investment for these companies, huge. And for the same amount 582 00:28:18,560 --> 00:28:21,240 Speaker 1: of money, you could train workers here, 583 00:28:21,600 --> 00:28:24,359 Speaker 1: you could train people from underrepresented communities. So 584 00:28:24,440 --> 00:28:27,920 Speaker 1: what's the thinking, that's a sure thing and training them 585 00:28:28,000 --> 00:28:29,720 Speaker 1: is a risk? You know, well, I think there are a 586 00:28:29,720 --> 00:28:31,720 Speaker 1: lot of factors.
I think one of them is, you know, 587 00:28:31,840 --> 00:28:35,080 Speaker 1: H-1B workers can never organize or really complain, 588 00:28:35,240 --> 00:28:38,080 Speaker 1: because, you know, essentially, if you get too 589 00:28:38,160 --> 00:28:41,240 Speaker 1: uppity, you lose your job, you get deported, and 590 00:28:41,240 --> 00:28:43,200 Speaker 1: then you're not sending money back to your family, to 591 00:28:43,240 --> 00:28:45,320 Speaker 1: your village or whatever else. So the amount of leverage 592 00:28:45,360 --> 00:28:49,000 Speaker 1: they have over these workers is incredible. I have 593 00:28:49,120 --> 00:28:51,320 Speaker 1: to think that's a factor. You know, you look, 594 00:28:51,360 --> 00:28:54,160 Speaker 1: particularly in the case of, like, there was this 595 00:28:54,280 --> 00:28:58,120 Speaker 1: collusion orchestrated by Steve Jobs and Eric Schmidt. Google, 596 00:28:58,240 --> 00:29:01,800 Speaker 1: Apple, four of the biggest tech companies, 597 00:29:01,880 --> 00:29:05,600 Speaker 1: essentially everybody except for Facebook, colluded against their own workers, 598 00:29:05,880 --> 00:29:07,760 Speaker 1: right, to depress their wages, by saying we're not going 599 00:29:07,840 --> 00:29:11,440 Speaker 1: to poach each other's workers, which is still 600 00:29:11,640 --> 00:29:15,320 Speaker 1: astonishing to me. And there was almost no ramification. 601 00:29:15,360 --> 00:29:17,280 Speaker 1: This is an amazing thing. The workers, like, there 602 00:29:17,320 --> 00:29:19,640 Speaker 1: was a sort of obligatory lawsuit, I think it was 603 00:29:19,680 --> 00:29:21,480 Speaker 1: for eight billion dollars or something, and they settled it for 604 00:29:21,520 --> 00:29:26,400 Speaker 1: a tiny pittance. And, you know, the people 605 00:29:26,440 --> 00:29:28,280 Speaker 1: at Apple who went through this...
I talked to people 606 00:29:28,320 --> 00:29:29,680 Speaker 1: that work there and they were like, yeah, I still 607 00:29:29,720 --> 00:29:31,400 Speaker 1: just really love and revere Steve. And I'm like, the 608 00:29:31,480 --> 00:29:34,000 Speaker 1: guy took money out of your kids' college funds, and 609 00:29:34,040 --> 00:29:36,120 Speaker 1: you're like, he seems like a nice guy. Like, what 610 00:29:36,200 --> 00:29:38,440 Speaker 1: would he have to do for you to be like, 611 00:29:38,480 --> 00:29:40,480 Speaker 1: I'm not gonna, you know, go along with this? You'd 612 00:29:40,520 --> 00:29:43,840 Speaker 1: have to remove the headphone jack from the iPhone. That is a 613 00:29:43,880 --> 00:29:47,080 Speaker 1: bridge too far, right. That's the line. And what you realize 614 00:29:47,160 --> 00:29:49,920 Speaker 1: is, you know, when there's that reverence, 615 00:29:49,960 --> 00:29:51,479 Speaker 1: they're like, who am I to complain? I am 616 00:29:51,520 --> 00:29:53,640 Speaker 1: getting a good salary, I am being paid well. And 617 00:29:53,640 --> 00:29:55,560 Speaker 1: what they don't see is common cause with those 618 00:29:55,560 --> 00:29:58,320 Speaker 1: H-1B workers, or common cause with, you know, 619 00:29:58,360 --> 00:30:01,800 Speaker 1: the contractors, I say contractors in quotes, who are providing 620 00:30:01,800 --> 00:30:05,840 Speaker 1: the meals, driving the shuttles to the office, and giving 621 00:30:05,840 --> 00:30:08,120 Speaker 1: them free massages. All those things are presented as 622 00:30:08,160 --> 00:30:09,760 Speaker 1: perks: you get free lunch, you get these things. 623 00:30:09,800 --> 00:30:12,440 Speaker 1: Like, these people are workers, and they never get 624 00:30:12,480 --> 00:30:15,400 Speaker 1: equity in these companies.
And so you realize that there's 625 00:30:15,440 --> 00:30:18,040 Speaker 1: this sort of, you know, this class system built in. 626 00:30:18,240 --> 00:30:20,800 Speaker 1: And I'm just a big believer in, like, 627 00:30:20,840 --> 00:30:23,160 Speaker 1: you've got to treat your people well, and they certainly 628 00:30:23,200 --> 00:30:25,280 Speaker 1: have the money to do it, and they didn't need 629 00:30:25,360 --> 00:30:27,240 Speaker 1: to nickel-and-dime their workers, but they did. That's kind 630 00:30:27,240 --> 00:30:29,120 Speaker 1: of odd, because there was no reason; they have more 631 00:30:29,160 --> 00:30:32,040 Speaker 1: cash than all these companies, and more cash... I have 632 00:30:32,120 --> 00:30:34,280 Speaker 1: to point out that the only reason Facebook did not 633 00:30:34,400 --> 00:30:37,360 Speaker 1: participate in it is, A, they were too young when 634 00:30:37,600 --> 00:30:40,480 Speaker 1: that deal was originally made, and, B, they were busy 635 00:30:40,560 --> 00:30:43,880 Speaker 1: raiding Google for some of their favorite engineers. As 636 00:30:43,880 --> 00:30:46,239 Speaker 1: well as Apple. Let's talk a little bit about what 637 00:30:46,240 --> 00:30:50,040 Speaker 1: Fog Creek is doing before we have you repair Twitter. 638 00:30:50,600 --> 00:30:53,080 Speaker 1: Tell us about Glitch and some of 639 00:30:53,120 --> 00:30:56,440 Speaker 1: the other products. Fog Creek is a very storied company; 640 00:30:56,480 --> 00:30:59,280 Speaker 1: it's been around seventeen years. Brilliant co-founders, Joel Spolsky, 641 00:30:59,400 --> 00:31:01,920 Speaker 1: Michael Pryor, and they always wanted to build 642 00:31:02,000 --> 00:31:04,239 Speaker 1: a great place for people who want to make 643 00:31:04,280 --> 00:31:06,680 Speaker 1: the most interesting technology to come work. And it's been 644 00:31:06,680 --> 00:31:09,320 Speaker 1: a very influential company that has spun things out.
They 645 00:31:09,440 --> 00:31:12,040 Speaker 1: co-created Stack Overflow, which is a community that pretty much 646 00:31:12,040 --> 00:31:14,280 Speaker 1: every coder in the world uses to answer questions 647 00:31:14,280 --> 00:31:18,280 Speaker 1: about programming. Did that eventually become, like, a white-labeled 648 00:31:18,360 --> 00:31:19,960 Speaker 1: version, where if you want to set up your own 649 00:31:19,960 --> 00:31:22,760 Speaker 1: internal Q&A, you can do that? Yeah, 650 00:31:22,800 --> 00:31:25,120 Speaker 1: for sure, Stack Overflow has an enterprise product for that. 651 00:31:25,160 --> 00:31:26,480 Speaker 1: I'm just on the board of Stack, so I 652 00:31:26,480 --> 00:31:28,480 Speaker 1: get to see a lot of what they're doing. 653 00:31:28,480 --> 00:31:30,120 Speaker 1: And the bigger thing to me is you have this 654 00:31:30,160 --> 00:31:33,160 Speaker 1: community where tens of millions of coders around the world 655 00:31:33,560 --> 00:31:36,600 Speaker 1: come and answer each other's questions in this really collaborative way. 656 00:31:37,240 --> 00:31:42,560 Speaker 1: And interestingly, it's a very, you know, reassuring, nurturing, 657 00:31:42,560 --> 00:31:47,200 Speaker 1: supportive environment. People really get answers right away, problems get 658 00:31:47,480 --> 00:31:49,920 Speaker 1: solved. And, you know, I think historically one 659 00:31:49,960 --> 00:31:52,240 Speaker 1: of the biggest challenges was they probably weren't friendly enough 660 00:31:52,240 --> 00:31:53,840 Speaker 1: to newbies, because they were like, you have to phrase 661 00:31:53,880 --> 00:31:55,360 Speaker 1: your question the right way and it has to be, 662 00:31:55,440 --> 00:31:57,560 Speaker 1: you know, exactly right.
I think that's starting to ease 663 00:31:57,640 --> 00:31:59,040 Speaker 1: up a little bit as, you know, 664 00:31:59,080 --> 00:32:01,320 Speaker 1: the community evolves to be more welcoming. But the key 665 00:32:01,360 --> 00:32:04,560 Speaker 1: thing is, there's never been wide-scale harassment, there's never 666 00:32:04,640 --> 00:32:06,400 Speaker 1: really been wide-scale abuse. And that's for a site that 667 00:32:06,520 --> 00:32:09,000 Speaker 1: is probably in the top forty websites in the world 668 00:32:09,040 --> 00:32:11,120 Speaker 1: in terms of traffic. So it's possible to make a 669 00:32:11,160 --> 00:32:13,920 Speaker 1: large site that works well. So Stack was co- 670 00:32:14,040 --> 00:32:15,959 Speaker 1: created with Jeff Atwood and his team, and that's 671 00:32:15,960 --> 00:32:18,200 Speaker 1: spun out; it's an independent company. They made a 672 00:32:18,240 --> 00:32:20,719 Speaker 1: project management tool, Trello, which lots of people use. It's 673 00:32:20,760 --> 00:32:23,520 Speaker 1: really popular, and that spun out too. 674 00:32:23,760 --> 00:32:26,760 Speaker 1: So there's this track record of making these wildly successful products. 675 00:32:27,440 --> 00:32:31,040 Speaker 1: And, you know, they approached me about taking over 676 00:32:31,080 --> 00:32:32,960 Speaker 1: as CEO last year, and I said, well, you know, 677 00:32:33,000 --> 00:32:35,480 Speaker 1: what's coming down the pipe? And I saw what 678 00:32:35,720 --> 00:32:38,320 Speaker 1: became Glitch, which we launched earlier this year, and I 679 00:32:38,400 --> 00:32:40,320 Speaker 1: was just blown away. I think it's one of the 680 00:32:40,360 --> 00:32:43,600 Speaker 1: most revolutionary products I've seen in my life, in 681 00:32:43,600 --> 00:32:48,800 Speaker 1: my whole career.
And so what Glitch is, very simply, 682 00:32:48,840 --> 00:32:51,040 Speaker 1: is a programming environment where you can go 683 00:32:51,080 --> 00:32:53,840 Speaker 1: and write code. And there's a couple of things 684 00:32:53,880 --> 00:32:56,440 Speaker 1: it does that nobody has done before. The first is, 685 00:32:56,640 --> 00:32:59,800 Speaker 1: as you type, it's automatically taking the code you're writing, 686 00:32:59,800 --> 00:33:02,240 Speaker 1: the app that you're creating, and publishing it live to the web. 687 00:33:03,000 --> 00:33:05,800 Speaker 1: That sounds trivial, but, you know, there are entire 688 00:33:05,840 --> 00:33:10,840 Speaker 1: businesses like Amazon's web services hosting business, or Heroku, which 689 00:33:11,000 --> 00:33:13,720 Speaker 1: is sort of beloved by coders, and the whole process 690 00:33:13,720 --> 00:33:16,120 Speaker 1: of getting an app live onto the web is really hard. 691 00:33:16,160 --> 00:33:19,120 Speaker 1: It's become very complicated. To do it completely automatically in 692 00:33:19,160 --> 00:33:22,160 Speaker 1: the background is a radical change. So is this an app? 693 00:33:22,200 --> 00:33:24,000 Speaker 1: When we talk about apps, I think about the 694 00:33:24,000 --> 00:33:26,240 Speaker 1: App Store and the Android stores. Right now, web apps. 695 00:33:26,280 --> 00:33:27,840 Speaker 1: So these are things that you go to in your browser. 696 00:33:27,840 --> 00:33:29,120 Speaker 1: So if you want to make a little, you know, 697 00:33:29,400 --> 00:33:31,520 Speaker 1: simple app for your business, say you're gonna make an 698 00:33:31,520 --> 00:33:33,400 Speaker 1: expensing app for your team to use, or you 699 00:33:33,520 --> 00:33:35,240 Speaker 1: make a to-do-list app or something like that. 700 00:33:35,480 --> 00:33:37,520 Speaker 1: It can be as complicated as you want.
But the 701 00:33:37,520 --> 00:33:39,360 Speaker 1: problem is, even if you know 702 00:33:39,440 --> 00:33:41,360 Speaker 1: how to write the code and you've built that whole 703 00:33:41,360 --> 00:33:44,840 Speaker 1: thing yourself, just getting it onto the web was a 704 00:33:44,880 --> 00:33:46,920 Speaker 1: lot of work, and it was just a pain. So 705 00:33:46,960 --> 00:33:48,760 Speaker 1: we took all that away, and that was step one. 706 00:33:49,240 --> 00:33:52,160 Speaker 1: Step two was, your entire coding environment, everything, just lives 707 00:33:52,160 --> 00:33:53,920 Speaker 1: in the browser. And that's easier for the same reason 708 00:33:53,920 --> 00:33:56,080 Speaker 1: that, like, you'd like to use Gmail instead of having 709 00:33:56,360 --> 00:33:58,640 Speaker 1: a mail app on your computer; it's 710 00:33:58,680 --> 00:34:01,400 Speaker 1: just there, and whatever computer I log into, all my stuff is there. 711 00:34:02,240 --> 00:34:05,880 Speaker 1: Those alone were a big leap forward. The biggest things 712 00:34:05,880 --> 00:34:07,479 Speaker 1: that happened, the two things that sort of came out 713 00:34:07,480 --> 00:34:10,279 Speaker 1: earlier this year: the first is, we made almost an 714 00:34:10,280 --> 00:34:12,440 Speaker 1: app store, a catalog of all the different apps people 715 00:34:12,440 --> 00:34:14,760 Speaker 1: have built, and you can remix any of them. So 716 00:34:14,760 --> 00:34:16,440 Speaker 1: say somebody already made a to-do-list app, 717 00:34:16,480 --> 00:34:17,759 Speaker 1: and you say, well, that's nice, but I want it to 718 00:34:17,800 --> 00:34:19,880 Speaker 1: be blue instead of green. You go in there, you 719 00:34:20,000 --> 00:34:22,080 Speaker 1: edit it, and it's instantly yours; you can remix it, do 720 00:34:22,080 --> 00:34:23,840 Speaker 1: whatever you want to with it.
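The kind of tiny web app being described, the sort of thing Glitch puts live the moment you finish typing it, can be sketched as follows. This is a minimal illustration in Python (Glitch projects are typically Node), and everything in it, the handler, the greeting, the serve helper, is invented for the example; it is not Glitch's actual API.

```python
# A minimal "hello" web app of the sort described above. Glitch hosts
# and restarts apps like this automatically as you type; here we start
# the server by hand. All names below are illustrative only.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from my remixed app!")

    def log_message(self, fmt, *args):
        pass  # silence per-request logging for the demo

def serve(port=0):
    """Start the app on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), HelloHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

On Glitch the deploy step is continuous and invisible; the point of the sketch is only how little code a whole "app" can be once hosting is somebody else's problem.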
So it's really the 721 00:34:23,880 --> 00:34:25,719 Speaker 1: power, the promise of open source that we've had for 722 00:34:25,760 --> 00:34:28,520 Speaker 1: many years. And then the thing that blew my mind 723 00:34:28,560 --> 00:34:31,319 Speaker 1: when I saw what the team had built was, you 724 00:34:31,360 --> 00:34:33,720 Speaker 1: can edit this code in real time with other people. 725 00:34:34,120 --> 00:34:36,000 Speaker 1: So, like Google Docs. I was about to say, that 726 00:34:36,080 --> 00:34:38,880 Speaker 1: makes me think of Google Docs, but for applications. Exactly, 727 00:34:39,719 --> 00:34:43,279 Speaker 1: applications, that's exactly it. So nobody has made that kind 728 00:34:43,320 --> 00:34:47,640 Speaker 1: of powerful, real programming environment that is multiplayer. What's 729 00:34:47,680 --> 00:34:49,760 Speaker 1: the business model on this? You're selling it to enterprise? 730 00:34:49,840 --> 00:34:52,399 Speaker 1: There's a couple parts to it. The first 731 00:34:52,400 --> 00:34:55,600 Speaker 1: thing we're doing is incidental to the fact that 732 00:34:55,640 --> 00:34:57,560 Speaker 1: you can code at the same time as other people, 733 00:34:57,840 --> 00:35:00,320 Speaker 1: which is an incredible learning tool, a teaching tool, because you 734 00:35:00,320 --> 00:35:02,399 Speaker 1: can help people. So right now, all of the big 735 00:35:02,400 --> 00:35:05,680 Speaker 1: companies have what they call their APIs, application programming interfaces, 736 00:35:05,719 --> 00:35:07,800 Speaker 1: and this is the way that you build services on 737 00:35:07,920 --> 00:35:10,840 Speaker 1: top of Stripe for payments, or Twilio for messages, or 738 00:35:11,280 --> 00:35:15,080 Speaker 1: Twitter for sending messages, obviously. But it's 739 00:35:15,080 --> 00:35:17,399 Speaker 1: really hard for them to get developers to try out 740 00:35:17,400 --> 00:35:19,400 Speaker 1: their tools.
So if you say, we have a new 741 00:35:19,440 --> 00:35:21,680 Speaker 1: developer platform and we want people to use it. For example, 742 00:35:21,840 --> 00:35:24,560 Speaker 1: Amazon has skills for the Alexa; you want to make 743 00:35:24,600 --> 00:35:27,120 Speaker 1: new commands that work there. Or Slack has bots in 744 00:35:27,160 --> 00:35:29,920 Speaker 1: their messaging app. They're desperate for developers. We just added a 745 00:35:29,960 --> 00:35:34,480 Speaker 1: whole bunch of Slack bots, like the birthdays one, and it's 746 00:35:34,520 --> 00:35:36,239 Speaker 1: just really interesting. Now you see that and you're like, wow, 747 00:35:36,280 --> 00:35:38,120 Speaker 1: I wish it would talk to this other system we're using, 748 00:35:38,719 --> 00:35:41,000 Speaker 1: like, I can never do that, that's too hard. The 749 00:35:41,120 --> 00:35:44,520 Speaker 1: process with Glitch now is, we've got a sample Slack 750 00:35:44,600 --> 00:35:46,560 Speaker 1: bot for you. You go and you remix it. You 751 00:35:46,640 --> 00:35:48,400 Speaker 1: change the part that works with your system to 752 00:35:48,400 --> 00:35:50,600 Speaker 1: be exactly what you want it to be, and it's instantly 753 00:35:50,680 --> 00:35:52,880 Speaker 1: up and running. So we take the time to develop 754 00:35:52,920 --> 00:35:56,160 Speaker 1: a Slack bot or Alexa skill from days or 755 00:35:56,239 --> 00:35:59,640 Speaker 1: hours down to minutes, and that is something that's 756 00:35:59,719 --> 00:36:02,000 Speaker 1: enormous value. All these companies have launched platforms, so they 757 00:36:02,120 --> 00:36:04,399 Speaker 1: want people to use them, and they'll pay 758 00:36:04,480 --> 00:36:05,880 Speaker 1: to do a couple of different things. The key one 759 00:36:05,920 --> 00:36:08,520 Speaker 1: is supporting developers creating these things.
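The remix workflow described here, take a working sample and change only the part specific to your own system, can be sketched abstractly. The fields and URLs below are all made up for illustration; they are not Slack's or Glitch's actual configuration format.

```python
# Hypothetical sketch of "remixing": copy a sample project wholesale,
# then override only the pieces specific to your own system.
SAMPLE_BOT = {
    "name": "sample-birthday-bot",
    "greeting": "Happy birthday, {user}!",
    "webhook_url": "https://example.invalid/sample-hook",  # placeholder
}

def remix(template, **overrides):
    """Return a copy of the template with selected fields replaced."""
    project = dict(template)
    project.update(overrides)
    return project

# Everything carries over except the one part you point at your system.
my_bot = remix(SAMPLE_BOT, webhook_url="https://example.invalid/my-hook")
```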
So right now, if 760 00:36:08,520 --> 00:36:10,640 Speaker 1: you're like, I want to build on, you know, Alexa's 761 00:36:10,920 --> 00:36:13,600 Speaker 1: skill set from Amazon... So it's Amazon buying this, or 762 00:36:13,800 --> 00:36:17,160 Speaker 1: Twitter buying this, not necessarily an enterprise company looking to 763 00:36:17,760 --> 00:36:21,800 Speaker 1: create their own unique app? Everybody's desperate to make things easier 764 00:36:21,840 --> 00:36:24,359 Speaker 1: for developers, because developers are so in demand and it's 765 00:36:24,400 --> 00:36:26,120 Speaker 1: so rare to get them to pay attention to your platform. 766 00:36:26,160 --> 00:36:28,080 Speaker 1: And I look at it, there's a search tool called 767 00:36:28,120 --> 00:36:30,000 Speaker 1: Algolia that I just love. Search is one of those 768 00:36:30,239 --> 00:36:33,080 Speaker 1: things that's really hard to do well. Like, I'm not Google, 769 00:36:33,120 --> 00:36:34,880 Speaker 1: I'm not going to figure this thing out. I had 770 00:36:34,920 --> 00:36:37,600 Speaker 1: always been interested in trying it, and I thought, 771 00:36:37,760 --> 00:36:40,200 Speaker 1: if I could tie that search feature into my app, 772 00:36:40,239 --> 00:36:42,640 Speaker 1: it would be really useful. And I was like, but, you know, 773 00:36:42,680 --> 00:36:44,480 Speaker 1: I'm running a company, I'm busy. I don't have time 774 00:36:44,520 --> 00:36:45,800 Speaker 1: to learn all this stuff, even if I've got a 775 00:36:45,840 --> 00:36:48,920 Speaker 1: little bit of coding skills. Now I can go remix 776 00:36:48,960 --> 00:36:51,800 Speaker 1: an example app from Algolia that already has search working, 777 00:36:52,440 --> 00:36:54,000 Speaker 1: just plug in the parts that I want into 778 00:36:54,040 --> 00:36:55,880 Speaker 1: my own app, and be up and running instantly.
That 779 00:36:56,080 --> 00:36:58,160 Speaker 1: was something where I was like, it took it from 780 00:36:58,440 --> 00:36:59,800 Speaker 1: "it would be cool to try this out, I have 781 00:36:59,880 --> 00:37:02,600 Speaker 1: this intent of learning this programming skill" to "I can 782 00:37:02,680 --> 00:37:05,560 Speaker 1: deploy it instantly." And the biggest thing I see is, 783 00:37:05,920 --> 00:37:08,680 Speaker 1: the coders who we show it to, their eyes 784 00:37:08,760 --> 00:37:10,560 Speaker 1: light up, like it looks like Christmas morning for them. 785 00:37:10,600 --> 00:37:12,560 Speaker 1: And I'm like, that is a really good sign that 786 00:37:12,640 --> 00:37:15,040 Speaker 1: we're onto something big with Glitch. So that 787 00:37:15,160 --> 00:37:18,919 Speaker 1: sounds really fascinating. You mentioned Twitter earlier. Let's 788 00:37:18,960 --> 00:37:23,279 Speaker 1: talk about what you would do to fix Twitter. I've 789 00:37:23,320 --> 00:37:26,640 Speaker 1: been amazed that, while there have been a series 790 00:37:26,680 --> 00:37:30,200 Speaker 1: of updates and they continue to improve the interface and 791 00:37:30,320 --> 00:37:34,120 Speaker 1: some of the general behaviors, the broader concept behind the community 792 00:37:34,360 --> 00:37:36,960 Speaker 1: they just can't seem to wrap their heads around. 793 00:37:37,040 --> 00:37:39,719 Speaker 1: It's a hard thing. And I do have a lot 794 00:37:39,800 --> 00:37:41,680 Speaker 1: of empathy for the team; I do think they're trying. 795 00:37:41,840 --> 00:37:44,320 Speaker 1: But I think, you know, it took so long for 796 00:37:44,400 --> 00:37:46,920 Speaker 1: them to turn the ship around to really addressing a 797 00:37:47,000 --> 00:37:49,000 Speaker 1: lot of the issues. And that's the question, isn't it? 798 00:37:49,200 --> 00:37:51,200 Speaker 1: Why did it take so long?
For them to 799 00:37:51,360 --> 00:37:54,480 Speaker 1: notice that they had a giant harassment problem? You know, 800 00:37:54,640 --> 00:37:57,080 Speaker 1: I think there's a lot of reasons for that. 801 00:37:57,280 --> 00:38:01,839 Speaker 1: I think one of the key issues is, who 802 00:38:01,920 --> 00:38:04,360 Speaker 1: feels the pain, right? So we talked about the inclusion 803 00:38:04,360 --> 00:38:07,560 Speaker 1: and diversity issue in tech, and, you know, Twitter is 804 00:38:07,640 --> 00:38:09,480 Speaker 1: like most of the tech companies: it doesn't have a 805 00:38:09,560 --> 00:38:12,280 Speaker 1: lot of people from, you know, black and Latino communities, 806 00:38:12,320 --> 00:38:14,560 Speaker 1: doesn't have as many women. And guess who gets targeted 807 00:38:14,600 --> 00:38:18,560 Speaker 1: by the majority of harassment online. See, let 808 00:38:18,640 --> 00:38:20,440 Speaker 1: me stop you there, because as a white dude in 809 00:38:20,440 --> 00:38:25,040 Speaker 1: New York City, perhaps I'm not experiencing the same thing. However, 810 00:38:25,360 --> 00:38:28,399 Speaker 1: I'm kind of thick-skinned; I'm more annoyed by 811 00:38:28,480 --> 00:38:34,279 Speaker 1: the failed logical deductions than by people calling me ugly. But what 812 00:38:34,440 --> 00:38:38,120 Speaker 1: I'm amazed at is, I'm seeing 813 00:38:38,680 --> 00:38:43,920 Speaker 1: all this harassment mostly on a partisan basis, name-calling 814 00:38:44,000 --> 00:38:48,759 Speaker 1: and stupidity, just craziness, which I think discourages new users.
Oh, 815 00:38:48,800 --> 00:38:52,359 Speaker 1: I don't want to go there. Yeah, absolutely. And so it's 816 00:38:52,400 --> 00:38:56,160 Speaker 1: a little more moderated on Facebook, because people 817 00:38:56,520 --> 00:38:59,480 Speaker 1: theoretically are using their real names, or otherwise they have 818 00:38:59,520 --> 00:39:01,480 Speaker 1: to go through the whole process of creating a fake name. 819 00:39:01,560 --> 00:39:04,120 Speaker 1: And I know people certainly do that, but there's 820 00:39:04,160 --> 00:39:07,319 Speaker 1: some barrier, right? It's harder to do. So what would 821 00:39:07,360 --> 00:39:10,080 Speaker 1: you do to fix Twitter? Yeah, there are a 822 00:39:10,080 --> 00:39:12,320 Speaker 1: couple of things. I actually wrote a piece back in 823 00:39:12,960 --> 00:39:16,600 Speaker 1: January, when Jack Dorsey asked, I guess, the whole internet, 824 00:39:16,880 --> 00:39:18,960 Speaker 1: how would you fix Twitter? And there 825 00:39:19,000 --> 00:39:21,000 Speaker 1: were a couple of things I wrote in there, and 826 00:39:21,360 --> 00:39:24,719 Speaker 1: the more cogent ones... one of them was, 827 00:39:26,400 --> 00:39:28,239 Speaker 1: show people you know how to update the service and 828 00:39:28,320 --> 00:39:31,440 Speaker 1: update the apps. Just the ability to iterate and introduce 829 00:39:31,520 --> 00:39:34,400 Speaker 1: new things would introduce a lot of trust into the platform.
830 00:39:34,440 --> 00:39:37,040 Speaker 1: Because they hadn't shipped any features, they hadn't updated anything, 831 00:39:37,280 --> 00:39:39,360 Speaker 1: so you can make whatever proclamations you want to about 832 00:39:39,360 --> 00:39:42,440 Speaker 1: we're fixing abuse, we're fixing harassment, we're adding better features 833 00:39:42,560 --> 00:39:45,160 Speaker 1: or filtering or whatever, and people don't believe it, because, well, 834 00:39:45,200 --> 00:39:47,239 Speaker 1: the thing hasn't changed at all in a long time, 835 00:39:47,680 --> 00:39:49,560 Speaker 1: you know. And in fact, they were killing off things 836 00:39:49,640 --> 00:39:52,120 Speaker 1: like Vine, which was really great and creative and interesting 837 00:39:52,200 --> 00:39:55,680 Speaker 1: and didn't have a harassment problem. And, you know, people said, okay, 838 00:39:55,680 --> 00:39:57,719 Speaker 1: well, these things that make you feel good, 839 00:39:57,800 --> 00:40:00,279 Speaker 1: those are getting killed, and the things that are making 840 00:40:00,280 --> 00:40:02,480 Speaker 1: me feel terrible, you're, you know, doubling down on. 841 00:40:02,920 --> 00:40:04,480 Speaker 1: So what I think they've gotten better at, as they've 842 00:40:04,520 --> 00:40:06,520 Speaker 1: sort of started shipping more, is to sort of 843 00:40:07,160 --> 00:40:09,880 Speaker 1: be really clear about a harassment policy. So instead of 844 00:40:09,920 --> 00:40:12,080 Speaker 1: these nebulous, vague rules, we have to see, like, 845 00:40:12,160 --> 00:40:16,160 Speaker 1: this person who's very obviously transgressing, being abusive and horrible 846 00:40:16,200 --> 00:40:19,160 Speaker 1: to people, we've got to see their account get suspended. 847 00:40:19,840 --> 00:40:21,640 Speaker 1: That happened not too long ago.
I forget his 848 00:40:21,760 --> 00:40:30,280 Speaker 1: name... it's Milo Yiannopoulos, right, notorious troll 849 00:40:30,560 --> 00:40:34,480 Speaker 1: and harasser. And he finally... again, you have to go 850 00:40:34,719 --> 00:40:38,920 Speaker 1: so far, so repeatedly, before your account is suspended. It's not 851 00:40:39,080 --> 00:40:41,880 Speaker 1: just that he transgressed so frequently, so often; it's that 852 00:40:42,000 --> 00:40:44,319 Speaker 1: he was openly explicit about the fact he was trying 853 00:40:44,360 --> 00:40:46,640 Speaker 1: to harass people, right. You know, and these are 854 00:40:46,640 --> 00:40:50,600 Speaker 1: the things where, well, 855 00:40:50,719 --> 00:40:53,359 Speaker 1: the fact that he stands out as someone who's 856 00:40:53,440 --> 00:40:56,960 Speaker 1: banned, and there really isn't anyone else, says a lot. First of all, there's a boatload 857 00:40:57,040 --> 00:41:00,040 Speaker 1: of bots, just an insane amount; I assume half my 858 00:41:00,120 --> 00:41:03,200 Speaker 1: followers are software algos. And then on top 859 00:41:03,239 --> 00:41:05,640 Speaker 1: of that, it seems that there are some people who 860 00:41:05,760 --> 00:41:09,520 Speaker 1: are just, you know, making calls to violence and just all 861 00:41:09,600 --> 00:41:12,960 Speaker 1: sorts of things. How do they tolerate this? I think 862 00:41:13,000 --> 00:41:14,360 Speaker 1: one of the things it's easy to lose track of 863 00:41:14,400 --> 00:41:16,920 Speaker 1: outside of Silicon Valley is how extreme they are about 864 00:41:17,000 --> 00:41:20,359 Speaker 1: some parts of libertarianism around these views. Right, 865 00:41:20,400 --> 00:41:23,480 Speaker 1: so they're very, like, everything is free speech, everything is 866 00:41:23,480 --> 00:41:25,640 Speaker 1: fair game to put up, no one saying 867 00:41:25,719 --> 00:41:28,160 Speaker 1: you can't do this.
But we're a private company, 868 00:41:28,800 --> 00:41:31,239 Speaker 1: this is our product, and you could go do this 869 00:41:31,320 --> 00:41:34,640 Speaker 1: elsewhere. You're free to be obnoxious and offensive 870 00:41:34,719 --> 00:41:38,239 Speaker 1: wherever you want, just not in our private community. Well, 871 00:41:38,280 --> 00:41:40,000 Speaker 1: the interesting thing, I think there's a couple of parts 872 00:41:40,040 --> 00:41:42,200 Speaker 1: they sort of ignore, one of which is, you know, 873 00:41:42,440 --> 00:41:44,680 Speaker 1: there's often the argument that technically it's too 874 00:41:44,719 --> 00:41:46,880 Speaker 1: hard to limit these things. And 875 00:41:46,920 --> 00:41:49,200 Speaker 1: I say, listen, you know, go ahead and upload a 876 00:41:49,200 --> 00:41:52,719 Speaker 1: Beyonce MP3. Quickly, they can detect that. So 877 00:41:52,800 --> 00:41:55,920 Speaker 1: the tech is there, right? It's not harder to 878 00:41:56,040 --> 00:42:00,319 Speaker 1: detect, you know, text, a racial epithet, key phrases, than 879 00:42:00,400 --> 00:42:03,200 Speaker 1: it is to detect a sound signature in a song, right? 880 00:42:03,280 --> 00:42:07,520 Speaker 1: And especially as machine learning 881 00:42:07,600 --> 00:42:10,239 Speaker 1: is making leaps and bounds in advancements, that can 882 00:42:10,280 --> 00:42:12,120 Speaker 1: be something where you can get better; AI 883 00:42:12,160 --> 00:42:13,960 Speaker 1: should really be all over this. And I think that is 884 00:42:14,080 --> 00:42:16,160 Speaker 1: starting to seep into Twitter a little bit; they're getting 885 00:42:16,160 --> 00:42:18,440 Speaker 1: a little bit better at hiding and flagging things. 886 00:42:18,600 --> 00:42:20,200 Speaker 1: And it's hard to say, but that's part of this: 887 00:42:20,280 --> 00:42:22,680 Speaker 1: they need to communicate clearly about it.
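The point above, that matching known abusive text is, if anything, computationally easier than fingerprinting a song's audio, can be illustrated with a toy filter. The deny-list and function below are invented for illustration; a real moderation system would use trained classifiers, context, and human review, not a bare substring check.

```python
# Toy illustration: flagging known abusive phrases in text is a simple
# lookup, far simpler than matching an MP3's audio signature.
# This deny-list is a placeholder, not any platform's real list.
DENY_LIST = ["example slur", "example threat"]

def flag_text(text):
    """Return the deny-listed phrases found in text, case-insensitively."""
    lowered = text.lower()
    return [phrase for phrase in DENY_LIST if phrase in lowered]
```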
The other 888 00:42:22,800 --> 00:42:25,920 Speaker 1: part is, you know, we always talk about the 889 00:42:25,920 --> 00:42:28,160 Speaker 1: free speech of the people who are harassing and abusing, 890 00:42:28,280 --> 00:42:31,200 Speaker 1: but what they do is they chase off vulnerable voices, 891 00:42:31,239 --> 00:42:32,759 Speaker 1: and what about the free speech of the people who 892 00:42:32,800 --> 00:42:36,360 Speaker 1: are chased off? Exactly. And I see this, where, like, you know, 893 00:42:36,480 --> 00:42:39,839 Speaker 1: I have extremely thick skin, I am a loudmouth all 894 00:42:39,880 --> 00:42:42,279 Speaker 1: the time, so I'm pretty hard to shut up. 895 00:42:42,800 --> 00:42:44,880 Speaker 1: There are times when you get the worst of them, 896 00:42:45,120 --> 00:42:48,320 Speaker 1: you know, a mob of people coming after you, and 897 00:42:48,480 --> 00:42:51,759 Speaker 1: they're, you know, targeting you, your friends, your family, your coworkers, 898 00:42:52,360 --> 00:42:54,279 Speaker 1: and you just say, enough, I'm gonna put this away. 899 00:42:54,360 --> 00:42:55,960 Speaker 1: And if I can go through that, as 900 00:42:56,440 --> 00:42:58,680 Speaker 1: fortunate and privileged as I am, with as much 901 00:42:58,719 --> 00:43:00,759 Speaker 1: of a network as I have, and be 902 00:43:00,800 --> 00:43:03,799 Speaker 1: like, sometimes it's too much, then people who aren't as 903 00:43:03,880 --> 00:43:06,040 Speaker 1: lucky as I am and don't have that support network 904 00:43:06,080 --> 00:43:09,360 Speaker 1: behind them can easily be hounded off of this network. 905 00:43:09,480 --> 00:43:11,600 Speaker 1: And it's like, their speech matters too. And I care 906 00:43:11,640 --> 00:43:14,960 Speaker 1: a lot more about the people who are targeted, you know, 907 00:43:15,600 --> 00:43:17,680 Speaker 1: unfairly for harassment.
I care a lot more about their 908 00:43:17,719 --> 00:43:19,960 Speaker 1: free speech than the people who are saying horrible things 909 00:43:20,000 --> 00:43:22,760 Speaker 1: to them. We have been speaking with Anil Dash 910 00:43:23,080 --> 00:43:27,440 Speaker 1: of Fog Creek Software. If you enjoyed this conversation, be 911 00:43:27,520 --> 00:43:30,320 Speaker 1: sure to stick around for the podcast extras, where we 912 00:43:30,480 --> 00:43:33,799 Speaker 1: keep the tape running and continue to discuss all things technology. 913 00:43:34,400 --> 00:43:37,680 Speaker 1: Be sure to check out my daily column on Bloomberg 914 00:43:37,880 --> 00:43:42,440 Speaker 1: View dot com. We love your comments, feedback, and suggestions; 915 00:43:43,120 --> 00:43:46,839 Speaker 1: write to us at MIB podcast at Bloomberg dot net. 916 00:43:47,360 --> 00:43:50,319 Speaker 1: You can follow me on Twitter at Ritholtz. I'm 917 00:43:50,400 --> 00:43:53,400 Speaker 1: Barry Ritholtz. You're listening to Masters in Business on 918 00:43:53,480 --> 00:44:08,920 Speaker 1: Bloomberg Radio. Welcome to the podcast. Anil, thank you 919 00:44:09,040 --> 00:44:11,000 Speaker 1: for doing this; I appreciate you being so 920 00:44:11,200 --> 00:44:14,160 Speaker 1: generous with your time. Happy to be here. We 921 00:44:14,280 --> 00:44:19,080 Speaker 1: were talking during the break about the blocking 922 00:44:19,280 --> 00:44:23,600 Speaker 1: problem on Twitter, and you mentioned Twitter 923 00:44:23,680 --> 00:44:26,680 Speaker 1: has its own service for importing and exporting lists of blocks, 924 00:44:26,719 --> 00:44:28,960 Speaker 1: but it's not very well integrated.
There is a site 925 00:44:28,960 --> 00:44:32,480 Speaker 1: called Block Together that actually a bunch of activists made, 926 00:44:32,520 --> 00:44:35,160 Speaker 1: which gives you the tools to share a block list 927 00:44:35,239 --> 00:44:38,279 Speaker 1: with your friends, or to subscribe to theirs, and 928 00:44:38,600 --> 00:44:40,120 Speaker 1: it's really handy, because then you don't have 929 00:44:40,160 --> 00:44:42,200 Speaker 1: to individually block a whole community of people that are 930 00:44:42,239 --> 00:44:44,800 Speaker 1: trying to troll or harass. So, 931 00:44:45,480 --> 00:44:50,040 Speaker 1: I break the world into three categories of people, 932 00:44:50,600 --> 00:44:54,160 Speaker 1: and we'll keep this relatively PG. There are 933 00:44:54,200 --> 00:44:57,680 Speaker 1: the folks who are just simply misinformed, and I've learned 934 00:44:58,400 --> 00:45:02,320 Speaker 1: if you give somebody a fact that challenges their misunderstanding, 935 00:45:02,400 --> 00:45:06,200 Speaker 1: it doesn't help them. So what I always do 936 00:45:06,280 --> 00:45:10,560 Speaker 1: with that group of people is say, you should find a data 937 00:45:10,680 --> 00:45:15,800 Speaker 1: source, the BLS, and if you 938 00:45:16,000 --> 00:45:19,160 Speaker 1: can prove to me that what you're saying is correct, 939 00:45:19,239 --> 00:45:21,680 Speaker 1: I'll write a column on it. And because 940 00:45:21,719 --> 00:45:23,800 Speaker 1: if they follow me, I can DM them and 941 00:45:23,880 --> 00:45:26,360 Speaker 1: then subsequently delete the DM so they can't harangue 942 00:45:26,400 --> 00:45:29,719 Speaker 1: me on direct message, I'll give them not 943 00:45:29,840 --> 00:45:32,880 Speaker 1: so much a homework assignment, but: if you really believe 944 00:45:33,440 --> 00:45:38,000 Speaker 1: this steaming pile of nonsense you've shared...
I'm not asking 945 00:45:38,000 --> 00:45:42,239 Speaker 1: you to create a persuasive argument, just send me your data source. Right. 946 00:45:42,719 --> 00:45:46,560 Speaker 1: So that's the sort of gentle nudge. 947 00:45:47,280 --> 00:45:50,880 Speaker 1: The next level is people who are just kind of 948 00:45:51,160 --> 00:45:55,480 Speaker 1: ideologically broken. And what I mean by that is, 949 00:45:55,760 --> 00:46:01,439 Speaker 1: beyond not having any data, any data 950 00:46:01,480 --> 00:46:06,640 Speaker 1: that contradicts their beliefs or their ideology just produces pure cognitive dissonance, right? 951 00:46:07,120 --> 00:46:09,400 Speaker 1: And they're not venal, not 952 00:46:09,480 --> 00:46:15,839 Speaker 1: vicious, just philosophically askew, on top of not being 953 00:46:15,920 --> 00:46:20,680 Speaker 1: evidence-based. And so, you know, 954 00:46:20,719 --> 00:46:22,920 Speaker 1: I'll look at something and I'll go down 955 00:46:22,960 --> 00:46:25,279 Speaker 1: the rabbit hole, and it's like, oh, okay, I know 956 00:46:25,440 --> 00:46:28,719 Speaker 1: this: all roads lead to Breitbart 957 00:46:28,800 --> 00:46:31,400 Speaker 1: and Drudge and a bunch of others. And there's some versions 958 00:46:31,440 --> 00:46:33,560 Speaker 1: of that on the left as well. Some of the 959 00:46:33,760 --> 00:46:38,160 Speaker 1: early... I'm trying to remember the sites. Salon could 960 00:46:38,160 --> 00:46:40,160 Speaker 1: be a little over the top, and then there's another 961 00:46:40,239 --> 00:46:42,560 Speaker 1: one on the left. And, I mean, I think they 962 00:46:42,600 --> 00:46:46,640 Speaker 1: can be wrong or exaggerated, but they very seldom inspire 963 00:46:46,680 --> 00:46:49,400 Speaker 1: a mob to go target people, that's for sure. But 964 00:46:50,080 --> 00:46:53,440 Speaker 1: as somebody who works in finance,
I can't afford to 965 00:46:53,520 --> 00:46:57,720 Speaker 1: have a stream of bad data, bad memes, and misinformation and myths. 966 00:46:57,960 --> 00:46:59,719 Speaker 1: So I just want to cut it off, and those people get 967 00:46:59,800 --> 00:47:04,000 Speaker 1: muted. But the aggressive ones, like, "I'm wrong and I'm 968 00:47:04,040 --> 00:47:07,400 Speaker 1: gonna be aggressive about it, and obnoxious about it, 969 00:47:07,440 --> 00:47:10,200 Speaker 1: and offensive about it," I'm sorry, those folks have to 970 00:47:10,280 --> 00:47:13,680 Speaker 1: be blocked. I didn't know Block Together even existed. My 971 00:47:13,840 --> 00:47:17,560 Speaker 1: only concern is, how do I not... So I've noticed 972 00:47:17,640 --> 00:47:21,279 Speaker 1: some crazy Trumpers lately; at one point in time, 973 00:47:21,320 --> 00:47:23,839 Speaker 1: it was other crazies. But I don't mean people who 974 00:47:23,880 --> 00:47:26,400 Speaker 1: are pro-Trump or anti-Trump. I have friends in 975 00:47:26,440 --> 00:47:29,640 Speaker 1: the Trump administration; Anthony Scaramucci is a buddy of mine. 976 00:47:29,920 --> 00:47:34,120 Speaker 1: We've had very civil debates about Trump. Someone in my 977 00:47:34,200 --> 00:47:36,600 Speaker 1: office is a Trump supporter; she and I have had 978 00:47:37,080 --> 00:47:43,920 Speaker 1: very rational adult discussions. But online, something tribal 979 00:47:44,040 --> 00:47:48,120 Speaker 1: and partisan takes over and people lose their minds. And I'll 980 00:47:48,120 --> 00:47:51,600 Speaker 1: give you a perfect example. This weekend, I discovered that 981 00:47:51,800 --> 00:47:56,040 Speaker 1: Jenna Jameson, the former adult film star, is kind of 982 00:47:56,200 --> 00:48:01,920 Speaker 1: like a wild... just, listen, there is some extreme 983 00:48:01,960 --> 00:48:05,200 Speaker 1: behavior in her history.
But I never expected that 984 00:48:05,320 --> 00:48:10,839 Speaker 1: to tip into wild ideological territory, and it was just kind 985 00:48:10,880 --> 00:48:14,120 Speaker 1: of random that I found she's amplified white supremacists. I mean, 986 00:48:14,160 --> 00:48:18,759 Speaker 1: yeah, it's sort of surprising, you know. So, 987 00:48:18,920 --> 00:48:22,160 Speaker 1: how do you block the people that 988 00:48:22,480 --> 00:48:25,200 Speaker 1: you want to block, yet at the same time not 989 00:48:25,520 --> 00:48:29,600 Speaker 1: block... So, I can't say it on the radio, 990 00:48:29,719 --> 00:48:34,000 Speaker 1: but there was a science thing about something happening with 991 00:48:34,080 --> 00:48:37,279 Speaker 1: one of the gas giants in outer space, about a 992 00:48:37,360 --> 00:48:40,960 Speaker 1: subsequent probe. I won't even go there. And she just retweeted, 993 00:48:41,000 --> 00:48:44,080 Speaker 1: "at least buy me dinner first." And I just found 994 00:48:44,120 --> 00:48:47,279 Speaker 1: that hilarious, because I was looking for something science-related. 995 00:48:48,239 --> 00:48:52,920 Speaker 1: Phil Plait, who does Bad Astronomy? Exactly. And 996 00:48:53,239 --> 00:48:55,239 Speaker 1: that led to this, led to that. So I 997 00:48:55,320 --> 00:48:57,640 Speaker 1: don't want to lose that; the beauty of the web is the 998 00:48:57,840 --> 00:49:01,600 Speaker 1: random serendipity. Yeah, absolutely. So how do you block people 999 00:49:01,880 --> 00:49:07,680 Speaker 1: on a list like Block Together and not lose 1000 00:49:07,800 --> 00:49:10,560 Speaker 1: that randomness? You know, there are going to 1001 00:49:10,600 --> 00:49:12,720 Speaker 1: be some false positives, but that already happens in email, 1002 00:49:12,760 --> 00:49:14,759 Speaker 1: that already happens in everything, right? I mean, but you 1003 00:49:14,800 --> 00:49:17,120 Speaker 1: can at least check your spam folder.
But yeah, but it's 1004 00:49:17,120 --> 00:49:18,560 Speaker 1: the same thing. Like, if you really... I mean, you 1005 00:49:18,640 --> 00:49:21,560 Speaker 1: know somebody... exactly, you can just go and undo 1006 00:49:21,600 --> 00:49:23,120 Speaker 1: it. Like, if you're like, I really want to see 1007 00:49:23,120 --> 00:49:24,480 Speaker 1: this tweet, but I have this person blocked, you just 1008 00:49:24,520 --> 00:49:27,360 Speaker 1: go unblock them. There's nothing to it. I think what 1009 00:49:27,719 --> 00:49:29,560 Speaker 1: is key is being able to use the service 1010 00:49:29,600 --> 00:49:32,680 Speaker 1: where, like, it's not some huge barrier, right? 1011 00:49:32,719 --> 00:49:34,520 Speaker 1: If you block somebody, they can still see your tweets 1012 00:49:34,520 --> 00:49:37,600 Speaker 1: if they want, they just don't. Yeah, absolutely. So it's 1013 00:49:37,640 --> 00:49:40,879 Speaker 1: not like this is some impermeable barrier. All it's doing 1014 00:49:40,960 --> 00:49:44,520 Speaker 1: is making it harder for a mob of harassers to 1015 00:49:44,640 --> 00:49:47,160 Speaker 1: target you. And that's really useful. Like, for me, it's 1016 00:49:47,160 --> 00:49:50,440 Speaker 1: like, if I'm, whatever, out with my kid 1017 00:49:50,560 --> 00:49:52,320 Speaker 1: and, you know, we're doing fun stuff on the weekend, 1018 00:49:52,719 --> 00:49:54,680 Speaker 1: and it happens to be the moment when, like, a 1019 00:49:54,760 --> 00:49:56,800 Speaker 1: mob of white supremacists has decided they want to 1020 00:49:56,840 --> 00:49:59,000 Speaker 1: come after me for something I wrote. And the thing 1021 00:49:59,120 --> 00:50:00,919 Speaker 1: is, like, they'll go back through your history. It could 1022 00:50:00,920 --> 00:50:03,680 Speaker 1: be something I wrote years ago.
Um, and I'm getting 1023 00:50:03,760 --> 00:50:06,120 Speaker 1: notifications on my phone while I'm out with my kid, 1024 00:50:06,320 --> 00:50:08,359 Speaker 1: and I'm like... yeah, exactly. Like, I don't want 1025 00:50:08,400 --> 00:50:10,520 Speaker 1: to be distracted by this. If I can just go 1026 00:50:10,760 --> 00:50:13,040 Speaker 1: and find somebody who's already got a good block list 1027 00:50:13,480 --> 00:50:15,000 Speaker 1: and just share it and be like, okay, good, I've 1028 00:50:15,040 --> 00:50:17,600 Speaker 1: been able to sort of cut this off, that's... it's 1029 00:50:17,640 --> 00:50:19,600 Speaker 1: a no-brainer. Like... can you search for a 1030 00:50:19,680 --> 00:50:24,040 Speaker 1: block list by a specific... It's block lists with this person, 1031 00:50:24,120 --> 00:50:26,120 Speaker 1: and then they'll share it. It's just by person. Yeah, so 1032 00:50:26,200 --> 00:50:28,160 Speaker 1: you go by, like, an individual Twitter user who you 1033 00:50:28,239 --> 00:50:29,800 Speaker 1: trust and use theirs, you know what I mean? The opposite: 1034 00:50:29,840 --> 00:50:33,319 Speaker 1: I want to block Jimmy Dean and then see who else 1035 00:50:33,440 --> 00:50:35,560 Speaker 1: is blocked? I don't think so. And 1036 00:50:35,600 --> 00:50:37,600 Speaker 1: they've been very thoughtful about it, where they don't 1037 00:50:37,680 --> 00:50:40,080 Speaker 1: encourage sort of willy-nilly blocking. Like, it's really about 1038 00:50:40,160 --> 00:50:42,239 Speaker 1: sort of sharing with the community and 1039 00:50:42,360 --> 00:50:44,440 Speaker 1: people having human judgment involved in it. And so 1040 00:50:44,480 --> 00:50:46,960 Speaker 1: I think that's really good.
And you can, um, and 1041 00:50:47,040 --> 00:50:48,960 Speaker 1: you can also go and search for, like, people that 1042 00:50:49,120 --> 00:50:51,800 Speaker 1: you, um, follow who do have lists, and so, like, 1043 00:50:51,920 --> 00:50:54,040 Speaker 1: those things are very handy. I think Twitter is gonna 1044 00:50:54,040 --> 00:50:56,719 Speaker 1: evolve their tools too. But the key is that, like, um, 1045 00:50:57,360 --> 00:51:02,720 Speaker 1: there's a really valid use to blocking people 1046 00:51:03,239 --> 00:51:05,440 Speaker 1: that we learned from... Again, like, I almost think digital 1047 00:51:06,000 --> 00:51:08,680 Speaker 1: communities need to learn from physical communities, right? And there's 1048 00:51:08,680 --> 00:51:10,520 Speaker 1: a reason why we sort of say, like, you can't 1049 00:51:10,560 --> 00:51:12,479 Speaker 1: come into this coffee shop if you're going to shout 1050 00:51:12,560 --> 00:51:14,760 Speaker 1: at people, and you can't come into this lobby 1051 00:51:14,800 --> 00:51:18,000 Speaker 1: if you're gonna be acting this way, or a crowded theater. 1052 00:51:19,280 --> 00:51:22,600 Speaker 1: And so being able to have analogous tools for sort 1053 00:51:22,600 --> 00:51:26,080 Speaker 1: of just limiting really, really antisocial behavior, I think 1054 00:51:26,080 --> 00:51:27,719 Speaker 1: is really useful, and I'm glad that, like, the tools 1055 00:51:27,719 --> 00:51:29,359 Speaker 1: are starting to evolve. I think it's a shame they're 1056 00:51:29,360 --> 00:51:33,000 Speaker 1: happening by, like, activists self-funding themselves and building them, as opposed 1057 00:51:33,000 --> 00:51:35,680 Speaker 1: to the platforms building them in. So are we just going 1058 00:51:35,760 --> 00:51:41,040 Speaker 1: to end up with ideologically opposed left blocklists and right 1059 00:51:41,120 --> 00:51:44,520 Speaker 1: blocklists, or is the unifying factor really 1060 00:51:44,600 --> 00:51:46,680 Speaker 1: antisocial behavior?
I think it's antisocial behavior. There are 1061 00:51:46,719 --> 00:51:48,400 Speaker 1: people that do that, of course, that, like, ignore 1062 00:51:48,400 --> 00:51:51,000 Speaker 1: anybody who disagrees with them politically, but they were already 1063 00:51:51,080 --> 00:51:52,880 Speaker 1: doing that. Like, they don't need software to do that. 1064 00:51:52,880 --> 00:51:54,360 Speaker 1: If you're the kind of person who can't deal with 1065 00:51:54,880 --> 00:51:58,480 Speaker 1: dissenting ideas, the block was not the issue. There's an interesting 1066 00:51:58,520 --> 00:52:00,799 Speaker 1: thing that's happening rhetorically now, where I'm like, I don't 1067 00:52:00,840 --> 00:52:03,920 Speaker 1: want to interact with, for example, like, white supremacists online, 1068 00:52:05,320 --> 00:52:08,000 Speaker 1: go figure, right? And they'll come back to me 1069 00:52:08,120 --> 00:52:11,560 Speaker 1: and be like, oh, you can't handle political dissent, snowflake. I 1070 00:52:11,640 --> 00:52:15,600 Speaker 1: love the expression snowflake, yeah, because it's all massive projection; 1071 00:52:15,640 --> 00:52:20,279 Speaker 1: they're the ones who melt at the slightest challenge. And I'm 1072 00:52:20,320 --> 00:52:22,960 Speaker 1: just like, you know what? Like, I'm tough 1073 00:52:23,000 --> 00:52:25,239 Speaker 1: as nails. I have no question about that. Like, I 1074 00:52:25,320 --> 00:52:26,759 Speaker 1: know what I've been through, and I know, you know, 1075 00:52:26,800 --> 00:52:29,399 Speaker 1: what I'm able to do. I'm just like, why would 1076 00:52:29,400 --> 00:52:31,520 Speaker 1: I want to deal with you? This is, like, this 1077 00:52:31,640 --> 00:52:33,800 Speaker 1: is an elective thing. It's about me having good judgment 1078 00:52:33,880 --> 00:52:36,759 Speaker 1: and discernment, where I'm like, you're a person who 1079 00:52:36,800 --> 00:52:40,440 Speaker 1: acts obnoxious online all day?
Why would I put up with that? Like, 1080 00:52:40,719 --> 00:52:43,360 Speaker 1: I'm not... that's not the engagement I'm looking for. 1081 00:52:43,440 --> 00:52:45,279 Speaker 1: Like, if you think that the only way that your 1082 00:52:45,320 --> 00:52:48,040 Speaker 1: ideas can be represented in culture is by you acting 1083 00:52:48,080 --> 00:52:50,480 Speaker 1: like a monster all day, lighting it on fire, then 1084 00:52:52,320 --> 00:52:55,120 Speaker 1: yours probably aren't that good, you know. And it's like, 1085 00:52:55,480 --> 00:52:57,799 Speaker 1: it's not that I'm not intellectually curious. I know I am. 1086 00:52:57,840 --> 00:53:01,240 Speaker 1: I'm very interested in having my opinions challenged and learning 1087 00:53:01,280 --> 00:53:04,160 Speaker 1: things. Like, I am very much... I love that 1088 00:53:04,280 --> 00:53:06,439 Speaker 1: idea of, like, I had to change my mind because 1089 00:53:06,480 --> 00:53:08,040 Speaker 1: I was wrong about the way I thought about this thing. 1090 00:53:08,120 --> 00:53:11,120 Speaker 1: Like, I love that feeling. Uh, it doesn't come from 1091 00:53:11,200 --> 00:53:14,759 Speaker 1: somebody, like, screaming epithets. Exactly. Like, the 1092 00:53:14,840 --> 00:53:17,800 Speaker 1: fact that, like, my family is multiracial is wrong? Like, 1093 00:53:17,880 --> 00:53:20,000 Speaker 1: that is never going to be the thing that makes me go, wow, 1094 00:53:20,040 --> 00:53:21,279 Speaker 1: I've seen the light. You know, we used to have 1095 00:53:21,440 --> 00:53:23,480 Speaker 1: laws against that sort of stuff. That's the good 1096 00:53:23,520 --> 00:53:26,759 Speaker 1: old days. It's, uh, it's amazing. You know, when 1097 00:53:26,800 --> 00:53:30,600 Speaker 1: I get the request through a colleague, hey, you blocked 1098 00:53:30,719 --> 00:53:35,080 Speaker 1: so-and-so.
Now the process is, I'll go look 1099 00:53:35,120 --> 00:53:38,000 Speaker 1: at not only their tweet stream, but their tweets 1100 00:53:38,040 --> 00:53:40,120 Speaker 1: and replies, because I want to see how they're interacting. 1101 00:53:40,200 --> 00:53:42,400 Speaker 1: That's right. And I say to people, listen, life is 1102 00:53:42,480 --> 00:53:47,120 Speaker 1: too short. You're not a good person. There are eight billion people, 1103 00:53:47,640 --> 00:53:50,520 Speaker 1: most of whom are half decent and well intentioned. Some 1104 00:53:50,719 --> 00:53:54,920 Speaker 1: may be misguided. But I don't need venal jerks, and 1105 00:53:55,280 --> 00:53:57,520 Speaker 1: nobody needs... no. Life is too short, and I got 1106 00:53:57,560 --> 00:53:59,400 Speaker 1: too much left to learn to spend my time fighting with 1107 00:53:59,440 --> 00:54:01,799 Speaker 1: strangers who really want to fight more than they want 1108 00:54:01,800 --> 00:54:03,520 Speaker 1: to learn. Right. That's right. You know, one of the 1109 00:54:03,560 --> 00:54:06,880 Speaker 1: beauties of going to law school is moot court. 1110 00:54:07,320 --> 00:54:09,960 Speaker 1: And the best part about moot court is you have 1111 00:54:10,160 --> 00:54:13,200 Speaker 1: to be able to switch hats and argue either side 1112 00:54:13,280 --> 00:54:17,239 Speaker 1: of any litigation. Literally mid-case, you could be told, all 1113 00:54:17,320 --> 00:54:21,200 Speaker 1: right, now switch. So I've always taken that as 1114 00:54:21,239 --> 00:54:25,399 Speaker 1: a sign of intellectual openness.
Having the ability to see 1115 00:54:25,440 --> 00:54:27,160 Speaker 1: all sides of an issue, of a problem, what have 1116 00:54:27,320 --> 00:54:33,920 Speaker 1: you, prevents you from, uh, turning your opponents into 1117 00:54:34,400 --> 00:54:41,359 Speaker 1: something subhuman. It keeps the discussion rational, because, hey, 1118 00:54:41,520 --> 00:54:44,319 Speaker 1: at least in court, things 1119 00:54:44,400 --> 00:54:47,680 Speaker 1: that are really one-sided, those cases settle. But where 1120 00:54:47,680 --> 00:54:51,279 Speaker 1: there's a legitimate debate, let's have the debate. And unfortunately, 1121 00:54:51,360 --> 00:54:55,080 Speaker 1: too many people, uh, just can't imagine the other side 1122 00:54:55,239 --> 00:54:57,560 Speaker 1: of the discussion. And that's amazing. Yeah, and, you know, 1123 00:54:57,640 --> 00:54:59,960 Speaker 1: my mother's family is all lawyers, and I was 1124 00:55:00,040 --> 00:55:04,040 Speaker 1: sort of, you know, raised... Well, you know, it's interesting, 1125 00:55:04,120 --> 00:55:08,800 Speaker 1: they were both, um, criminal defense lawyers, and 1126 00:55:08,880 --> 00:55:11,880 Speaker 1: then, um, my great-grandfather was involved in the Indian 1127 00:55:11,920 --> 00:55:14,040 Speaker 1: independence movement. He marched with Gandhi and was sort of 1128 00:55:14,200 --> 00:55:19,440 Speaker 1: very involved in civil rights and social justice. And, you 1129 00:55:19,520 --> 00:55:21,799 Speaker 1: know, what's interesting is they had a printing press at 1130 00:55:21,840 --> 00:55:24,440 Speaker 1: the house, and so there is a tradition of, like, 1131 00:55:24,520 --> 00:55:30,520 Speaker 1: how do you use your own, you know, platform. Exactly, that idea.
Right. Yeah, 1132 00:55:30,600 --> 00:55:32,480 Speaker 1: so, you know, a century ago, that was cutting-edge 1133 00:55:32,520 --> 00:55:35,239 Speaker 1: technology to be able to have in a rural, very, 1134 00:55:35,320 --> 00:55:37,719 Speaker 1: very poor part of India. And so, yeah, I think 1135 00:55:37,760 --> 00:55:40,800 Speaker 1: there is this idea of how do you debate ideas, 1136 00:55:41,040 --> 00:55:43,799 Speaker 1: using cutting-edge platforms to get your ideas out there 1137 00:55:44,280 --> 00:55:47,120 Speaker 1: and advancing the cause of social justice. And I think 1138 00:55:47,160 --> 00:55:49,520 Speaker 1: those are things that, you know, you don't think about 1139 00:55:49,520 --> 00:55:51,320 Speaker 1: consciously as a kid, but they sort of seep 1140 00:55:51,600 --> 00:55:54,080 Speaker 1: into your mind. And so there was always a 1141 00:55:54,120 --> 00:55:56,239 Speaker 1: healthy debate about how to do these things. 1142 00:55:56,280 --> 00:55:58,719 Speaker 1: I mean, you know, and this 1143 00:55:58,880 --> 00:56:01,279 Speaker 1: is still true today: civil rights movements always have these 1144 00:56:01,400 --> 00:56:04,360 Speaker 1: big schisms within them, of, like, how radical do 1145 00:56:04,440 --> 00:56:05,880 Speaker 1: we want to be? And what's the right way to 1146 00:56:05,960 --> 00:56:08,240 Speaker 1: approach this? And, you know, do we change the system 1147 00:56:08,320 --> 00:56:10,480 Speaker 1: from the inside, or do we, you know, try to 1148 00:56:10,640 --> 00:56:12,880 Speaker 1: tear it down from the outside? And those kinds of debates, 1149 00:56:13,200 --> 00:56:15,839 Speaker 1: I think those are fascinating and timeless debates, and I'm 1150 00:56:15,840 --> 00:56:18,080 Speaker 1: always happy to engage in those.
The people who are 1151 00:56:18,160 --> 00:56:21,520 Speaker 1: like, I want to personally, you know, hurt you, 1152 00:56:21,760 --> 00:56:24,800 Speaker 1: or I want to attack you, and 1153 00:56:25,239 --> 00:56:26,640 Speaker 1: that's the only way to advance my ideas... I'm like, 1154 00:56:26,880 --> 00:56:29,120 Speaker 1: that's never... it's never going to be the thing that 1155 00:56:29,360 --> 00:56:31,319 Speaker 1: persuades me or that makes me see the light. 1156 00:56:31,680 --> 00:56:34,720 Speaker 1: And it certainly doesn't indicate much confidence in that person's 1157 00:56:34,760 --> 00:56:37,080 Speaker 1: argument anyway. You know, I've gone back and looked at 1158 00:56:37,160 --> 00:56:40,279 Speaker 1: some old blog posts that I could see were written out 1159 00:56:40,360 --> 00:56:43,000 Speaker 1: of frustration, where, I don't want to say I just 1160 00:56:43,160 --> 00:56:47,000 Speaker 1: called other people idiots, but I would look at their 1161 00:56:47,080 --> 00:56:51,280 Speaker 1: position and first try and take it apart, and somewhere 1162 00:56:51,280 --> 00:56:53,120 Speaker 1: in the middle a little bit of name-calling 1163 00:56:53,160 --> 00:56:55,560 Speaker 1: would sneak in, and then you catch yourself and move 1164 00:56:55,640 --> 00:56:59,640 Speaker 1: away from it. How have you... you, more than any 1165 00:56:59,719 --> 00:57:03,279 Speaker 1: person I know, have been at the vanguard 1166 00:57:03,560 --> 00:57:07,640 Speaker 1: of seeing the arc of the blogosphere change over time. 1167 00:57:07,719 --> 00:57:11,600 Speaker 1: What have you noticed? Um, how has this evolved? 1168 00:57:12,960 --> 00:57:16,480 Speaker 1: Do you plan on continuing blogging forever? What 1169 00:57:16,840 --> 00:57:19,280 Speaker 1: are your thoughts? Yeah, you know, I've learned a lot.
1170 00:57:19,320 --> 00:57:22,160 Speaker 1: I think, um, what I definitely see 1171 00:57:22,200 --> 00:57:25,040 Speaker 1: when I look back at eighteen years of writing now, 1172 00:57:25,800 --> 00:57:30,439 Speaker 1: um, is that it is always this sort of inkblot 1173 00:57:30,480 --> 00:57:33,200 Speaker 1: test about where I was at in my head. You go 1174 00:57:33,280 --> 00:57:34,880 Speaker 1: back and read it, and I can see exactly how 1175 00:57:34,920 --> 00:57:36,240 Speaker 1: I was feeling at the moment, even if I don't 1176 00:57:36,280 --> 00:57:39,360 Speaker 1: remember writing it. Shocking... wow, I was having a bad 1177 00:57:39,400 --> 00:57:41,520 Speaker 1: day that day, you know, or, wow, I was really 1178 00:57:41,560 --> 00:57:44,320 Speaker 1: in a good mood. And you can really 1179 00:57:44,440 --> 00:57:47,160 Speaker 1: read that into, you know, the work. I think that 1180 00:57:47,360 --> 00:57:49,920 Speaker 1: that's been really instructive, because, like, I don't keep a, 1181 00:57:50,120 --> 00:57:52,520 Speaker 1: like, a mood journal, and so I could be writing 1182 00:57:52,560 --> 00:57:54,760 Speaker 1: about, like, whatever, a new update to Windows came out, 1183 00:57:54,800 --> 00:57:56,200 Speaker 1: when I used to blog about tech a lot, and 1184 00:57:56,200 --> 00:57:59,000 Speaker 1: I could still tell you exactly how I felt 1185 00:57:59,080 --> 00:58:00,800 Speaker 1: when I read it. I think that's really instructive. I 1186 00:58:00,840 --> 00:58:04,160 Speaker 1: think my attitude about dealing with, you know, people being 1187 00:58:04,200 --> 00:58:07,440 Speaker 1: aggressive or hostile online has changed a lot over 1188 00:58:07,520 --> 00:58:11,840 Speaker 1: the years. I think initially, um, you know, I think 1189 00:58:11,880 --> 00:58:14,880 Speaker 1: your first reaction is like, well, you know, screw you too, buddy, 1190 00:58:14,920 --> 00:58:17,080 Speaker 1: and you can sort of go at them.
I think 1191 00:58:17,160 --> 00:58:18,919 Speaker 1: I spent a long time trying to be like, I'm 1192 00:58:18,960 --> 00:58:21,480 Speaker 1: just gonna, you know, love you to death and that'll 1193 00:58:21,520 --> 00:58:23,960 Speaker 1: get you to change. And I had some successes 1194 00:58:24,000 --> 00:58:28,000 Speaker 1: with that. I mean, I have actually seen conversations where 1195 00:58:28,360 --> 00:58:30,560 Speaker 1: I changed somebody's mind or they changed my mind, or 1196 00:58:30,560 --> 00:58:33,200 Speaker 1: at least you get them to back down from the 1197 00:58:33,360 --> 00:58:37,840 Speaker 1: sort of over-the-top... I think, people's initial emails... like, it 1198 00:58:37,920 --> 00:58:40,640 Speaker 1: would be great if email had, like, a sixty-minute delay, 1199 00:58:41,040 --> 00:58:44,320 Speaker 1: like you can't respond instantly. So every now and then 1200 00:58:44,480 --> 00:58:48,080 Speaker 1: someone sends an email, and if you give them a 1201 00:58:48,200 --> 00:58:53,000 Speaker 1: big friendly hug and say, like, you're really upset, or, 1202 00:58:53,320 --> 00:58:56,000 Speaker 1: hey, I hope you're not missing the key point here, 1203 00:58:56,360 --> 00:58:58,360 Speaker 1: here's my data source, what are you using to reach 1204 00:58:58,440 --> 00:59:02,320 Speaker 1: this conclusion? As often as not, especially in 1205 00:59:02,360 --> 00:59:06,440 Speaker 1: the professional community... so, my universe is finance... when someone gets 1206 00:59:06,680 --> 00:59:09,480 Speaker 1: it, it's not even that they back down. It's like, I've gotten, hey, 1207 00:59:09,480 --> 00:59:12,920 Speaker 1: I appreciate you responding civilly, I was out of line. And you say, oh, 1208 00:59:13,040 --> 00:59:16,520 Speaker 1: these people are really professional and smart. He was just 1209 00:59:16,760 --> 00:59:19,320 Speaker 1: perturbed at something. Yeah, it's... yeah.
It could be whatever... 1210 00:59:19,440 --> 00:59:21,960 Speaker 1: they, you know, they didn't eat lunch that day, or 1211 00:59:22,000 --> 00:59:24,760 Speaker 1: they have something else going on. What about the 1212 00:59:24,960 --> 00:59:28,880 Speaker 1: change in media? How has the blogosphere impacted that, in 1213 00:59:28,960 --> 00:59:31,920 Speaker 1: your perspective? It's interesting. I was such an 1214 00:59:31,960 --> 00:59:34,920 Speaker 1: idealist about blogging and social media when it came out. 1215 00:59:34,920 --> 00:59:36,480 Speaker 1: I was like, we're gonna build platforms that are going 1216 00:59:36,520 --> 00:59:38,600 Speaker 1: to give voice to people that don't have any other 1217 00:59:38,600 --> 00:59:40,080 Speaker 1: place to share their words. And that was true. That 1218 00:59:40,160 --> 00:59:43,680 Speaker 1: did happen. The professional amateurs... exactly... who are no longer 1219 00:59:43,720 --> 00:59:50,360 Speaker 1: amateurs; now they're professional experts, expert professionals, right. And I'm 1220 00:59:50,480 --> 00:59:52,720 Speaker 1: exhibit A. Yeah, and, you know, I've been very lucky 1221 00:59:52,800 --> 00:59:55,080 Speaker 1: to benefit from that too. And I think that was true. 1222 00:59:55,960 --> 00:59:58,440 Speaker 1: And we kept saying, you know, whatever, the 1223 00:59:58,480 --> 01:00:00,760 Speaker 1: mainstream media makes all these mistakes, and now social 1224 01:00:00,800 --> 01:00:04,920 Speaker 1: media will help correct it. And then we ignored, or 1225 01:00:04,960 --> 01:00:08,520 Speaker 1: didn't anticipate, the exact opposite, which was the times when 1226 01:00:08,520 --> 01:00:10,800 Speaker 1: mainstream media was exactly right, and people would use social 1227 01:00:10,840 --> 01:00:15,840 Speaker 1: media to spread disinformation.
Yeah, and the idea that, like, um, 1228 01:00:16,120 --> 01:00:18,840 Speaker 1: the people who didn't have access to get their voice 1229 01:00:18,840 --> 01:00:21,480 Speaker 1: out there, some of them would not use that 1230 01:00:21,600 --> 01:00:25,880 Speaker 1: power responsibly... I was such an optimist and idealist, 1231 01:00:25,920 --> 01:00:27,840 Speaker 1: of, like, wow, if we just give everybody a printing press, 1232 01:00:27,880 --> 01:00:29,840 Speaker 1: all they're going to print is good, thoughtful, true things. 1233 01:00:30,760 --> 01:00:33,000 Speaker 1: And, um, you know, that's not the case. And it 1234 01:00:33,120 --> 01:00:36,760 Speaker 1: was a long, slow, painful lesson that, I know, 1235 01:00:36,960 --> 01:00:39,160 Speaker 1: I personally took too long to learn. What's the old 1236 01:00:39,240 --> 01:00:41,640 Speaker 1: quote? A lie is halfway around the world while the 1237 01:00:41,680 --> 01:00:44,400 Speaker 1: truth is still tying its boots. Yeah, yeah. And 1238 01:00:45,680 --> 01:00:48,800 Speaker 1: I think that was very naive on my part. And 1239 01:00:49,040 --> 01:00:50,720 Speaker 1: the amazing thing about it is, I felt like I 1240 01:00:50,840 --> 01:00:53,240 Speaker 1: was slow to get that lesson and slow to 1241 01:00:53,360 --> 01:00:55,000 Speaker 1: really build it into my work and the 1242 01:00:55,160 --> 01:00:58,200 Speaker 1: tools and the platforms I was creating. And yet I 1243 01:00:58,280 --> 01:01:00,480 Speaker 1: think it still took even longer for the people that 1244 01:01:00,600 --> 01:01:02,920 Speaker 1: made the Facebooks and Twitters of the world to get 1245 01:01:02,960 --> 01:01:06,320 Speaker 1: that lesson.
And not only do they have to recognize that, 1246 01:01:06,400 --> 01:01:10,040 Speaker 1: they have to then recognize that it's an existential threat 1247 01:01:10,120 --> 01:01:13,600 Speaker 1: to their platform, and then create the tech to respond 1248 01:01:13,600 --> 01:01:16,480 Speaker 1: to it. Well, and I was lucky in that we 1249 01:01:16,560 --> 01:01:19,120 Speaker 1: had built a product, when we built blogging tools, that 1250 01:01:19,440 --> 01:01:23,440 Speaker 1: people paid for. You paid for it, right. You know, and so, um, 1251 01:01:23,880 --> 01:01:25,960 Speaker 1: if you wanted to have a voice on a platform, 1252 01:01:26,080 --> 01:01:27,880 Speaker 1: you were going to directly support it and you 1253 01:01:27,960 --> 01:01:31,760 Speaker 1: had an investment in it. And so we weren't based 1254 01:01:31,800 --> 01:01:34,000 Speaker 1: on attention, we were not based on the ad model, 1255 01:01:34,800 --> 01:01:37,440 Speaker 1: and, um, so you didn't have to be outrageous, you 1256 01:01:37,440 --> 01:01:39,240 Speaker 1: didn't have to jump up and scream, you didn't have 1257 01:01:39,360 --> 01:01:42,440 Speaker 1: to have clickbait headlines.
Well, and you won't believe what's on 1258 01:01:42,560 --> 01:01:44,760 Speaker 1: slide three, right. But even if you're going to do that, 1259 01:01:44,800 --> 01:01:47,240 Speaker 1: you were incentivized to build your audience on your terms, 1260 01:01:47,800 --> 01:01:50,720 Speaker 1: not just views at any cost, not page views 1261 01:01:50,760 --> 01:01:54,280 Speaker 1: at any cost. And everybody else basically went with the 1262 01:01:54,280 --> 01:01:58,560 Speaker 1: ad model, and I, you know, I was really adamantly 1263 01:01:58,560 --> 01:02:01,560 Speaker 1: against it. And I do think, you know, you 1264 01:02:01,640 --> 01:02:03,880 Speaker 1: see the sort of return of subscription models, right, at 1265 01:02:03,920 --> 01:02:06,320 Speaker 1: The New York Times and, like, everybody: we want 1266 01:02:06,360 --> 01:02:08,760 Speaker 1: you to subscribe and pay for great journalism, all these things. 1267 01:02:09,320 --> 01:02:12,520 Speaker 1: This is a reaction to seeing the distorting effect that 1268 01:02:12,600 --> 01:02:15,280 Speaker 1: the major ad models have on web media. 1269 01:02:16,000 --> 01:02:19,400 Speaker 1: And the Facebooks and Twitters and, you know, Instagrams 1270 01:02:19,440 --> 01:02:22,040 Speaker 1: of the world built models that were totally advertising-dependent, 1271 01:02:22,520 --> 01:02:27,720 Speaker 1: and so got hyper-exaggerated into attention-getting models. And 1272 01:02:27,800 --> 01:02:31,400 Speaker 1: I think the publishers didn't understand how 1273 01:02:31,480 --> 01:02:33,919 Speaker 1: much that would skew their business.
And it's interesting, because 1274 01:02:33,920 --> 01:02:36,000 Speaker 1: in print, you know, one, there were more 1275 01:02:36,000 --> 01:02:39,160 Speaker 1: ad dollars to go around; two, they were indirect models, right? 1276 01:02:39,240 --> 01:02:41,720 Speaker 1: So the fact that you had a ton of classified 1277 01:02:41,760 --> 01:02:44,240 Speaker 1: ads might encourage you to have a real estate section, 1278 01:02:44,640 --> 01:02:48,120 Speaker 1: but it didn't make your news headlines more extreme about 1279 01:02:48,240 --> 01:02:51,480 Speaker 1: politics or about whatever. Same thing with 1280 01:02:51,520 --> 01:02:53,520 Speaker 1: the automobile section. So you can go through each of the 1281 01:02:53,600 --> 01:02:56,040 Speaker 1: major sections of the Sunday New York Times, and they 1282 01:02:56,040 --> 01:02:59,360 Speaker 1: were advertising-driven. So you have the theater section, the 1283 01:02:59,520 --> 01:03:02,960 Speaker 1: Arts and Entertainment section, real estate, automobiles. I 1284 01:03:03,000 --> 01:03:05,160 Speaker 1: don't know if they still have an automobile section. They might, 1285 01:03:05,240 --> 01:03:07,200 Speaker 1: they might have killed it off. But even in that example, 1286 01:03:07,280 --> 01:03:10,040 Speaker 1: the existence of an automobile section was a 1287 01:03:10,080 --> 01:03:13,560 Speaker 1: sop to advertisers, obviously, but it didn't skew the hard 1288 01:03:13,640 --> 01:03:16,600 Speaker 1: news reporting, and so that was okay, because you're like, okay, whatever, 1289 01:03:16,640 --> 01:03:18,680 Speaker 1: you gotta pay the bills. Of course you have comics.
1290 01:03:18,720 --> 01:03:20,840 Speaker 1: You don't have comics because, like, this is 1291 01:03:20,840 --> 01:03:23,160 Speaker 1: important news for people. Like, this makes people read, and 1292 01:03:23,200 --> 01:03:25,840 Speaker 1: that gets the circulation up, and that, um, lets 1293 01:03:25,880 --> 01:03:29,200 Speaker 1: us do the hard news work. And the problem is 1294 01:03:29,360 --> 01:03:32,600 Speaker 1: that line goes away in web media. There isn't some, like, well, 1295 01:03:32,760 --> 01:03:34,720 Speaker 1: you're gonna come and read the car ads and that's 1296 01:03:34,760 --> 01:03:37,280 Speaker 1: going to support your reading the hard news. The hard 1297 01:03:37,320 --> 01:03:38,640 Speaker 1: news has to sell on its own, and so it 1298 01:03:38,680 --> 01:03:41,760 Speaker 1: gets more and more distorted and more and more exaggerated 1299 01:03:41,800 --> 01:03:43,560 Speaker 1: in order to get that attention, because you're in 1300 01:03:43,600 --> 01:03:45,440 Speaker 1: this more and more extreme environment. Now, that's a 1301 01:03:45,560 --> 01:03:48,120 Speaker 1: BuzzFeed or maybe a Vox or something like that, 1302 01:03:48,320 --> 01:03:50,920 Speaker 1: but that shouldn't be The Washington Post, The New 1303 01:03:51,000 --> 01:03:53,080 Speaker 1: York Times, The Wall Street Journal, all of them. There's 1304 01:03:53,120 --> 01:03:55,520 Speaker 1: no difference between those. Like, yeah, I mean, you look 1305 01:03:55,560 --> 01:03:58,320 Speaker 1: at, like, the best-reported stories on BuzzFeed; they're 1306 01:03:58,400 --> 01:04:01,000 Speaker 1: not different than the best-reported stories on the 1307 01:04:01,080 --> 01:04:04,000 Speaker 1: Times or the Post, like, when you do good journalism. 1308 01:04:04,240 --> 01:04:06,960 Speaker 1: But not necessarily the most-read stories. No, 1309 01:04:07,120 --> 01:04:10,560 Speaker 1: that's true.
And there are more, you know, cat pictures 1310 01:04:10,680 --> 01:04:14,000 Speaker 1: or lists that are subsidizing that journalism on the BuzzFeeds 1311 01:04:14,080 --> 01:04:16,959 Speaker 1: than there are on, you know, the other sites. 1312 01:04:17,040 --> 01:04:18,560 Speaker 1: But, like, that's just a question of 1313 01:04:18,600 --> 01:04:20,600 Speaker 1: how you subsidize. You know, in the case of the 1314 01:04:20,680 --> 01:04:23,640 Speaker 1: Post or the Times or whatever, they're subsidizing through print 1315 01:04:23,640 --> 01:04:26,720 Speaker 1: subscriptions that are going to go away as that audience 1316 01:04:27,200 --> 01:04:31,560 Speaker 1: ages out. I finally, I finally dropped everything but the 1317 01:04:31,640 --> 01:04:34,200 Speaker 1: weekend edition of The New York Times, because I would carry 1318 01:04:34,280 --> 01:04:37,200 Speaker 1: it to the office, carry it home, and it would go 1319 01:04:37,320 --> 01:04:40,320 Speaker 1: into the recycling. Then, yeah. And The Wall Street Journal 1320 01:04:40,440 --> 01:04:43,640 Speaker 1: comes to the office, and this way I don't have 1321 01:04:43,720 --> 01:04:45,720 Speaker 1: to take it home. It goes right into the recycling, right. 1322 01:04:46,520 --> 01:04:48,800 Speaker 1: The way the Journal has it, and I think the 1323 01:04:48,920 --> 01:04:52,080 Speaker 1: Times is the same, if you do a digital subscription, 1324 01:04:52,160 --> 01:04:55,680 Speaker 1: the print edition is free. Yeah. What? Well, and 1325 01:04:55,760 --> 01:04:57,560 Speaker 1: this is all about the metrics and the numbers, right, 1326 01:04:57,600 --> 01:04:59,880 Speaker 1: so the advertising... exactly. So they want to tell the advertiser, 1327 01:04:59,880 --> 01:05:02,360 Speaker 1: we have this many print subscribers, even though all those people 1328 01:05:02,360 --> 01:05:04,840 Speaker 1: are just, you know... When does that stop?
When does 1329 01:05:04,880 --> 01:05:08,280 Speaker 1: that go next? Ten years? Ten years? No more newspapers? Yeah, 1330 01:05:08,320 --> 01:05:11,080 Speaker 1: I mean, you know, for all practical purposes, there will 1331 01:05:11,120 --> 01:05:12,760 Speaker 1: be. It's the same thing with New York, where 1332 01:05:12,840 --> 01:05:17,520 Speaker 1: the digital subscription and the digital plus, um, print 1333 01:05:17,880 --> 01:05:20,640 Speaker 1: are the same price. So your initial reaction is, oh, 1334 01:05:20,720 --> 01:05:22,520 Speaker 1: well, this is more for the same price. I'll take that. 1335 01:05:22,960 --> 01:05:25,640 Speaker 1: Then you have a stack of unread magazines. It's pointless. 1336 01:05:25,720 --> 01:05:27,240 Speaker 1: Well, it's like, yeah, unless you're going to the beach 1337 01:05:27,320 --> 01:05:29,320 Speaker 1: or something, you're not, you're not looking at that print edition. 1338 01:05:29,360 --> 01:05:30,720 Speaker 1: And I see this all the time with people that, 1339 01:05:30,960 --> 01:05:33,600 Speaker 1: like, I've watched, that have print subscriptions, that are saying, okay, 1340 01:05:33,640 --> 01:05:35,920 Speaker 1: well, I found this article, but I'm gonna read it online. Now, 1341 01:05:36,000 --> 01:05:38,400 Speaker 1: I will tell you that when you thumb through a paper, 1342 01:05:38,440 --> 01:05:42,160 Speaker 1: you're gonna find things that you're not going to discover digitally. Yeah, 1343 01:05:42,240 --> 01:05:45,120 Speaker 1: other than the ink on your hands and, and 1344 01:05:45,280 --> 01:05:48,560 Speaker 1: the fact that you're carrying it around, there's, there's no question 1345 01:05:48,600 --> 01:05:53,200 Speaker 1: the discovery experience on print media is an 1346 01:05:53,280 --> 01:05:56,160 Speaker 1: interesting and unique thing, and you find stuff you wouldn't 1347 01:05:56,160 --> 01:05:59,280 Speaker 1: find otherwise on digital.
But that's also just a design challenge. 1348 01:05:59,560 --> 01:06:01,400 Speaker 1: Like, it's not impossible to solve these problems in the 1349 01:06:01,480 --> 01:06:04,040 Speaker 1: digital realm, they just haven't invested in doing so. So 1350 01:06:04,480 --> 01:06:06,920 Speaker 1: one of the things I've noticed in the arc of, 1351 01:06:07,200 --> 01:06:11,400 Speaker 1: um, of media has been how in the beginning, and 1352 01:06:11,480 --> 01:06:14,640 Speaker 1: I want to call that the mid two thousands, they 1353 01:06:14,760 --> 01:06:20,200 Speaker 1: felt very comfortable lifting anything from print, anything from the 1354 01:06:20,240 --> 01:06:26,080 Speaker 1: blogosphere without attribution, just relentlessly. And, and I started the 1355 01:06:26,320 --> 01:06:29,680 Speaker 1: read it here first: here's the blog post from Monday, 1356 01:06:29,760 --> 01:06:31,600 Speaker 1: and oh look, this is what the Wall Street Journal 1357 01:06:31,640 --> 01:06:34,360 Speaker 1: had on Thursday. I would do screenshots side by side. Yeah, 1358 01:06:34,360 --> 01:06:37,000 Speaker 1: I definitely had that with, with stuff I read about tech. Right, 1359 01:06:37,040 --> 01:06:38,720 Speaker 1: and then, then you send it to the editor, 1360 01:06:38,760 --> 01:06:40,640 Speaker 1: saying, I want you to know that I'm going to 1361 01:06:40,760 --> 01:06:43,040 Speaker 1: keep doing this, and you're free to call me. I 1362 01:06:43,160 --> 01:06:45,479 Speaker 1: have a professional title where I'm quoted in the media 1363 01:06:45,520 --> 01:06:49,160 Speaker 1: all the time. Don't, don't rip this off, and if 1364 01:06:49,200 --> 01:06:51,400 Speaker 1: you do this more blatantly, I'm going to sic the 1365 01:06:51,480 --> 01:06:53,920 Speaker 1: lawyers on you. At a certain point, you cross the 1366 01:06:54,000 --> 01:06:59,520 Speaker 1: line from inspired by to copyright infringement, and there was 1367 01:06:59,600 --> 01:07:01,680 Speaker 1: a lot of that.
My favorite was a thing I did on 1368 01:07:01,840 --> 01:07:06,280 Speaker 1: terrestrial radio. I'm a big music buff and I just 1369 01:07:06,480 --> 01:07:09,200 Speaker 1: hated what Clear Channel was doing to music. And I 1370 01:07:09,360 --> 01:07:13,360 Speaker 1: did it, oh, three or oh four. It was way early. 1371 01:07:13,880 --> 01:07:16,320 Speaker 1: And then I think it was Barron's had this front 1372 01:07:16,400 --> 01:07:20,240 Speaker 1: page cover story, Losing the Signal, and to their credit, 1373 01:07:20,320 --> 01:07:24,080 Speaker 1: they didn't actually steal language, but the structure, and it 1374 01:07:24,200 --> 01:07:26,919 Speaker 1: was like, wait a second, this looks really... I really didn't 1375 01:07:26,920 --> 01:07:29,960 Speaker 1: even notice it until a friend said, hey, Barron's ripped 1376 01:07:30,000 --> 01:07:33,080 Speaker 1: you off. What do you mean? Look at that. It's, 1377 01:07:33,200 --> 01:07:34,960 Speaker 1: um... And there are a couple of other people, I 1378 01:07:35,000 --> 01:07:39,520 Speaker 1: won't mention their names, who feel like they can take 1379 01:07:39,640 --> 01:07:43,400 Speaker 1: that. And so that was the first phase. 1380 01:07:43,880 --> 01:07:46,440 Speaker 1: The second phase was when the Wall Street Journal and 1381 01:07:46,480 --> 01:07:50,320 Speaker 1: New York Times and other mainstream media started rolling out 1382 01:07:50,760 --> 01:07:54,280 Speaker 1: blogs of their own. Does this have any staying power? What? 1383 01:07:54,440 --> 01:07:57,000 Speaker 1: How do you see the role of a fast, 1384 01:07:57,200 --> 01:08:00,320 Speaker 1: really very lightly edited, get it out quick style? How 1385 01:08:00,360 --> 01:08:02,520 Speaker 1: do you see the role of that in mainstream media?
1386 01:08:02,520 --> 01:08:04,200 Speaker 1: I think it depends on the vertical and, you know, 1387 01:08:04,280 --> 01:08:08,720 Speaker 1: what, what space you're covering, and, um, how much reporting 1388 01:08:08,800 --> 01:08:12,320 Speaker 1: and, and, you know, the original network sort of fuels it. 1389 01:08:12,800 --> 01:08:15,080 Speaker 1: There are, there are great blogs done by mainstream media 1390 01:08:15,080 --> 01:08:17,920 Speaker 1: outlets that are usually written by somebody that has a 1391 01:08:17,960 --> 01:08:21,519 Speaker 1: great voice, that knows the space really well, has their network, 1392 01:08:21,720 --> 01:08:25,599 Speaker 1: and, um, can turn that into, you know, a story 1393 01:08:25,680 --> 01:08:28,240 Speaker 1: really quickly. And I look at, like, actually it 1394 01:08:28,320 --> 01:08:30,120 Speaker 1: wasn't a blog, but I look at, like, David Carr, 1395 01:08:30,400 --> 01:08:33,120 Speaker 1: what he had done at the Times. Um, boy, I 1396 01:08:33,200 --> 01:08:37,200 Speaker 1: miss that guy. And, you know, he was very blog 1397 01:08:37,400 --> 01:08:39,439 Speaker 1: like in the way he wrote even his print columns: 1398 01:08:40,000 --> 01:08:42,400 Speaker 1: very fast, turned things around, knew exactly who to call, 1399 01:08:42,840 --> 01:08:46,200 Speaker 1: connected the dots really, really well. I think that model 1400 01:08:46,280 --> 01:08:51,000 Speaker 1: can work, UM, but very few of the major outlets 1401 01:08:51,080 --> 01:08:53,760 Speaker 1: have a business model for that. You can't hide the 1402 01:08:53,800 --> 01:08:56,360 Speaker 1: blog behind a paywall, because that, that ruins its point 1403 01:08:56,400 --> 01:08:58,759 Speaker 1: of linking to stuff. I've had that argument with, 1404 01:08:59,160 --> 01:09:01,760 Speaker 1: this discussion with editors of the Wall Street Journal. Hey, 1405 01:09:01,880 --> 01:09:04,360 Speaker 1: why don't you put your blog out ahead of the paywall?
1406 01:09:04,920 --> 01:09:07,519 Speaker 1: But you have to be a registered subscriber to comment, 1407 01:09:07,720 --> 01:09:10,920 Speaker 1: and this way, not only do you have actual people 1408 01:09:10,960 --> 01:09:14,960 Speaker 1: who are identifiable and make sense, right, you're inherently raising 1409 01:09:15,000 --> 01:09:17,360 Speaker 1: the quality of discussion. Yeah, yeah, and, you know, so 1410 01:09:17,600 --> 01:09:20,479 Speaker 1: I think the business tensions always arise, and you see 1411 01:09:20,520 --> 01:09:22,599 Speaker 1: that with even sites that create great blogs, they sort 1412 01:09:22,600 --> 01:09:24,519 Speaker 1: of fade out after a while because the company is like, 1413 01:09:24,560 --> 01:09:26,240 Speaker 1: I'm not really behind this, I don't believe in this, 1414 01:09:26,880 --> 01:09:30,680 Speaker 1: um. And, and I draw that contrast to, like, organizations 1415 01:09:30,720 --> 01:09:34,000 Speaker 1: that started by blogging, right? So, you know, Gizmodo, formerly Gawker, 1416 01:09:34,120 --> 01:09:35,920 Speaker 1: like, they're sort of still all in and they still 1417 01:09:35,960 --> 01:09:38,000 Speaker 1: do good work. I mean, I think it's, you know, 1418 01:09:38,120 --> 01:09:39,960 Speaker 1: it's always been uneven, but, like, the good stuff has 1419 01:09:39,960 --> 01:09:43,160 Speaker 1: always been good. You know, BuzzFeed still feels like a blog, um, 1420 01:09:43,479 --> 01:09:45,719 Speaker 1: even though, you know, I don't... like, you can define 1421 01:09:45,760 --> 01:09:48,440 Speaker 1: blog however you want, but, like, that, that, that aesthetic, 1422 01:09:48,640 --> 01:09:51,360 Speaker 1: that voice is still there. Can we say the blogosphere 1423 01:09:51,840 --> 01:09:55,880 Speaker 1: is a meritocracy? Or is that overstating it? Yeah.
I think 1424 01:09:55,920 --> 01:09:58,560 Speaker 1: that's overstating it, because you still have the same dynamics 1425 01:09:58,600 --> 01:10:00,200 Speaker 1: you have in a lot of media. One of 1426 01:10:00,240 --> 01:10:02,559 Speaker 1: which is just simple, like, you know, old boys' network, 1427 01:10:02,640 --> 01:10:04,760 Speaker 1: like, people promote who they know and these kinds of things. 1428 01:10:05,160 --> 01:10:08,160 Speaker 1: I think it's easier to get in. UM. I think 1429 01:10:08,200 --> 01:10:10,800 Speaker 1: you can blog your way to being the voice on 1430 01:10:10,840 --> 01:10:13,400 Speaker 1: a certain topic. Like, if you have a certain niche 1431 01:10:13,479 --> 01:10:16,479 Speaker 1: or a certain subject that you are just obsessive about 1432 01:10:16,640 --> 01:10:18,519 Speaker 1: and you go all in on and you just keep 1433 01:10:18,560 --> 01:10:22,320 Speaker 1: blogging about it, you can own that topic or that idea. UM, 1434 01:10:22,439 --> 01:10:24,560 Speaker 1: and that's still true. I think if you're in a 1435 01:10:24,680 --> 01:10:27,040 Speaker 1: popular subject, you know, you know, the sort of, 1436 01:10:27,080 --> 01:10:29,599 Speaker 1: like, like, for me, like a generic tech blog about 1437 01:10:29,640 --> 01:10:33,400 Speaker 1: what's happening in the tech industry, it's really, really hard 1438 01:10:33,439 --> 01:10:35,040 Speaker 1: to break out. And that is going to be about, 1439 01:10:35,640 --> 01:10:37,519 Speaker 1: do you have the relationships? Do you have somebody who's 1440 01:10:37,520 --> 01:10:40,439 Speaker 1: gonna scratch your back by giving you traffic? Let's jump 1441 01:10:40,560 --> 01:10:43,559 Speaker 1: right into our favorite questions. Tell me something: the most 1442 01:10:43,600 --> 01:10:47,920 Speaker 1: important thing people don't know about your background.
UM, a 1443 01:10:48,000 --> 01:10:49,479 Speaker 1: lot of people don't know that I am from a 1444 01:10:49,560 --> 01:10:52,240 Speaker 1: tiny little town in rural Pennsylvania and that we were 1445 01:10:52,360 --> 01:10:57,000 Speaker 1: one of the only, uh, certainly the only, Asian families, uh, 1446 01:10:57,120 --> 01:11:00,360 Speaker 1: and that, and that really, UM, informed a lot of my 1447 01:11:00,479 --> 01:11:03,400 Speaker 1: view, like having a very different, probably the opposite, 1448 01:11:03,439 --> 01:11:06,600 Speaker 1: perspective of living in a big city like Manhattan. Understandable. 1449 01:11:07,160 --> 01:11:10,000 Speaker 1: Who are some of your early mentors? Um, I've been 1450 01:11:10,080 --> 01:11:11,600 Speaker 1: very lucky to have a lot of good ones. I 1451 01:11:11,720 --> 01:11:15,200 Speaker 1: had a business partner named Fred Burke, who was my 1452 01:11:15,240 --> 01:11:16,880 Speaker 1: partner in my first company, started the day after 1453 01:11:16,880 --> 01:11:18,920 Speaker 1: I graduated high school, and he taught me a ton 1454 01:11:18,960 --> 01:11:22,160 Speaker 1: about sales and marketing, and that was really, really instructive. 1455 01:11:23,240 --> 01:11:28,320 Speaker 1: Who most influenced your approach to technology and entrepreneurship? Um, 1456 01:11:28,600 --> 01:11:31,479 Speaker 1: maybe one of the biggest influences is Dan Bricklin, who 1457 01:11:31,600 --> 01:11:37,040 Speaker 1: is the inventor of the spreadsheet and later, later 1458 01:11:37,080 --> 01:11:39,200 Speaker 1: sold it to Lotus and was working with Mitch Kapor 1459 01:11:39,280 --> 01:11:41,320 Speaker 1: and that team that made 1-2-3.
But 1460 01:11:41,520 --> 01:11:45,880 Speaker 1: Dan is one of the more thoughtful, brilliant, creative voices, 1461 01:11:46,120 --> 01:11:48,000 Speaker 1: you know, really undersung as one of the heroes of 1462 01:11:48,040 --> 01:11:51,080 Speaker 1: the tech industry, and, um, super generous with his thoughts 1463 01:11:51,120 --> 01:11:54,000 Speaker 1: and ideas. This is the question that listeners ask 1464 01:11:54,040 --> 01:11:57,719 Speaker 1: about constantly. This is the most asked question from listeners. 1465 01:11:58,200 --> 01:12:01,720 Speaker 1: What are some of your favorite books? Favorite books? UM, 1466 01:12:03,160 --> 01:12:06,720 Speaker 1: The Power Broker, Robert Caro's bio of Robert Moses, classic, and, and, 1467 01:12:06,880 --> 01:12:10,360 Speaker 1: and also really teaches systems thinking. Um, so that's one 1468 01:12:10,360 --> 01:12:16,479 Speaker 1: of those that really, really jumps out. Um. Uh. David 1469 01:12:16,600 --> 01:12:22,479 Speaker 1: Ritz did a biography of Aretha Franklin. Really? And it 1470 01:12:22,720 --> 01:12:25,560 Speaker 1: is, it is, it's almost a history of America in 1471 01:12:25,760 --> 01:12:28,200 Speaker 1: the form of a biography of Aretha Franklin. It is such 1472 01:12:28,240 --> 01:12:34,160 Speaker 1: a brilliant, brilliant... David Ritz, yeah, the biography of Aretha Franklin. Yeah, yeah. 1473 01:12:34,280 --> 01:12:36,840 Speaker 1: See, I would have named that Respect. Yeah. 1474 01:12:37,000 --> 01:12:38,800 Speaker 1: I forget what, I forget what the actual title is, and 1475 01:12:39,160 --> 01:12:41,080 Speaker 1: there's a bunch of books about Aretha named Respect, 1476 01:12:41,160 --> 01:12:43,920 Speaker 1: so they didn't go with that name. But it is, um, 1477 01:12:44,560 --> 01:12:46,360 Speaker 1: it is actually one of the best business books that 1478 01:12:46,360 --> 01:12:49,240 Speaker 1: I've read in a long time. Really? Yeah. That's quite fascinating.
1479 01:12:49,280 --> 01:12:54,519 Speaker 1: And give me one more. Oh, um, fiction, nonfiction, tech? Um. Yeah, 1480 01:12:54,640 --> 01:12:58,120 Speaker 1: that's, there's so many, it's really, it's, it's... Give us 1481 01:12:58,200 --> 01:13:01,200 Speaker 1: ten more. There was actually, um, I'm gonna forget the 1482 01:13:01,280 --> 01:13:02,599 Speaker 1: name of it, there was a book about the creation 1483 01:13:02,640 --> 01:13:06,040 Speaker 1: of the High Line park here in New York City, um, 1484 01:13:06,160 --> 01:13:08,320 Speaker 1: and how it started as, really, a community movement 1485 01:13:08,360 --> 01:13:10,280 Speaker 1: and it became this sort of, you know, one of 1486 01:13:10,320 --> 01:13:13,120 Speaker 1: the top landmarks in the city now, tourists visit. 1487 01:13:13,439 --> 01:13:16,120 Speaker 1: Those guys are actually consulting around the world, people are 1488 01:13:16,120 --> 01:13:18,000 Speaker 1: trying to do similar things, and they did such a good 1489 01:13:18,040 --> 01:13:20,320 Speaker 1: job of telling the story and of even being self-critical 1490 01:13:20,800 --> 01:13:24,000 Speaker 1: about the mistakes they made, UM, for example, that 1491 01:13:24,080 --> 01:13:25,920 Speaker 1: the park is not really inclusive enough of the community 1492 01:13:25,960 --> 01:13:28,320 Speaker 1: that it's part of. UM. I thought it was just 1493 01:13:28,560 --> 01:13:31,040 Speaker 1: very thoughtful and nuanced, and also showed how you can 1494 01:13:31,120 --> 01:13:34,559 Speaker 1: build things that are improbably ambitious and make them happen. Anyway, 1495 01:13:34,640 --> 01:13:36,840 Speaker 1: now, now they're talking about trying to create an underground 1496 01:13:36,920 --> 01:13:39,920 Speaker 1: park from it. I don't know if that will really 1497 01:13:39,960 --> 01:13:43,040 Speaker 1: work to the same degree. But listen, no one thought 1498 01:13:43,080 --> 01:13:46,160 Speaker 1: the High Line would work. It's been a home run. UM.
1499 01:13:46,400 --> 01:13:49,120 Speaker 1: Since you've joined tech, tell us about what you think 1500 01:13:49,160 --> 01:13:53,280 Speaker 1: are the most significant changes. UM. You know, what I 1501 01:13:53,320 --> 01:13:55,880 Speaker 1: would say these past couple of weeks has been one 1502 01:13:55,920 --> 01:13:59,720 Speaker 1: of the most significant milestones, which is this, this reckoning 1503 01:14:00,040 --> 01:14:02,800 Speaker 1: of, you know, VCs being pushed out of their own 1504 01:14:02,840 --> 01:14:07,320 Speaker 1: funds on ethical grounds. Susan Fowler's blog post, which was 1505 01:14:07,360 --> 01:14:09,719 Speaker 1: really powerful, which is amazing because that post was around 1506 01:14:09,760 --> 01:14:12,679 Speaker 1: for months before it seems to have gained any traction. Yeah. Yeah, 1507 01:14:12,760 --> 01:14:14,760 Speaker 1: and, and also there were voices before her, right? And, 1508 01:14:15,080 --> 01:14:17,120 Speaker 1: you know, I think the fact that just a steady 1509 01:14:17,200 --> 01:14:20,320 Speaker 1: drumbeat of activism worked, and that you have things like, 1510 01:14:20,479 --> 01:14:22,320 Speaker 1: you know, the CEO of Uber stepping down. I think 1511 01:14:22,400 --> 01:14:25,280 Speaker 1: these are hopefully a moment of reckoning for so many 1512 01:14:25,360 --> 01:14:26,720 Speaker 1: of us that have been talking about the need for 1513 01:14:26,800 --> 01:14:29,960 Speaker 1: tech to be more ethical, more humane, more inclusive, to 1514 01:14:30,040 --> 01:14:33,200 Speaker 1: be able to really point at that having had some impact.
1515 01:14:33,280 --> 01:14:37,120 Speaker 1: And these are not victories, because these are horrible situations 1516 01:14:37,160 --> 01:14:39,639 Speaker 1: that people went through, but it's just the first time 1517 01:14:39,680 --> 01:14:43,280 Speaker 1: that we didn't completely lose after somebody had to endure 1518 01:14:43,400 --> 01:14:47,280 Speaker 1: these kinds of indignities at work. I suspect Travis, who 1519 01:14:47,520 --> 01:14:50,720 Speaker 1: owns fifty one plus percent of the voting stock, is 1520 01:14:50,760 --> 01:14:54,120 Speaker 1: going to eventually find his way back once he's been, 1521 01:14:54,439 --> 01:14:58,000 Speaker 1: you know, re-educated, and, and the kinder, gentler Travis 1522 01:14:58,040 --> 01:15:01,400 Speaker 1: will show up. But you never know. Um. So that's, 1523 01:15:01,600 --> 01:15:03,720 Speaker 1: that's the changes that have taken place. What do you 1524 01:15:03,760 --> 01:15:06,839 Speaker 1: think the next major changes are? And it's not only technology, 1525 01:15:06,880 --> 01:15:11,840 Speaker 1: it's a big field. I do think there 1526 01:15:12,040 --> 01:15:14,880 Speaker 1: is, um, a return to first principles about people being 1527 01:15:14,880 --> 01:15:17,880 Speaker 1: able to create on their own, which is that, you know, 1528 01:15:17,960 --> 01:15:19,320 Speaker 1: the promise of the web was that we were all 1529 01:15:19,360 --> 01:15:20,800 Speaker 1: going to have a voice and have a place, 1530 01:15:21,280 --> 01:15:23,200 Speaker 1: and that we would be able to publish and create things. 1531 01:15:23,400 --> 01:15:27,160 Speaker 1: And there is such a centralization happening around Facebook, Google, 1532 01:15:27,720 --> 01:15:32,840 Speaker 1: you know, it goes to those two. Exactly. And that's 1533 01:15:33,000 --> 01:15:36,760 Speaker 1: much of the tech platform development.
So what developers are 1534 01:15:36,800 --> 01:15:39,439 Speaker 1: doing outside in the world is, I'm doing this for iOS, 1535 01:15:39,520 --> 01:15:41,880 Speaker 1: for, for Apple, I'm doing this for Android, for Google, 1536 01:15:42,320 --> 01:15:43,760 Speaker 1: and being able to go back and say, well, the 1537 01:15:43,800 --> 01:15:46,439 Speaker 1: web itself is still the bigger platform, the biggest 1538 01:15:46,439 --> 01:15:48,840 Speaker 1: platform that has ever existed. How do you create for that 1539 01:15:49,040 --> 01:15:51,160 Speaker 1: and support that and express yourself there? And, I mean, 1540 01:15:51,520 --> 01:15:53,320 Speaker 1: you know, we have a dog in this fight with 1541 01:15:53,360 --> 01:15:55,840 Speaker 1: Glitch, where I'm very lucky because developers love it and 1542 01:15:55,880 --> 01:15:58,120 Speaker 1: they're catching on and, and really saying, like, this is 1543 01:15:58,200 --> 01:15:59,920 Speaker 1: one of the things that gives us hope. But I 1544 01:16:00,000 --> 01:16:02,479 Speaker 1: think it's a broader movement, which is, um, what brought 1545 01:16:02,520 --> 01:16:03,880 Speaker 1: a lot of us to the web in the first place: 1546 01:16:03,920 --> 01:16:05,519 Speaker 1: the idea that we could create something and the world 1547 01:16:05,600 --> 01:16:07,560 Speaker 1: could find it and respond to it and make it 1548 01:16:07,600 --> 01:16:10,800 Speaker 1: successful. So this is a standard question I ask people, 1549 01:16:11,080 --> 01:16:15,719 Speaker 1: but it's especially poignant for you. Tell us about a fail. 1550 01:16:16,000 --> 01:16:18,320 Speaker 1: Tell us about a time you failed and what you 1551 01:16:18,439 --> 01:16:21,000 Speaker 1: learned from the experience. And if people want to know 1552 01:16:21,200 --> 01:16:25,320 Speaker 1: why that is a poignant question, just google Anil 1553 01:16:25,439 --> 01:16:30,240 Speaker 1: Dash and, quote, fail, and you'll understand why.
Um, I 1554 01:16:30,360 --> 01:16:34,760 Speaker 1: do think one of the biggest failures was in being 1555 01:16:34,800 --> 01:16:36,559 Speaker 1: part of the community that created the first social media 1556 01:16:36,600 --> 01:16:41,800 Speaker 1: and social networking tools. We were so desperate for, and 1557 01:16:41,880 --> 01:16:44,479 Speaker 1: hungry for, people to use them, and optimized for growth 1558 01:16:45,280 --> 01:16:51,200 Speaker 1: at the expense of making these environments humane and thoughtful 1559 01:16:51,280 --> 01:16:54,320 Speaker 1: for people, and, and so there's been a tremendous social 1560 01:16:54,400 --> 01:16:56,760 Speaker 1: cost where, yeah, everybody got connected, and it's great, it's 1561 01:16:56,760 --> 01:16:59,120 Speaker 1: amazing to be able to send messages and media and 1562 01:16:59,200 --> 01:17:03,320 Speaker 1: photos to people instantly anywhere in the world, but it 1563 01:17:03,360 --> 01:17:05,760 Speaker 1: should have also been something that empowered the people most 1564 01:17:05,840 --> 01:17:09,160 Speaker 1: at the margins, most vulnerable, to be able to advocate 1565 01:17:09,200 --> 01:17:12,320 Speaker 1: for themselves. And, and instead, in many ways, we re- 1566 01:17:12,600 --> 01:17:17,240 Speaker 1: victimized them. And, um, that is something that I hope 1567 01:17:17,280 --> 01:17:19,479 Speaker 1: to spend the rest of my career working to fix. 1568 01:17:19,640 --> 01:17:23,439 Speaker 1: To be fair to you guys, that's like third-level 1569 01:17:23,520 --> 01:17:25,760 Speaker 1: thinking. At the time, it's, hey, is anybody going to 1570 01:17:25,880 --> 01:17:30,040 Speaker 1: use these tools?
Not that this is gonna become widespread, widespread adoption, 1571 01:17:30,200 --> 01:17:34,760 Speaker 1: it'll be enormously successful, so successful that trolls are going 1572 01:17:34,800 --> 01:17:36,880 Speaker 1: to be a problem in the comments. That, that 1573 01:17:37,040 --> 01:17:41,400 Speaker 1: was just... Yeah, you know, you know, someday there's 1574 01:17:41,400 --> 01:17:43,920 Speaker 1: gonna be a million people on social media, and she 1575 01:17:44,120 --> 01:17:45,560 Speaker 1: was, you know, she was a founder of our company, and 1576 01:17:45,640 --> 01:17:48,320 Speaker 1: she's like, you idiot, there's gonna be a hundred million people. 1577 01:17:48,600 --> 01:17:50,960 Speaker 1: Of course, then it turns out there's a billion, and so, 1578 01:17:51,280 --> 01:17:55,200 Speaker 1: you know, you would never... it would have been absurd 1579 01:17:55,439 --> 01:17:57,400 Speaker 1: if we had said, let's plan for what happens when 1580 01:17:57,400 --> 01:17:59,439 Speaker 1: a billion people show up on these social networks, because 1581 01:17:59,479 --> 01:18:00,960 Speaker 1: people would be like, you're out of your mind. There aren't a billion 1582 01:18:01,000 --> 01:18:03,840 Speaker 1: people with computers, right, but there are a billion people 1583 01:18:03,880 --> 01:18:09,000 Speaker 1: with smartphones. It was just... there was no, like, 1584 01:18:09,080 --> 01:18:12,000 Speaker 1: science fiction like that. That's exactly right. Um, what do you 1585 01:18:12,080 --> 01:18:15,320 Speaker 1: do to keep mentally, physically fit outside of the office? 1586 01:18:15,360 --> 01:18:17,880 Speaker 1: What do you do to relax? I love music. I 1587 01:18:17,920 --> 01:18:21,639 Speaker 1: know you're a music fan. I, I still am. I'm 1588 01:18:21,680 --> 01:18:23,960 Speaker 1: sort of notorious for being a big Prince fan, even 1589 01:18:24,000 --> 01:18:27,280 Speaker 1: before he passed away.
But I love sort of cataloging 1590 01:18:27,479 --> 01:18:29,560 Speaker 1: his work and sort of showing the cultural impact, and 1591 01:18:29,640 --> 01:18:32,800 Speaker 1: so it's, it's, it's a fun hobby because you cross 1592 01:18:32,840 --> 01:18:35,320 Speaker 1: paths with, like, an improbable cross section of life. Like, 1593 01:18:35,400 --> 01:18:38,320 Speaker 1: there are people in politics and media, all these different 1594 01:18:38,320 --> 01:18:40,479 Speaker 1: disciplines, that are like, oh, you know, whatever, they like 1595 01:18:40,520 --> 01:18:42,320 Speaker 1: the music. It's fun, but also this, like, oh, he 1596 01:18:42,439 --> 01:18:44,639 Speaker 1: was a pioneer in tech and did all these sort 1597 01:18:44,640 --> 01:18:46,439 Speaker 1: of interesting cultural things. And then, of course, just at 1598 01:18:46,479 --> 01:18:48,920 Speaker 1: a human level for me, like, um, one of the 1599 01:18:48,960 --> 01:18:50,599 Speaker 1: best ways to clear my head is, I have got 1600 01:18:50,640 --> 01:18:52,320 Speaker 1: a six-year-old son and we'll go out and, 1601 01:18:52,600 --> 01:18:54,720 Speaker 1: you know, just sort of do something fun. Go to 1602 01:18:54,760 --> 01:18:57,240 Speaker 1: a museum, uh, you know, walk around the neighborhood, take 1603 01:18:57,280 --> 01:19:00,120 Speaker 1: the dog for a walk, and I am somehow 1604 01:19:00,600 --> 01:19:03,360 Speaker 1: reset by spending just a little bit of 1605 01:19:03,400 --> 01:19:04,840 Speaker 1: time listening to him and his view of the world. 1606 01:19:05,760 --> 01:19:09,160 Speaker 1: That's a lot of fun.
If, UM, if a millennial 1607 01:19:09,320 --> 01:19:11,160 Speaker 1: came to you, or someone at the beginning of their 1608 01:19:11,240 --> 01:19:14,000 Speaker 1: career came to you and said, hey, I'm interested in 1609 01:19:14,400 --> 01:19:18,360 Speaker 1: going into tech or social community, what sort of advice 1610 01:19:18,520 --> 01:19:22,040 Speaker 1: might you give them? Um, one of the key things is, uh, 1611 01:19:22,520 --> 01:19:26,920 Speaker 1: find the topic about which you are irrationally passionate, like, 1612 01:19:27,200 --> 01:19:29,400 Speaker 1: and it can be as narrow a niche as 1613 01:19:29,439 --> 01:19:31,559 Speaker 1: you want. But if you can be the person who 1614 01:19:31,680 --> 01:19:34,120 Speaker 1: is the one person in the world who is most 1615 01:19:34,160 --> 01:19:36,000 Speaker 1: in love with that idea and knows the most about it, 1616 01:19:36,360 --> 01:19:39,719 Speaker 1: obsesses over it, and really just owns it, um, there's 1617 01:19:39,760 --> 01:19:42,040 Speaker 1: something there for you. Like, you, you can, you can 1618 01:19:42,160 --> 01:19:45,200 Speaker 1: really build a whole career around that, because you can't 1619 01:19:45,280 --> 01:19:47,400 Speaker 1: win on, like, everybody is chasing this one trend and 1620 01:19:47,439 --> 01:19:49,120 Speaker 1: I'm going to be part of it. I think that's 1621 01:19:49,160 --> 01:19:52,760 Speaker 1: really key. I think the other part, um, because the 1622 01:19:52,800 --> 01:19:55,560 Speaker 1: tide is starting to turn, particularly in tech: find the 1623 01:19:55,600 --> 01:19:59,960 Speaker 1: people who are most at the margins, least connected, least central, 1624 01:20:00,000 --> 01:20:04,719 Speaker 1: least privileged in tech, and lift them up, because, 1625 01:20:04,760 --> 01:20:07,000 Speaker 1: one, people will remember it. They'll never forget you 1626 01:20:07,120 --> 01:20:08,760 Speaker 1: did it. Two,
I think that is who is 1627 01:20:08,760 --> 01:20:11,960 Speaker 1: ascendant broadly: the people on, on the outside are saying, okay, 1628 01:20:12,040 --> 01:20:13,880 Speaker 1: I want to be part of this success, I 1629 01:20:13,960 --> 01:20:16,040 Speaker 1: want to benefit from it. And three, it's just the 1630 01:20:16,160 --> 01:20:18,879 Speaker 1: right thing to do. It feels good. And our final question: 1631 01:20:19,600 --> 01:20:22,479 Speaker 1: what is it that you know about technology and social 1632 01:20:22,560 --> 01:20:26,320 Speaker 1: networks today that you wish you knew seventeen or twenty 1633 01:20:26,439 --> 01:20:30,440 Speaker 1: years ago? Um, I wish I had known that technology 1634 01:20:31,479 --> 01:20:33,960 Speaker 1: follows the same rules of human society that the physical 1635 01:20:34,000 --> 01:20:36,519 Speaker 1: world does. In the same way that we architect our 1636 01:20:36,560 --> 01:20:41,920 Speaker 1: buildings and design our communities and our neighborhoods to be safe, comfortable, welcoming, 1637 01:20:42,080 --> 01:20:44,920 Speaker 1: warm places for us to live, we need to put 1638 01:20:44,960 --> 01:20:47,560 Speaker 1: the same thought into designing our digital spaces so that 1639 01:20:47,680 --> 01:20:50,840 Speaker 1: people are welcoming and kind to each other and treat 1640 01:20:50,880 --> 01:20:53,600 Speaker 1: each other well and feel neighborly towards one another. And 1641 01:20:53,800 --> 01:20:56,439 Speaker 1: if we can just repeat a lot of the lessons 1642 01:20:56,520 --> 01:20:59,880 Speaker 1: that the last ten thousand years of civilization have taught us, uh, 1643 01:21:00,120 --> 01:21:02,120 Speaker 1: we can make being online and in our apps a 1644 01:21:02,240 --> 01:21:07,240 Speaker 1: lot more thoughtful and rewarding experience. We have been speaking 1645 01:21:07,360 --> 01:21:11,360 Speaker 1: with Anil Dash. He is the CEO of Fog Creek Software.
1646 01:21:12,040 --> 01:21:13,720 Speaker 1: Be sure and look up an inch or down an 1647 01:21:13,760 --> 01:21:16,280 Speaker 1: inch on Apple iTunes, and you can see any of 1648 01:21:16,320 --> 01:21:20,439 Speaker 1: the other hundred and fifty or so such conversations. We 1649 01:21:20,680 --> 01:21:25,439 Speaker 1: love your feedback, comments and suggestions. Write to me at 1650 01:21:25,720 --> 01:21:29,439 Speaker 1: MIB podcast at Bloomberg dot net. I would be 1651 01:21:29,560 --> 01:21:32,960 Speaker 1: remiss if I did not thank my audio engineer Charlie Vollmer, 1652 01:21:33,439 --> 01:21:37,599 Speaker 1: my head of research Michael Batnick, and my booker 1653 01:21:37,640 --> 01:21:42,160 Speaker 1: slash producer Taylor Riggs. I'm Barry Ritholtz. You're listening to 1654 01:21:42,400 --> 01:21:44,880 Speaker 1: Masters in Business on Bloomberg Radio.