Jonathan: Welcome to TechStuff, a production from iHeartRadio.

Jonathan: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? Today I have a special treat for you. There's a big treat for me, too. I got to sit down with Baratunde Thurston. He's a comedian, activist, and technologist. On top of all that, he hosts a show called How to Citizen. The show's third season focused on technology's role in our world and our ability to citizen, as he calls it, and also on how some really remarkable people are working hard to see that technology lives up to its potential to facilitate positive change. Anyone out there who has listened to TechStuff for any length of time, y'all know that I often focus on the dark side of tech, from unintended consequences that can be harmful to outright malicious misuses or abuses of technology, and how that impacts people.
Well, I like to think of Baratunde's attitude as a balm for the cantankerous perspective I typically have on tech. So sit back and enjoy this conversation I had with Baratunde Thurston. You're gonna learn a lot, including some stuff you might not know about me.

Jonathan: I want to welcome my guest, Baratunde Thurston, someone whom I have admired for many years. I have admired his comedy work, I've admired his perspective on tech, and I admire him in particular because, whether he remembers it or not, he once rescued me from the worst day of my professional career.

Baratunde: Whoa.

Jonathan: Welcome to the show.

Baratunde: Yeah, good to be here, Jonathan. Tell me about this rescue mission I don't know about.

Jonathan: Okay, I'll set the scene. It was South by Southwest 2016. It was in between you getting inducted into the Hall of Fame and me trying every taco at Torchy's. And I had been assigned to moderate a panel which I was severely underqualified to moderate, but I had no option.
And it was for a now defunct, anonymous, geolocated social networking site called Yik Yak. For those who do not remember, Yik Yak was a service that allowed people to anonymously post messages, and people within a certain radius of that person's physical location could see those messages and post their own. I had to submit every single question I had to the two co-founders ahead of time, and that made me feel like I was kind of restricted in what I could ask. I wasn't confident in what I could do. So by the time we get to the panel, the flop sweat is evident. It's undeniable. I'm sitting there, and there's way more people than I expected to be there, because this wasn't part of the official South by Southwest panels that you would find in the other locations. This was in some bar off the main drag. And I get through my questions pretty quickly, because I'm getting monosyllabic answers and not really any follow-up. So after thirty minutes in an hour-long slot, I opened it up to the audience, because I have nothing else to go to.
And I don't know if you remember, Baratunde, but you were the first person to stand up and ask questions, and you held those co-founders accountable. You asked them about the consequences of unleashing a tool that allowed for anonymous posting in geolocated areas, and the potential effects that could have on populations in those communities, specifically in colleges and high schools. And they gave very, uh, unsatisfying answers. And at the end of this, I was shaking. I was so upset with myself. But I came up to you and I thanked you for your question. You were incredibly gracious, and then I slunk off to hide in shame, and that fueled the stress dreams for the following six years, which is not even a joke. But I was so thankful you were there, because I appreciated the fact that you asked the tough questions, and you were adamant about getting answers. And when clearly there were none, that precipitated more questions from the rest of the audience.
I remember there was a teacher who came up after you and started asking very tough questions, and I thought, man, I wish they had asked Baratunde to moderate this panel. It would have been a much better panel.

Baratunde: Oh, so thank you for remembering that. I actually do remember it now. It was at a bar right off Congress Street in downtown Austin, and they had, I think they had, a robotic bartender, which I also recall. I looked up that service just to get a visual memory, and yeah, their logo was, like, a yak. They went out of business in twenty seventeen. They literally just came back. I looked it up, and it was, like, three hours ago, from Yahoo News: Yik Yak is back, baby. So we can do this all over again, Jonathan. We can do this.

Jonathan: Well, this time, this time I feel a little more prepared. That was babe-in-the-woods, deer-in-the-headlights Jonathan. I'm slightly better now. Yeah, no joke.
So that kind of brings me to talk about How to Citizen, which was already a phenomenal podcast before you got into this specific focus on tech's role in this idea of making citizen a verb, and this concept of becoming more socially responsible and active in a positive way. And first of all, I absolutely love the show, and I love the tone of it, and I love your perspective. So congratulations, and thank you for that.

Baratunde: Well, thank you, and you're welcome for that. I mean, this season really is an outgrowth of moments like the one that we shared in 2016. I was at that South by Southwest to be inducted into the Hall of Fame, baby. And I gave a little acceptance speech, like four minutes, and I made a lot of jokes about brand activations at this arts festival, as well as tacos and whiskey and beer.
But I also remember saying to the folks: like, the assumption that tech is just going to create goodness in the world, we've got to stop that. We could actually be accelerating bad things in the world, you know, at scale, and codifying our history, sort of forcing history into our future under the guise of machine learning, if what we teach those machines is all of our mistakes and our biases and all that kind of stuff. And now that's, like, a common perspective on tech, but in 2016 we were just kind of entering the discordant, skeptical realm about what might happen with all this disruption. And this season of How to Citizen is a follow-up to that. It's like, okay, so what else can we do? You know, what else can we build? Who's out here not yik-yakking all over our commons but, you know, creating something, if not perfect, at least more perfect, and helping us be more of a union, not just in the US, but kind of just as human beings all over the world.
So it's been, I needed that season to remind me that it's not all, like, insurrection and Nazis out there and teen depression and body shaming; that there is so much more good we can still do, and the future history is not yet written.

Jonathan: I could not agree more. I am in that same need for that positive spin. A few years ago, I used to host a show called Forward Thinking, and the whole premise was that it was an optimistic view of the future, largely through the lens of technology, talking about the promise of tech, or the potential of tech, through that optimistic lens. And I do largely consider myself an optimist. I just, unfortunately, am an optimist who also feels like I've been put through the wringer. And you think back, like, I've heard you speak about this too: I also was an early, active participant on the Internet. I was in college when the World Wide Web became a thing, and I thought, oh, that's never gonna take the place of Telnet.

Baratunde: Oh, I want to hug you right now.
I met 145 00:08:49,840 --> 00:08:52,240 Speaker 1: my wife through tell net. That's how That's how I 146 00:08:52,280 --> 00:08:56,120 Speaker 1: met my wife. Yeah, not many people know that, but yeah, 147 00:08:56,160 --> 00:08:58,840 Speaker 1: I met her through a tell net chat room. Uh. 148 00:08:58,920 --> 00:09:01,880 Speaker 1: And yeah, this I thought, the whole web thing like it. 149 00:09:01,920 --> 00:09:08,680 Speaker 1: Thanks forever for that page, this prompt situation, command line 150 00:09:09,000 --> 00:09:13,440 Speaker 1: entry Universe. The slash key is my friend. It lets 151 00:09:13,520 --> 00:09:17,600 Speaker 1: me do anything. But yeah, listening to that, like when 152 00:09:17,640 --> 00:09:19,640 Speaker 1: we cast our minds back to that and we think 153 00:09:19,679 --> 00:09:22,720 Speaker 1: about the potential of of what we were seeing and 154 00:09:22,800 --> 00:09:27,559 Speaker 1: this idea of a platform that could allow for instantaneous 155 00:09:27,600 --> 00:09:32,240 Speaker 1: global communication and collaboration, um, as well as things like 156 00:09:32,320 --> 00:09:36,000 Speaker 1: commerce down the line. Once that restriction was lifted, everyone 157 00:09:36,160 --> 00:09:39,600 Speaker 1: was sure that the Internet was going to solve all problems, 158 00:09:39,640 --> 00:09:44,920 Speaker 1: including eliminate conflict, because now suddenly there would not be 159 00:09:45,000 --> 00:09:49,320 Speaker 1: these barriers to communication, ignoring places like China and North Korea, 160 00:09:49,480 --> 00:09:52,000 Speaker 1: even if even if that problems were the old true 161 00:09:52,000 --> 00:09:55,640 Speaker 1: everywhere else and um, And of course it didn't turn 162 00:09:55,679 --> 00:09:58,320 Speaker 1: out that way, but it's not a big surprise. I mean, 163 00:09:59,120 --> 00:10:01,640 Speaker 1: I look now at the landscape. 
I look at something like Facebook, you know, not just Meta, but the specific platform of Facebook, which is supposed to be this globalized social network that connects...

Baratunde: For parents.

Jonathan: Well, yeah, yeah. We're... listen, you and I are of a certain age. I think I'm two years older than you are. But yeah, I look at that and I think, well, of course that didn't turn into a really, you know, benevolent, useful tool that has nothing but good in it. It started, or at least it evolved out of, a project that was ultimately about rating the appearance of female students at Harvard.

Baratunde: An age-old human pastime, which always brings out the best in our species.

Jonathan: Yes. And you can't have huge expectations, is I guess what I'm getting at. But the promise is still there, right? The promise that we could have this democratized approach that removes barriers that traditionally stood in the way of people connecting to each other, or people connecting to a career, or people connecting to contributing to their communities. That's still there.
It's easy to lose sight of that, because we've seen all the bad. But I love that your podcast is looking at specific cases where people are making substantive changes to provide this opportunity to actually use tech to bring real benefits to people all over the world.

Baratunde: And one of the ways we've been asking that question, it slows down the process. Tech is often about speed and efficiency and results, but not necessarily with a very clear question, like: results for what? Faster at what? So, faster at extracting value to generate advertising revenue? Cool. Faster at dismantling, you know, social ties, or respect for various types of community interests? Yes, we're crushing it at that. But if the question is baked a little differently, and it's about creating a healthy community, if it's about citizening as a verb, which is kind of the premise of our show, it's like: what if tech didn't make it harder to citizen, but made it easier? Then you've got to ask, what does it mean to citizen?
That sounds very cool, very abstract, very hard to pin down, and so we tried to define our terms, and repeatedly. So to citizen means you show up and participate. Like, do these technologies allow people to show up and participate in their various circles of community? To citizen is to invest in relationships, with yourself, with others, and with the planet around you. Do these tools and platforms help us get in touch with ourselves? Do they help us connect with other people in a relational way, not just a transactional one? Do they eliminate this mythical separation between us and nature, or create greater justice? And the third of the four is that it helps us understand power. To citizen is to understand your power: money and presence and ideas and physical strength, but also communication strength and attention. So does this technology strip us of our powers, or make us further aware of, and capable of wielding, our power?
And then the last of these four nicely balanced pillars is that to citizen is to prioritize and take account of the collective interest, not just individual self-interest. And so if you're doing all that previous stuff just for your individual self, you're a finely tuned sociopath, and you could be very well funded by venture capital and go up and to the right on all the charts, but you wouldn't be citizening. You know, you'd be hyper-investing in a private life, which has its place, but it should not take up all of the space that we occupy together. So yeah, we found all these people who are doing different pieces of that kind of citizening, and even with a platform like a social network, building them differently, you know, with different kinds of considerations. And so it's just a good reminder, like, oh yeah, we've got choices. And then you have to ask, you know: great, so we have choices, do we? How much are we able to exercise them?
What barriers stand between us and our ability to create those types of opportunities at large scale, and not just have them be kind of cute demonstration projects? And that's a tough secondary question.

Jonathan: Absolutely. I have covered, many times in the past, various projects and various stories that were incredibly exciting and inspirational, and unfortunately also momentary. There was no lasting effect or presence. And you run the risk of it suddenly becoming something performative, or that someone like me, who covers tech a lot, starts to get really cynical. When I start seeing moves, even moves that are sincere and earnest in their motivations, you start to question, simply because you've been encountering the dark side for too long. Which is why, you know, when I listened to season three, which, by the way, everyone absolutely should, everyone should listen to all of them, howtocitizen.com, but especially your audience here with TechStuff.
Baratunde: I think, you know, I've listened to one of your recent episodes about the history of, like, Web 1.0 and Web 2.0. It was just a great time machine. And so I think, you know, if you like the time-machine element and revisiting that, whether you were there for it or not, there's value to it. We do a little bit of a version of that in our season three, and then start chipping away at the assumptions about how a lot of this stuff needs to work. And it ended up being a global expedition, you know, from Argentina to Spain to Taiwan, to prisons in the United States, and multiple cities and types of venues all around. So it's very, very cool. But I'll finish letting you plug my podcast. Please continue.

Jonathan: Oh, well, I was going to say, I love how you start with an episode where you're talking with your sister, which was beautiful. It was a beautiful conversation and such a wonderful tribute to your mother. I absolutely loved it.
And then you follow that up with what I would call the TechStuff episode of How to Citizen, with Scott Galloway, where you are defining the problems. And I was listening to that, and I was like, oh man, I'm so glad that he doesn't have my job, because, I don't know what I'd be doing right now, but he does it better than I do.

Baratunde: There's plenty of work out there for all of us, man.

Jonathan: Right. And then the later episodes, you know, you're talking with these extraordinary people who are spearheading incredible projects. And that's where I really feel that TechStuff listeners, if you have been listening to me for a while, you know, especially with the news episodes, it's almost impossible to avoid the dark stuff, because that's the stuff that tends to rise in the public consciousness. Listen to this show and get involved, because there are ways you can do that, and it helps, and it makes a better world. And those are all things I want to see. We'll be right back with more from Baratunde after these messages.
Jonathan: When we look out for one another, and when we take actions that are not purely in our own self-interest, everyone benefits, not just the individual. And that's what always blows my mind: yes, you can act selfishly, and you can act selfishly in a way that benefits you, but when you act in a way that helps others, you too benefit. Like, that's an everybody-wins situation.

Baratunde: And it makes you feel good. So there's that selfishness, and it's something I often forget until I'm doing it. I'm like, oh yeah, it feels good to be around others, it feels good to do something with others, it feels good to do something for others. And then recognizing, maybe that comes back to me in a literal or direct way, or maybe I'm just getting high off the endorphins of that kind of social hit, you know, the same kind of feeling as when someone likes your posts or shares your content online. You can get that in a much healthier way, you know, through engaging with our fellow humans.
And then you think about the sort of collective interest, like: how does public health really work? How does national security work? How do cities and communities of all sizes work? We're not meant to do everything by ourselves, and I think we've just over-indexed on the independence strain. It's very American, and very Western, but particularly American. It's like: me, my house, my yard, my gun, my food supplies, my entertainment system, my pool, my kids, my private school. And this list of my, my, my, my leaves very little room for "our." And it's super inefficient, actually. It's highly redundant, and it's really isolating and lonely, taken to such an extreme. If we're all out here building our own, you know, armies of one and cities of one and societies of one, then we're losing out on the benefits of even having other people around. Like, what's the point of there being more than one of us, if we're all going to try to do the whole job of society by ourselves?
It also includes 330 00:20:04,920 --> 00:20:07,960 Speaker 1: a lot of denial, like you have to live in 331 00:20:08,000 --> 00:20:13,960 Speaker 1: denial to believe that you alone are responsible for your 332 00:20:14,320 --> 00:20:18,360 Speaker 1: place and your well-being. It's such 333 00:20:18,400 --> 00:20:21,480 Speaker 1: an unrealistic view that it boggles my mind. 334 00:20:21,960 --> 00:20:25,479 Speaker 1: I was fortunate to be raised by teachers and science 335 00:20:25,520 --> 00:20:30,280 Speaker 1: fiction authors, so I came to my philosophy from a 336 00:20:30,359 --> 00:20:34,480 Speaker 1: perspective of the sort of Star Trek view of society, 337 00:20:34,520 --> 00:20:38,800 Speaker 1: like everyone contributes, and everyone's important. And if 338 00:20:38,840 --> 00:20:41,080 Speaker 1: you deny that, if you think somehow that you are 339 00:20:41,119 --> 00:20:44,520 Speaker 1: more important than everyone else, nothing but disaster awaits you. 340 00:20:44,760 --> 00:20:47,400 Speaker 1: And, you know, denial is a really 341 00:20:47,440 --> 00:20:51,080 Speaker 1: good naming of one of the challenges of that kind 342 00:20:51,080 --> 00:20:55,000 Speaker 1: of thinking. I think fear is another component of it, 343 00:20:55,040 --> 00:20:59,480 Speaker 1: where we operate out of a sense of being afraid 344 00:21:00,400 --> 00:21:04,520 Speaker 1: of others, and so we must do this for ourselves 345 00:21:04,640 --> 00:21:08,120 Speaker 1: because we can't trust someone else to be there for us. 346 00:21:08,520 --> 00:21:11,080 Speaker 1: And there are reasons for the denial, you know, some of 347 00:21:11,119 --> 00:21:13,359 Speaker 1: us are educated to believe we are better than others 348 00:21:13,400 --> 00:21:16,200 Speaker 1: and don't need them. There are reasons for the fear, 349 00:21:16,400 --> 00:21:18,400 Speaker 1: you know.
We receive a lot of messaging 350 00:21:18,400 --> 00:21:21,080 Speaker 1: and even have had personal experiences which would trigger that. 351 00:21:21,520 --> 00:21:27,000 Speaker 1: It's not outlandish, you know, it's not inexplicable, 352 00:21:27,359 --> 00:21:30,840 Speaker 1: but I think it is ultimately self-destructive for most 353 00:21:30,880 --> 00:21:33,159 Speaker 1: of us to live at such an extreme in 354 00:21:33,200 --> 00:21:35,920 Speaker 1: that way. So you find people in this little 355 00:21:35,920 --> 00:21:39,000 Speaker 1: pocket of the world who are asking some different questions, 356 00:21:39,040 --> 00:21:41,760 Speaker 1: finding some different answers and saying, okay: what if 357 00:21:41,800 --> 00:21:45,919 Speaker 1: we made a social network that didn't give everyone on 358 00:21:46,160 --> 00:21:50,840 Speaker 1: day zero all the powers of the network at once? 359 00:21:51,080 --> 00:21:56,639 Speaker 1: You know, what if you gradually unlocked these features over time, 360 00:21:56,680 --> 00:22:02,960 Speaker 1: after being more carefully and considerately onboarded into the community? 361 00:22:04,840 --> 00:22:06,960 Speaker 1: What do you mean I can't just start DMing 362 00:22:07,000 --> 00:22:11,200 Speaker 1: people right off the bat and address millions from 363 00:22:11,240 --> 00:22:16,640 Speaker 1: the first moment I sign up? Whoa, that's thoughtful. 364 00:22:16,800 --> 00:22:20,640 Speaker 1: Why would you do that?
It's like, okay, well maybe 365 00:22:20,680 --> 00:22:22,840 Speaker 1: if you're not under the pressure of returning, you know, 366 00:22:22,960 --> 00:22:26,520 Speaker 1: 10x the financial investment to people who have already 367 00:22:26,600 --> 00:22:29,840 Speaker 1: 100x'd the money they were born with, you would 368 00:22:29,840 --> 00:22:34,040 Speaker 1: have time to consider that and build a community optimizing 369 00:22:34,080 --> 00:22:37,880 Speaker 1: for that instead of just maximum noise. And so yeah, 370 00:22:37,920 --> 00:22:40,480 Speaker 1: we had this, you know, Bahraini LGBTQ+ 371 00:22:40,520 --> 00:22:43,960 Speaker 1: activist Esra'a Al Shafei. She's one of the people 372 00:22:44,040 --> 00:22:46,520 Speaker 1: who, I think — well, it's hard to pick a favorite, 373 00:22:46,560 --> 00:22:48,920 Speaker 1: but just in this moment, given that you mentioned Facebook — 374 00:22:50,240 --> 00:22:53,200 Speaker 1: reminds us that there's more than one way. There are many different 375 00:22:53,240 --> 00:22:56,960 Speaker 1: ways to build networks and communities, and we've just done 376 00:22:56,960 --> 00:23:00,440 Speaker 1: this copy-paste thing. And Facebook in particular has 377 00:23:00,520 --> 00:23:03,840 Speaker 1: just done the, like, copy-paste thing with, you know, 378 00:23:04,040 --> 00:23:07,720 Speaker 1: innovations from others. But speed — speed is costing 379 00:23:07,800 --> 00:23:10,000 Speaker 1: us in some other ways. And the speed of enabling 380 00:23:10,080 --> 00:23:15,840 Speaker 1: people to behave in certain ways without having any connection 381 00:23:16,480 --> 00:23:19,760 Speaker 1: yet to the community they're part of — it's like your 382 00:23:19,840 --> 00:23:23,280 Speaker 1: first day as a freshman in high school.
You're just 383 00:23:23,400 --> 00:23:25,440 Speaker 1: like, you're running the pep rally, you're the president of 384 00:23:25,480 --> 00:23:29,080 Speaker 1: the class, like, you're in charge of discipline. 385 00:23:29,720 --> 00:23:34,840 Speaker 1: That's way too much power too quickly. Yeah — 386 00:23:35,359 --> 00:23:37,240 Speaker 1: so much of what you say resonates with me. I 387 00:23:37,320 --> 00:23:39,680 Speaker 1: think back to some of the smaller forums that I 388 00:23:39,800 --> 00:23:44,000 Speaker 1: participated in early on in the Internet and how fundamentally 389 00:23:44,040 --> 00:23:48,240 Speaker 1: different they were. Like, it felt like a community of people. 390 00:23:48,400 --> 00:23:50,879 Speaker 1: Usually it was centered around something specific, which helped, like 391 00:23:50,960 --> 00:23:55,640 Speaker 1: an interest — Exactly, a movie or a comic. Yeah, hey, 392 00:23:55,760 --> 00:23:58,600 Speaker 1: shout out to all my fellow former Buffy the Vampire 393 00:23:58,680 --> 00:24:03,400 Speaker 1: Slayer Bronzers. I was part of the Bronze — 394 00:24:04,000 --> 00:24:06,040 Speaker 1: man, I'm revealing a lot of 395 00:24:06,080 --> 00:24:08,520 Speaker 1: personal stuff on this episode, and so do you in 396 00:24:08,600 --> 00:24:12,880 Speaker 1: that first episode of season three, so I think that's good. 397 00:24:13,000 --> 00:24:17,280 Speaker 1: But yeah, I completely find that inspiring. I love 398 00:24:17,359 --> 00:24:20,840 Speaker 1: the idea of social networks that truly are social and 399 00:24:21,760 --> 00:24:27,240 Speaker 1: build towards that, and help people avoid traps, like 400 00:24:27,960 --> 00:24:29,680 Speaker 1: the fact that a lot of folks fall into a 401 00:24:29,760 --> 00:24:34,800 Speaker 1: sort of tribalistic experience online.
You have these 402 00:24:34,840 --> 00:24:38,200 Speaker 1: echo chambers, you have these environments 403 00:24:38,280 --> 00:24:42,720 Speaker 1: that don't just allow for the noise, they don't 404 00:24:42,760 --> 00:24:47,240 Speaker 1: just allow for the flame war. The platform itself is 405 00:24:47,359 --> 00:24:51,760 Speaker 1: dependent upon that activity. They need the noise, they need the flame wars. 406 00:24:51,880 --> 00:24:56,919 Speaker 1: The flame wars fuel their growth. Yes, in a pretty 407 00:24:56,960 --> 00:24:59,960 Speaker 1: literal sense. You know, there's another — I'm looking back 408 00:25:00,040 --> 00:25:02,920 Speaker 1: at that season — something I haven't thought as much about. 409 00:25:03,040 --> 00:25:06,680 Speaker 1: But given who you are and where you sit, you know, 410 00:25:06,880 --> 00:25:10,720 Speaker 1: the way we defer to algorithms has a huge 411 00:25:11,119 --> 00:25:14,920 Speaker 1: impact on our sense of reality and truth, and most 412 00:25:14,960 --> 00:25:16,600 Speaker 1: of us are not aware of them. So we 413 00:25:16,680 --> 00:25:20,080 Speaker 1: think we're making choices, but actually, you know, the 414 00:25:20,480 --> 00:25:25,880 Speaker 1: choice set is predetermined by a series of machines guided 415 00:25:25,920 --> 00:25:29,600 Speaker 1: by humans who tell us who we are before we 416 00:25:29,720 --> 00:25:32,640 Speaker 1: have a chance to become who we may be meant 417 00:25:32,720 --> 00:25:35,199 Speaker 1: to be. And so, even if you're, like, an 418 00:25:35,280 --> 00:25:40,199 Speaker 1: independent-minded person — free will and rugged individualism — it's kind 419 00:25:40,200 --> 00:25:43,200 Speaker 1: of hard to be that when you're subservient to a 420 00:25:43,280 --> 00:25:45,760 Speaker 1: machine telling you who you are before you get to 421 00:25:45,800 --> 00:25:48,560 Speaker 1: be who you are.
And so there's folks working on 422 00:25:49,119 --> 00:25:52,560 Speaker 1: cleaning up the data sets that power these algorithms and 423 00:25:52,760 --> 00:25:55,680 Speaker 1: just adding a level of insight into, like, oh, that 424 00:25:55,800 --> 00:25:57,639 Speaker 1: was just a garbage mailing list that someone bought on 425 00:25:57,680 --> 00:25:59,600 Speaker 1: the dark web. Maybe we shouldn't use that to determine 426 00:25:59,760 --> 00:26:03,639 Speaker 1: who has access to health care. Huh, interesting. Or, you know, 427 00:26:03,760 --> 00:26:08,000 Speaker 1: my favorite example is Teresa Hodge, who is formerly 428 00:26:08,040 --> 00:26:12,040 Speaker 1: incarcerated and was very frustrated by folks with felony records 429 00:26:12,119 --> 00:26:14,639 Speaker 1: not being able to get employment after they've served their 430 00:26:14,720 --> 00:26:21,080 Speaker 1: time because various algorithms say they're not trustworthy, you know, 431 00:26:21,240 --> 00:26:23,960 Speaker 1: just because they got caught. I mean, I still maintain 432 00:26:24,080 --> 00:26:27,480 Speaker 1: most humans are criminals technically, right, there's a lot of 433 00:26:27,560 --> 00:26:30,480 Speaker 1: laws that we're breaking on a constant basis. Most of 434 00:26:30,560 --> 00:26:33,399 Speaker 1: us are not under constant surveillance to be busted, and 435 00:26:33,720 --> 00:26:35,960 Speaker 1: then many of us who are busted have access to 436 00:26:36,080 --> 00:26:38,600 Speaker 1: a level of resource to get out of that, get 437 00:26:38,680 --> 00:26:40,399 Speaker 1: that expunged. You get a slap on the wrist, you get 438 00:26:40,440 --> 00:26:43,000 Speaker 1: a warning, so you're able to learn from 439 00:26:43,080 --> 00:26:45,560 Speaker 1: the error and then proceed in society. But a 440 00:26:45,640 --> 00:26:48,560 Speaker 1: subset of us get caught up and then punished forever.
441 00:26:49,200 --> 00:26:53,040 Speaker 1: And so she came up with another algorithm that 442 00:26:53,080 --> 00:26:56,680 Speaker 1: employers could use to consider if someone is really a 443 00:26:56,840 --> 00:26:59,800 Speaker 1: risky bet in terms of hiring. And it takes 444 00:27:00,000 --> 00:27:03,040 Speaker 1: a lot more data points into consideration because it's defined 445 00:27:03,119 --> 00:27:07,520 Speaker 1: by people who have experienced that particular challenge. And so 446 00:27:07,680 --> 00:27:11,240 Speaker 1: you're putting people who are close to the problem in 447 00:27:11,400 --> 00:27:15,119 Speaker 1: the driver's seat to help co-create the solution. And 448 00:27:15,760 --> 00:27:20,439 Speaker 1: there was no assumption of vindictiveness or ill will. Right, 449 00:27:20,520 --> 00:27:25,159 Speaker 1: this isn't about punishing companies, but it's also about no 450 00:27:25,359 --> 00:27:28,600 Speaker 1: longer punishing people who have already been punished, and then 451 00:27:28,720 --> 00:27:33,520 Speaker 1: unlocking their ability to contribute productively for all of us. 452 00:27:33,680 --> 00:27:37,400 Speaker 1: Like, we would all benefit from those folks having gainful employment. 453 00:27:38,040 --> 00:27:42,000 Speaker 1: So, you know, it doesn't mean, like, no more algorithms ever — 454 00:27:43,240 --> 00:27:46,440 Speaker 1: I think that ship has sailed. But how we algorithm, 455 00:27:46,520 --> 00:27:49,440 Speaker 1: who's involved in crafting them, what oversight we have, 456 00:27:49,600 --> 00:27:51,919 Speaker 1: what goes into them, who makes them, how they affect 457 00:27:51,960 --> 00:27:54,760 Speaker 1: the world — we have so much more choice in that 458 00:27:55,359 --> 00:27:57,760 Speaker 1: than we have realized. Many of us are just sitting 459 00:27:57,760 --> 00:27:59,879 Speaker 1: here, kind of dormant, like a ghost.
There's nothing we 460 00:28:00,000 --> 00:28:04,199 Speaker 1: can do. Oh well, the computers are 461 00:28:04,240 --> 00:28:07,239 Speaker 1: going to compute. They must know better. It's like, oh, 462 00:28:07,400 --> 00:28:09,960 Speaker 1: they know what we tell them to know, and we 463 00:28:10,080 --> 00:28:13,800 Speaker 1: can still have an influence on that. So I just thought, okay, 464 00:28:13,840 --> 00:28:17,080 Speaker 1: people are not expecting to hear this formerly incarcerated Black 465 00:28:17,119 --> 00:28:20,359 Speaker 1: woman coming up with algorithms, selling them to employers to 466 00:28:20,400 --> 00:28:23,840 Speaker 1: help them hire people better. And that's the same 467 00:28:23,960 --> 00:28:26,080 Speaker 1: kind of story that would be on the cover of 468 00:28:26,119 --> 00:28:29,480 Speaker 1: an Inc. magazine, you know, or a Fast Company magazine, 469 00:28:29,480 --> 00:28:31,879 Speaker 1: or a Forbes. But we don't hear it as 470 00:28:31,960 --> 00:28:36,000 Speaker 1: often as the "I made an app called Yik Yak," 471 00:28:38,800 --> 00:28:42,760 Speaker 1: and that alone is a crime that should be 472 00:28:42,880 --> 00:28:46,960 Speaker 1: punishable severely and over and over again. Not to call 473 00:28:47,000 --> 00:28:48,760 Speaker 1: anyone out, but one of the co-founders — I don't 474 00:28:48,760 --> 00:28:50,920 Speaker 1: remember both their names, but one name is always going 475 00:28:50,960 --> 00:28:52,840 Speaker 1: to stick with me forever and ever and ever, because 476 00:28:52,920 --> 00:28:55,160 Speaker 1: it was Brooks Buffington the Third. And how do you 477 00:28:55,240 --> 00:28:58,600 Speaker 1: forget a name like — No, that's an incredible name. That's 478 00:28:58,600 --> 00:29:01,800 Speaker 1: an incredible name, Brooks Buffington the Third. I shouldn't really, 479 00:29:02,040 --> 00:29:04,360 Speaker 1: you know, be calling him out. I live in Atlanta.
He's 480 00:29:04,400 --> 00:29:07,360 Speaker 1: still in Atlanta running businesses and stuff. I mean, I'm 481 00:29:07,400 --> 00:29:10,160 Speaker 1: burning bridges, but whatever. And you know, I don't 482 00:29:10,200 --> 00:29:12,760 Speaker 1: know what Yik Yak is doing, you know, with its return. 483 00:29:12,880 --> 00:29:15,480 Speaker 1: Maybe they have figured it out, and we are in a 484 00:29:15,640 --> 00:29:19,160 Speaker 1: very different world now, in so many ways. So 485 00:29:20,080 --> 00:29:24,160 Speaker 1: let's assume, as we have grown, so have they, 486 00:29:24,440 --> 00:29:27,240 Speaker 1: and you can just marvel at the amazing name 487 00:29:27,680 --> 00:29:33,120 Speaker 1: that is Brooks Buffington. Yeah, phenomenal. I like that. I 488 00:29:33,240 --> 00:29:37,000 Speaker 1: like that perspective. We've got more conversation coming up with 489 00:29:37,040 --> 00:29:48,960 Speaker 1: Baratunde after this short break. I love hearing about 490 00:29:49,000 --> 00:29:51,520 Speaker 1: these stories. This is — it's 491 00:29:51,560 --> 00:29:54,880 Speaker 1: like nourishment for the soul. It's the balm I 492 00:29:55,080 --> 00:30:00,360 Speaker 1: needed in order to kind of just weather everything that's 493 00:30:00,400 --> 00:30:03,960 Speaker 1: going on and not, you know, fall into that little 494 00:30:04,040 --> 00:30:08,600 Speaker 1: puddle in the corner, feeling, yeah, small and helpless 495 00:30:08,720 --> 00:30:11,600 Speaker 1: and knowing that things are wrong, but not knowing how 496 00:30:11,760 --> 00:30:16,800 Speaker 1: to right them, and feeling like the snowball is 497 00:30:16,800 --> 00:30:19,120 Speaker 1: already so large that no one's ever gonna stop it. 498 00:30:19,240 --> 00:30:21,760 Speaker 1: It's going to wipe out the village. We're just 499 00:30:21,880 --> 00:30:24,800 Speaker 1: waiting to see when it happens.
And these are the 500 00:30:24,840 --> 00:30:28,880 Speaker 1: stories that remind me that's not the case. 501 00:30:29,000 --> 00:30:32,800 Speaker 1: That's not the narrative. I tend to abstract too much 502 00:30:34,160 --> 00:30:38,520 Speaker 1: the concept of companies being these monolithic things that somehow 503 00:30:38,600 --> 00:30:41,360 Speaker 1: existed on their own, when in fact they are the 504 00:30:41,960 --> 00:30:45,600 Speaker 1: construction of people — like, actual human beings made these. And 505 00:30:45,840 --> 00:30:49,200 Speaker 1: by forgetting that, you start to — I mean, 506 00:30:49,240 --> 00:30:51,920 Speaker 1: I do it all the time — you start to ascribe 507 00:30:52,000 --> 00:30:56,800 Speaker 1: motivations to things that are not themselves really an entity. 508 00:30:56,920 --> 00:31:00,800 Speaker 1: It's a collection that was built by human beings. And 509 00:31:01,240 --> 00:31:04,120 Speaker 1: it's even possible that the motivations that you're recognizing were 510 00:31:04,200 --> 00:31:07,400 Speaker 1: never consciously in any of those human beings' minds, but 511 00:31:07,840 --> 00:31:10,720 Speaker 1: it's manifested that way. And I think that's healthy, to 512 00:31:10,880 --> 00:31:13,360 Speaker 1: make those realizations, because then you can say, all right, listen, 513 00:31:13,440 --> 00:31:17,000 Speaker 1: I'm not gonna necessarily lay blame on anyone 514 00:31:17,200 --> 00:31:23,560 Speaker 1: here for maliciously approaching this situation. But we still 515 00:31:23,720 --> 00:31:29,720 Speaker 1: have to reckon with the consequences and to adjudicate, to 516 00:31:29,840 --> 00:31:33,200 Speaker 1: decide: are these things we want? Are they positive? Are 517 00:31:33,280 --> 00:31:36,360 Speaker 1: they helpful? Or are they things we can do without?
518 00:31:36,440 --> 00:31:39,080 Speaker 1: Are they things we can change so that we get 519 00:31:39,720 --> 00:31:44,800 Speaker 1: more positive outcomes and more supportive outcomes? And that's 520 00:31:44,840 --> 00:31:48,080 Speaker 1: really the takeaway I get from your show, which is, 521 00:31:48,560 --> 00:31:53,280 Speaker 1: I think, the most glowing review I can give. Five stars. Yes, 522 00:31:53,840 --> 00:31:58,600 Speaker 1: we've got the Strickland five stars! Accomplished. Retire from podcasting, 523 00:31:58,680 --> 00:32:03,720 Speaker 1: I think. Please don't! I need more. I need more. 524 00:32:04,000 --> 00:32:06,440 Speaker 1: I need more. I need, like, an endless number of seasons. 525 00:32:06,760 --> 00:32:10,440 Speaker 1: More is coming, more is coming. Yeah. Well, to 526 00:32:10,560 --> 00:32:12,640 Speaker 1: kind of wrap this up, I want to just sort 527 00:32:12,680 --> 00:32:16,320 Speaker 1: of talk with you about some random little things. 528 00:32:16,440 --> 00:32:22,560 Speaker 1: Like, do you have any favorite — I 529 00:32:22,680 --> 00:32:26,600 Speaker 1: know you talk a lot about technological memories in your episodes, 530 00:32:26,640 --> 00:32:28,960 Speaker 1: but do you have any, like, favorite standouts, like things 531 00:32:29,040 --> 00:32:32,280 Speaker 1: where either something clicked for you or you encountered a 532 00:32:32,360 --> 00:32:34,680 Speaker 1: technology for the first time, just that sort of sense 533 00:32:34,800 --> 00:32:39,160 Speaker 1: of wonder and discovery? Yeah, so there's — I don't 534 00:32:39,640 --> 00:32:42,880 Speaker 1: think that we put this in How to Citizen, 535 00:32:43,280 --> 00:32:46,800 Speaker 1: so this would be a pseudo-exclusive. I've definitely mentioned 536 00:32:46,840 --> 00:32:49,920 Speaker 1: it publicly, but not often. So in high school.
I 537 00:32:50,000 --> 00:32:54,360 Speaker 1: graduated high school — to situate you in my timeline — 538 00:32:54,920 --> 00:32:58,200 Speaker 1: and we got an internet connection at that school in 539 00:32:58,560 --> 00:33:03,760 Speaker 1: probably, possibly, '92: an always-on, T1-level connection. 540 00:33:04,240 --> 00:33:08,760 Speaker 1: So I was banging around in Unix and coding in 541 00:33:08,960 --> 00:33:14,080 Speaker 1: C, poorly, and just bopping around Gopher servers and Usenet 542 00:33:14,160 --> 00:33:18,680 Speaker 1: servers and finding gateways into public libraries. And I 543 00:33:18,760 --> 00:33:21,800 Speaker 1: also worked for the student newspaper at the time, 544 00:33:21,960 --> 00:33:24,200 Speaker 1: and a friend of mine had been expelled from school. 545 00:33:24,680 --> 00:33:29,440 Speaker 1: So these three facts converged. A friend who I felt 546 00:33:29,520 --> 00:33:33,280 Speaker 1: was unjustly expelled from the school — his family sued our school, 547 00:33:34,000 --> 00:33:36,280 Speaker 1: and there was a court case involved, and I was 548 00:33:36,320 --> 00:33:39,600 Speaker 1: called as a witness in this court case. And court 549 00:33:39,640 --> 00:33:42,480 Speaker 1: cases create a lot of public documentation. Most people in 550 00:33:42,520 --> 00:33:46,280 Speaker 1: the school did not know about this lawsuit. Enter my 551 00:33:46,440 --> 00:33:51,640 Speaker 1: new internet skills, and so I started a fake 552 00:33:51,760 --> 00:33:56,000 Speaker 1: email address called the informant at knowledge dot com.
It 553 00:33:56,200 --> 00:33:59,200 Speaker 1: was one of my prouder moments. I was, like, Deep Throat-ing it, 554 00:33:59,360 --> 00:34:04,040 Speaker 1: you know. And I figured out, you know, mail servers 555 00:34:04,080 --> 00:34:07,480 Speaker 1: were very, shall we say, porous at that time in 556 00:34:07,560 --> 00:34:11,360 Speaker 1: our history, and so if you knew the right keyboard 557 00:34:11,400 --> 00:34:14,520 Speaker 1: commands, you could interact with a mail server as if 558 00:34:14,640 --> 00:34:17,719 Speaker 1: you were a mail client and, you know, have the 559 00:34:17,840 --> 00:34:20,359 Speaker 1: right syntax in the right order. And so I kind 560 00:34:20,400 --> 00:34:24,200 Speaker 1: of logged in to this mail server at Yale University — 561 00:34:25,400 --> 00:34:28,359 Speaker 1: minerva dot sys dot yale dot edu. I'm 562 00:34:28,400 --> 00:34:32,759 Speaker 1: sure they've battened down those ports at this stage. And 563 00:34:32,920 --> 00:34:37,800 Speaker 1: so I launched my attack from Yale, 564 00:34:38,239 --> 00:34:40,200 Speaker 1: even though I was in the computer lab in Washington, 565 00:34:40,320 --> 00:34:42,759 Speaker 1: D.C., as I said. Well, friends, I crafted a 566 00:34:42,840 --> 00:34:45,920 Speaker 1: message to my fellow newspaper editors, because I didn't want 567 00:34:45,920 --> 00:34:47,680 Speaker 1: my fingerprints on it, and I wanted them to kind 568 00:34:47,680 --> 00:34:51,719 Speaker 1: of independently decide this is worth talking about without it 569 00:34:51,760 --> 00:34:54,520 Speaker 1: feeling like a favor to me, or without being colored 570 00:34:54,560 --> 00:34:57,680 Speaker 1: by any attitude toward me at all. Because I was 571 00:34:57,760 --> 00:35:00,040 Speaker 1: clearly close to the person who'd been expelled, they'd 572 00:35:00,160 --> 00:35:01,920 Speaker 1: dismiss it.
You know, you're just trying to get your 573 00:35:01,920 --> 00:35:04,440 Speaker 1: friend back in school. So I alerted them to the 574 00:35:04,520 --> 00:35:06,640 Speaker 1: fact — gave them, like, the docket number that they 575 00:35:06,640 --> 00:35:09,400 Speaker 1: could look up in the court system — and they 576 00:35:09,520 --> 00:35:11,759 Speaker 1: ended up, you know, telling the story of 577 00:35:11,840 --> 00:35:16,000 Speaker 1: this lawsuit, which triggered political action on campus and meetings 578 00:35:16,040 --> 00:35:18,800 Speaker 1: and all kinds of stuff. But the ability to 579 00:35:19,600 --> 00:35:23,720 Speaker 1: influence in that way, to discover a form of power — 580 00:35:24,560 --> 00:35:27,520 Speaker 1: I didn't, like, hack into the mail server to destroy it. 581 00:35:28,360 --> 00:35:31,920 Speaker 1: I didn't even create fake information with it. I had 582 00:35:32,040 --> 00:35:35,560 Speaker 1: created a fake header, technically, but, you know, I was trying to 583 00:35:36,200 --> 00:35:40,360 Speaker 1: raise awareness of an issue from my relatively meager position. 584 00:35:40,400 --> 00:35:42,640 Speaker 1: I wasn't a trustee, I wasn't a faculty member, I 585 00:35:42,719 --> 00:35:45,680 Speaker 1: wasn't a parent. But I was an interested and affected 586 00:35:45,760 --> 00:35:48,200 Speaker 1: party in this, and I thought it affected all of us. 587 00:35:48,880 --> 00:35:51,200 Speaker 1: I felt powerful in the moment. I felt like I 588 00:35:51,400 --> 00:35:54,480 Speaker 1: did more than I thought I could, and it just 589 00:35:54,520 --> 00:35:59,000 Speaker 1: felt kind of cool. And I didn't tell anyone for years. 590 00:35:59,120 --> 00:36:01,279 Speaker 1: I just kept my mouth shut. I'm like, I don't 591 00:36:01,280 --> 00:36:02,800 Speaker 1: know what the statute of limitations is. I don't know 592 00:36:02,800 --> 00:36:04,600 Speaker 1: if that was technically a crime.
You know, I'm not 593 00:36:04,640 --> 00:36:06,719 Speaker 1: trying to go to jail for my friend — like, I 594 00:36:06,840 --> 00:36:08,440 Speaker 1: like him a lot, but, you know, he 595 00:36:08,520 --> 00:36:10,600 Speaker 1: wasn't incarcerated. He just had to go to a different school. 596 00:36:11,000 --> 00:36:14,560 Speaker 1: So, measuring all of that — but also just the 597 00:36:15,000 --> 00:36:21,800 Speaker 1: moment of realizing that I could communicate through machines in 598 00:36:21,920 --> 00:36:25,000 Speaker 1: this way, and not just for the sake of communicating 599 00:36:25,040 --> 00:36:27,880 Speaker 1: with machines, but for the sake of affecting the 600 00:36:28,080 --> 00:36:32,080 Speaker 1: IRL universe that I inhabited. That was 601 00:36:32,160 --> 00:36:37,439 Speaker 1: a significant moment in my technology life, in my life overall. Yeah, 602 00:36:37,800 --> 00:36:40,360 Speaker 1: I also like to imagine, although I know it's not 603 00:36:40,440 --> 00:36:42,200 Speaker 1: the case, that's why you had to go to Harvard, 604 00:36:42,239 --> 00:36:45,480 Speaker 1: because Yale was going to be like, oh no. Well, 605 00:36:45,520 --> 00:36:48,719 Speaker 1: I didn't respect them after that, you know. I'm just like, 606 00:36:48,840 --> 00:36:53,120 Speaker 1: you don't take cybersecurity seriously, like, you let a 607 00:36:53,280 --> 00:36:58,600 Speaker 1: child manipulate your tech infrastructure. Come on, a child 608 00:36:59,320 --> 00:37:04,520 Speaker 1: just compromised Okta, the authentication company. Maybe so many people 609 00:37:04,560 --> 00:37:09,239 Speaker 1: shouldn't be dependent on that one service. Yeah, yeah, 610 00:37:09,719 --> 00:37:13,279 Speaker 1: though I won't go down that road. But I 611 00:37:13,440 --> 00:37:16,399 Speaker 1: love that story too.
It speaks to me about sort 612 00:37:16,440 --> 00:37:22,360 Speaker 1: of the pure hacker ethos, this idea of understanding 613 00:37:22,360 --> 00:37:27,720 Speaker 1: how systems work, occasionally using that knowledge to have systems 614 00:37:27,760 --> 00:37:30,200 Speaker 1: do something that they weren't necessarily meant to do. 615 00:37:30,920 --> 00:37:32,920 Speaker 1: It doesn't have to be malicious, it doesn't have to 616 00:37:32,960 --> 00:37:37,520 Speaker 1: be criminal, it doesn't have to be destructive. The term 617 00:37:37,560 --> 00:37:40,000 Speaker 1: hacker obviously gets thrown around a lot in a very 618 00:37:40,120 --> 00:37:43,800 Speaker 1: sinister way, but we understand — those of us who have 619 00:37:43,880 --> 00:37:47,160 Speaker 1: ever kind of put our hand to it — it's more 620 00:37:47,239 --> 00:37:52,800 Speaker 1: about that innate curiosity of how does this thing work? 621 00:37:53,320 --> 00:37:55,800 Speaker 1: And once you understand how it works, like, oh, I 622 00:37:55,920 --> 00:37:58,200 Speaker 1: wonder if it could also do this other thing that 623 00:37:58,360 --> 00:38:02,799 Speaker 1: it's not designed to do? And that, I feel, 624 00:38:02,880 --> 00:38:05,440 Speaker 1: is something that we need to encourage. But unfortunately, 625 00:38:06,280 --> 00:38:09,400 Speaker 1: just the term hacker has such a stigma 626 00:38:09,520 --> 00:38:12,759 Speaker 1: against it that I feel a lot of people immediately 627 00:38:13,000 --> 00:38:15,040 Speaker 1: make judgments as soon as they hear the term being 628 00:38:15,120 --> 00:38:18,880 Speaker 1: thrown around. Yeah — look, we always need 629 00:38:18,960 --> 00:38:22,960 Speaker 1: folks who are going to push things, and sometimes most 630 00:38:23,000 --> 00:38:25,359 Speaker 1: of us don't appreciate it, like, why is that person 631 00:38:25,440 --> 00:38:27,279 Speaker 1: over there pushing things?
I like things just the way 632 00:38:27,360 --> 00:38:30,040 Speaker 1: they are. But then we find out, you know, who 633 00:38:30,120 --> 00:38:33,040 Speaker 1: we are through that process, and like, oh, I guess 634 00:38:33,080 --> 00:38:35,440 Speaker 1: we don't have to always do it this way. Now, 635 00:38:35,600 --> 00:38:38,600 Speaker 1: you know, forging emails isn't, like, a thing that I 636 00:38:38,680 --> 00:38:40,920 Speaker 1: want all of us to be doing on a constant basis, 637 00:38:41,760 --> 00:38:46,640 Speaker 1: so the lesson is more metaphorical. But, you know, 638 00:38:47,640 --> 00:38:52,160 Speaker 1: being able to find your voice, being able to express it, 639 00:38:52,920 --> 00:38:56,359 Speaker 1: caring enough about, you know, a situation to try 640 00:38:56,440 --> 00:39:00,480 Speaker 1: to influence other people's perception or even awareness of that situation — 641 00:39:00,840 --> 00:39:04,840 Speaker 1: I mean, that's the same thing that drives something much bigger, 642 00:39:05,320 --> 00:39:08,160 Speaker 1: like a #MeToo or a Black Lives Matter.
Right, it's 643 00:39:08,239 --> 00:39:11,280 Speaker 1: just — and you see all kinds of social media usage 644 00:39:11,320 --> 00:39:14,799 Speaker 1: to amplify awareness of situations, as well as 645 00:39:14,800 --> 00:39:20,640 Speaker 1: sort of secure journalistic communications and, you know, very anonymized, 646 00:39:21,000 --> 00:39:27,480 Speaker 1: Tor-based file-drop services to allow folks 647 00:39:28,040 --> 00:39:31,239 Speaker 1: in a weakened position to still hold power accountable in 648 00:39:31,320 --> 00:39:34,640 Speaker 1: some way, especially in, like, regimes that don't allow anything 649 00:39:34,680 --> 00:39:37,200 Speaker 1: approaching the level of freedom of expression we have in 650 00:39:37,239 --> 00:39:42,480 Speaker 1: the US. And that's super important work and will 651 00:39:42,520 --> 00:39:44,560 Speaker 1: continue to be so, I suspect, for as long as 652 00:39:44,800 --> 00:39:47,279 Speaker 1: we have more than one person on the planet. Mm hm, 653 00:39:47,400 --> 00:39:49,640 Speaker 1: I agree. And then we're even seeing it if you 654 00:39:49,719 --> 00:39:52,000 Speaker 1: go one step removed from the tech itself. We're seeing 655 00:39:52,000 --> 00:39:55,400 Speaker 1: it in the tech companies as we're watching a growing 656 00:39:55,560 --> 00:40:00,480 Speaker 1: movement among employees toward organization, toward unionization. I mean, the 657 00:40:00,640 --> 00:40:04,800 Speaker 1: iHeart podcast union formed not long ago, and that was 658 00:40:04,880 --> 00:40:07,760 Speaker 1: really exciting. And we're seeing it in efforts at Amazon.
659 00:40:07,880 --> 00:40:11,520 Speaker 1: We're seeing it in efforts at parts of Activision Blizzard, um, 660 00:40:12,040 --> 00:40:14,239 Speaker 1: you know, this is a really exciting time 661 00:40:15,200 --> 00:40:19,000 Speaker 1: from that perspective too, because I see this growing 662 00:40:19,320 --> 00:40:24,239 Speaker 1: understanding that the balance has been out of whack for 663 00:40:24,360 --> 00:40:28,160 Speaker 1: far too long and that it has been benefiting far 664 00:40:28,239 --> 00:40:31,960 Speaker 1: too few people in the process. And I feel like 665 00:40:32,080 --> 00:40:35,000 Speaker 1: we're getting to maybe not a full correction, but we're 666 00:40:35,000 --> 00:40:38,200 Speaker 1: at least reckoning with it, and I'm hopeful to see 667 00:40:38,280 --> 00:40:41,239 Speaker 1: that continue. Like, that's another one of those trends I'm 668 00:40:41,239 --> 00:40:44,279 Speaker 1: excited to be alive to see, because we were, 669 00:40:44,320 --> 00:40:47,120 Speaker 1: we were also growing up in the Reagan 670 00:40:47,200 --> 00:40:50,440 Speaker 1: era where a lot of those systems were dismantled. That's right.
671 00:40:50,520 --> 00:40:53,640 Speaker 1: Yes, the way that workers... we have an episode of 672 00:40:53,719 --> 00:40:59,160 Speaker 1: season three, Shall We Whang, focused on not just workers 673 00:40:59,320 --> 00:41:06,800 Speaker 1: organizing around better compensation, better benefits, like those typical contract terms, 674 00:41:07,400 --> 00:41:11,680 Speaker 1: but organizing around their own sense of power and purpose. 675 00:41:12,800 --> 00:41:17,960 Speaker 1: And technical Silicon Valley employees are highly valued financially, um, 676 00:41:18,480 --> 00:41:22,200 Speaker 1: and yet they're starting to recognize the limits of financial 677 00:41:22,320 --> 00:41:25,320 Speaker 1: recognition of their value, and what do I want to 678 00:41:25,360 --> 00:41:28,759 Speaker 1: be doing with my power? And then expanding the lens 679 00:41:28,800 --> 00:41:31,239 Speaker 1: on what a technology worker is. You know, if you 680 00:41:32,000 --> 00:41:35,800 Speaker 1: drive for Uber Eats, you're also a technology worker, and, 681 00:41:36,360 --> 00:41:41,600 Speaker 1: you know, just letting people claim that language for themselves. 682 00:41:42,120 --> 00:41:46,960 Speaker 1: Gig workers, gig workers... "gig," that's pretty dismissive, unintentionally, I think, 683 00:41:47,680 --> 00:41:50,520 Speaker 1: but it has a consequence of minimizing the labor of 684 00:41:50,640 --> 00:41:52,880 Speaker 1: some versus others. And if you can build a bridge 685 00:41:52,920 --> 00:41:56,280 Speaker 1: between the coder and the driver, you know, the coder 686 00:41:56,360 --> 00:42:00,960 Speaker 1: and the courier, then you've just magnified your community and 687 00:42:01,080 --> 00:42:04,560 Speaker 1: your potential power. And once we change who we relate to, 688 00:42:05,320 --> 00:42:08,319 Speaker 1: all kinds of new things are possible.
So there's been 689 00:42:08,320 --> 00:42:12,560 Speaker 1: an awakening that's happening, too, that's not just about contract terms, um, 690 00:42:12,640 --> 00:42:16,200 Speaker 1: and it's much looser and broader, and I think potentially 691 00:42:16,239 --> 00:42:20,680 Speaker 1: even more powerful, though contract terms very much matter. Yes, 692 00:42:21,320 --> 00:42:23,880 Speaker 1: I feel the same. I don't want to keep you 693 00:42:23,960 --> 00:42:27,399 Speaker 1: any longer because I am very cognizant of your time. 694 00:42:27,760 --> 00:42:31,640 Speaker 1: But I could chatter on about all these things for hours, 695 00:42:31,920 --> 00:42:35,920 Speaker 1: and, uh, every time I'm like, no, no, stop, stop, stop, 696 00:42:36,239 --> 00:42:39,880 Speaker 1: Jonathan, Chatty Cathy, you're gonna... you're gonna let him 697 00:42:39,960 --> 00:42:41,919 Speaker 1: out of here. You can have him back on the show 698 00:42:42,040 --> 00:42:43,560 Speaker 1: at some future point if he wants to come back. And 699 00:42:44,320 --> 00:42:47,520 Speaker 1: I would happily come back. That's an easy, easy yes. 700 00:42:47,719 --> 00:42:49,640 Speaker 1: Thank you. I would love to have you 701 00:42:49,760 --> 00:42:52,200 Speaker 1: back on the show. Absolutely. I mean, you're a busy man, 702 00:42:52,320 --> 00:42:55,920 Speaker 1: so we will talk when you've got time to breathe 703 00:42:56,280 --> 00:42:59,960 Speaker 1: and, uh, to engage in life in a meaningful way. 704 00:43:00,239 --> 00:43:03,120 Speaker 1: I don't want anyone to be like, oh, I gotta 705 00:43:03,160 --> 00:43:05,920 Speaker 1: get back on Jonathan's show, I can't, you know, go 706 00:43:06,040 --> 00:43:10,279 Speaker 1: outside and smell the flowers. None of that. Um, but 707 00:43:10,480 --> 00:43:13,879 Speaker 1: thank you so much for joining the show.
This has 708 00:43:13,920 --> 00:43:18,200 Speaker 1: been a truly enjoyable conversation. As it has been for 709 00:43:18,320 --> 00:43:20,239 Speaker 1: me as well. Thank you for taking me back to 710 00:43:20,360 --> 00:43:24,239 Speaker 1: south By Southwest sixteen, back to high school, which is 711 00:43:24,280 --> 00:43:27,920 Speaker 1: not always a pleasant revisitation because it's high school and 712 00:43:28,040 --> 00:43:31,680 Speaker 1: everyone knows what that means: a generally unpleasant time in our lives. Um. 713 00:43:31,840 --> 00:43:34,160 Speaker 1: And yeah, and thank you for what you're doing with 714 00:43:34,280 --> 00:43:38,320 Speaker 1: your show to explain, um, why all this stuff matters, 715 00:43:38,360 --> 00:43:41,160 Speaker 1: even, to some degree, what it is, because I think 716 00:43:41,160 --> 00:43:45,920 Speaker 1: it's such an important center of power and community and creativity. 717 00:43:46,640 --> 00:43:49,360 Speaker 1: It's not just tech stuff anymore, but it's, you know, this, 718 00:43:49,520 --> 00:43:51,960 Speaker 1: this locus. So you've been at this for a while 719 00:43:52,120 --> 00:43:56,480 Speaker 1: and I appreciate what you've been contributing as well. Excellent. Well, 720 00:43:56,520 --> 00:44:00,080 Speaker 1: the show is How to Citizen. It's available anywhere you 721 00:44:00,200 --> 00:44:04,160 Speaker 1: get podcasts. You absolutely have to listen to it. 722 00:44:04,239 --> 00:44:06,520 Speaker 1: I guarantee after one episode you'll be hooked and you'll 723 00:44:06,560 --> 00:44:10,040 Speaker 1: just binge them. Um, yeah, you know. And if 724 00:44:10,080 --> 00:44:12,120 Speaker 1: you're like me, then you're gonna listen to them at 725 00:44:12,120 --> 00:44:14,640 Speaker 1: half speed because you don't want them to end. You 726 00:44:14,800 --> 00:44:18,680 Speaker 1: listen to them at half speed.
Wait, what? I've 727 00:44:18,800 --> 00:44:22,440 Speaker 1: never heard of that. I've only known people, including myself, 728 00:44:22,520 --> 00:44:27,960 Speaker 1: who accelerate, not decelerate, the playback. Your shows can be so 729 00:44:28,239 --> 00:44:32,399 Speaker 1: packed with such important stuff, I don't want to miss 730 00:44:32,480 --> 00:44:34,759 Speaker 1: any of it. This is a good note. Maybe we 731 00:44:34,760 --> 00:44:39,840 Speaker 1: should spread it out a little bit. Well, thank you again. 732 00:44:40,640 --> 00:44:43,040 Speaker 1: I look forward to having you back, and, uh, I look 733 00:44:43,080 --> 00:44:45,400 Speaker 1: forward to my listeners checking out your show. So do 734 00:44:45,560 --> 00:44:48,000 Speaker 1: I. Thank you for having me. Enjoy your day and 735 00:44:48,120 --> 00:44:53,080 Speaker 1: the rest of your podcasting life and beyond. Thanks again 736 00:44:53,239 --> 00:44:55,759 Speaker 1: to Baratunde for joining the show. I could have 737 00:44:55,880 --> 00:44:58,720 Speaker 1: talked with him for hours. I think that was pretty 738 00:44:59,000 --> 00:45:02,560 Speaker 1: clear in the episode, uh, and he probably would 739 00:45:02,560 --> 00:45:05,279 Speaker 1: have let me because he's a nice guy. But, you know, 740 00:45:05,719 --> 00:45:10,000 Speaker 1: I should not monopolize anyone's time, including yours, dear listeners. 741 00:45:10,400 --> 00:45:13,000 Speaker 1: So I'll leave you with this: his show How to 742 00:45:13,120 --> 00:45:17,000 Speaker 1: Citizen is in fact available on all podcasting platforms. I 743 00:45:17,080 --> 00:45:19,279 Speaker 1: really do urge you to go out and listen to it, 744 00:45:19,880 --> 00:45:24,040 Speaker 1: particularly if you've ever felt yourself wondering what can one 745 00:45:24,120 --> 00:45:26,239 Speaker 1: person do that can make a difference.
I know that 746 00:45:26,320 --> 00:45:29,480 Speaker 1: there have been times where I felt kind of lost 747 00:45:29,600 --> 00:45:33,120 Speaker 1: and helpless and that the problems are just so big 748 00:45:33,760 --> 00:45:36,160 Speaker 1: that there's nothing I can really do to make a difference, 749 00:45:36,239 --> 00:45:39,080 Speaker 1: and it's really discouraging. Well, it turns out 750 00:45:39,120 --> 00:45:41,279 Speaker 1: there actually are a lot of ways that you can 751 00:45:41,360 --> 00:45:45,239 Speaker 1: make a difference, and his podcast kind of focuses on 752 00:45:45,480 --> 00:45:48,800 Speaker 1: that in a way that, you know, is actionable. So 753 00:45:49,239 --> 00:45:51,680 Speaker 1: I highly recommend you check it out, especially if you need 754 00:45:51,719 --> 00:45:53,960 Speaker 1: a pick-me-up. I mean, they do tackle some 755 00:45:54,200 --> 00:45:57,680 Speaker 1: tough problems on that show, but it's with that sort 756 00:45:57,719 --> 00:46:02,560 Speaker 1: of optimistic view, always a goal of doing better, and 757 00:46:02,760 --> 00:46:06,320 Speaker 1: I really admire that. If you have suggestions for topics 758 00:46:06,480 --> 00:46:08,800 Speaker 1: I should cover on future episodes of Tech Stuff, or 759 00:46:09,000 --> 00:46:12,759 Speaker 1: even guests I should invite on the show, please reach 760 00:46:12,800 --> 00:46:14,759 Speaker 1: out to me. The best way to do that is 761 00:46:14,840 --> 00:46:17,919 Speaker 1: over on Twitter. The handle for the show is Tech 762 00:46:17,960 --> 00:46:21,000 Speaker 1: Stuff HSW, and I'll talk to you again 763 00:46:21,800 --> 00:46:30,080 Speaker 1: really soon. Tech Stuff is an iHeartRadio production.
764 00:46:30,360 --> 00:46:33,120 Speaker 1: For more podcasts from iHeartRadio, visit the 765 00:46:33,280 --> 00:46:36,440 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 766 00:46:36,560 --> 00:46:37,480 Speaker 1: your favorite shows.