Speaker 1: Welcome to TechStuff. This is the story today: a conversation with Yasmin Green, the CEO of Jigsaw. Jigsaw is a unit within Google that focuses on finding ways to harness technology as a solution for societal issues at a global and local scale. The Jigsaw team doesn't have any revenue goals, and Google grants it the time and resources to study intricate problems around the world, from violent extremism to political censorship, to determine why they're happening and what could be done about them, all from a technology- and solution-oriented point of view. Jigsaw's most recent project took Yasmin and her team to a small city in Kentucky called Bowling Green. The challenge was to use AI tools to create a town hall meeting for the technological age: civic engagement that can be seen, heard, and considered, all beyond the walls of city hall. I've known Yasmin Green for many years, and in an age where big tech consistently drives negative headlines, I wanted to hear from someone on the inside working to find best-case scenarios for the deluge of new tools and technologies that we're now contending with. We sat down to talk about Bowling Green and Jigsaw's past and future ambitions. Yasmin Green, welcome back. Well, last time we spoke was on Sleepwalkers. This is TechStuff, but welcome back to the studio.

Speaker 2: I follow you around from podcast to podcast.

Speaker 1: I think it's the other way around. Tell me a bit about Jigsaw and your mission there.

Speaker 2: Yeah, Jigsaw is an incubator inside Google. We develop technology to give people voice and choice in the world around them. And I have the privilege of being the CEO.

Speaker 1: And last time we met you were the director of research, and now you're the CEO.

Speaker 2: I have broader shoulders now. My voice is a little lower.

Speaker 1: But what have you brought to the new role? Like, how have you put your stamp on the organization?

Speaker 2: So I came to New York actually fourteen years ago to help start Jigsaw.
So I feel like, well, I bleed Jigsaw. I've been with Jigsaw from the beginning, and then three years ago I took over as CEO, and it just coincided with a time of incredible change and volatility. One of those things is the mainstreaming of conversational AI: both, I think, the power of the AI models and also the kind of public discourse and public awareness about AI. And that really changed, I think, the tempo inside all of the tech companies around releasing and iterating. You know, the near-term gain is really clear, and then maybe the long-term pain is less in focus. But that's kind of what we think we exist to do at Jigsaw: be where the major tech companies and Google aren't. Our highest and best use is the things that other people aren't doing right now.

Speaker 1: Last time we met, we talked about a program you ran to serve targeted ads to people considering the path of terrorism.

Speaker 2: Right, yeah. When we started working on radicalization, which was one of our first focus areas over a decade ago, the idea that the Internet would have anything to do with why anyone would go and join the mujahideen in Afghanistan was so unfathomable, both to people who were experts on, you know, Islamism and to people who were experts in the Internet. Now those areas of extremism and hate are not solved problems, but they're also not in anyone's blind spots: they are very crowded, they're the subject of regulation, et cetera. And now we're much more interested in: what if this all goes right, and we do build this incredibly powerful technology and it's prevalent? How do we make sure that people have a voice and choice in the world around them? Because it's not inevitable that that will come along with the intelligence.

Speaker 1: How did your sort of personal experience prepare you for this role?
I know you worked at Google beforehand, I think in sales in the Middle East and Africa and Europe, and I know that you left Iran as a young child as well. I mean, how has your personal experience contributed to you being the technologist that you are today?

Speaker 2: I was based in Paris for Google, managing strategy and ops for the sales teams. And I got, of course, the question: do you want to come to New York and help start Jigsaw? And the original frame of Jigsaw was actually kind of looking at geopolitical threats. The revolution that happened in Iran, which led to me leaving, was one where the religious extremists took over. Those were violent extremists, you know, and sometimes they make big moves, and they take over land, and they can oppress people for such a long time. We had this view that was prevalent in tech then, which was: the Internet will be a democratizing technology. Don't worry, it's going to be bumpy along the way, but one thing you can be sure of is that this is going to bring people together and connect them to information. And I was like, yeah, I think there are a lot of people who are going to have their own ideas about that. And so I wanted to work on things to do with the Internet that were very discerning about what might happen. So I think it's this idea of always looking around the corner and asking: where's crowded now, and where aren't people looking? And having the luxury of this incredible group of people thinking about what's going to happen next, and, you know, having the resources of Google to try to affect that.

Speaker 1: Jigsaw is a division of Google that I don't think has any revenue. It doesn't have a revenue line, right?

Speaker 2: No, it does not.

Speaker 1: And Google's now in this moment where I think we'll have a lot more questions about the larger role of technology companies in our society.
But then also the business model, with search and with the rise of AI, is more threatened. So how does an organization like the one you lead deal with those two things? On the one hand, maybe being at odds with, or surfacing things that, the parent company doesn't want surfaced; on the other hand, operating in a more economically constrained environment where there are cuts and buyouts and stuff going on. How do those two things affect you day to day?

Speaker 2: I mean, there are so many parts of all the tech companies that work on the things that are challenges. You know, they have this massive trust and safety machinery that is looking all the time at, like, a trillion pieces of content, everything from child sexual abuse imagery to, you know, the worst of it, and scams. So the tech companies are full of people who are looking at where things aren't going well and trying to affect them. I think maybe the thing that's different is that Jigsaw is providing a public commentary, and there I think it probably helps that we have our own brand. And so we go off and do our things, and sometimes we try things that don't work, in which case Google's not too bothered. And then when we do things that really are very significant contributions to helping make people better off with the Internet and AI, then Google's like: this is Google Jigsaw, everyone, we take full credit. So it seems to be a good kind of arrangement that really works for Google. But you're right that we often are looking at what is not necessarily going to go well. It is in Google's long-term interest to have a group that does that and makes significant progress that they can rightfully take credit for.

Speaker 1: In a sense, you're not red-teaming particular products, but you're red-teaming the culture of the wider tech industry.
Speaker 2: Yeah, and specifically with the view to, like, what are we going to do about it? And I assure you that there are many, many teams that are red-teaming internally. They just don't come and talk to you about it; or maybe they can. Maybe I've actually got you an advocate at work. That was so great. That was really great.

Speaker 1: Can you describe for our audience at TechStuff what ethnographic research looks like, and how you used it for your project in Bowling Green, Kentucky?

Speaker 2: Yeah. Depending on where you sit, you're either impressed that Jigsaw does it or you're kind of appalled that not every part of a tech company does this. But it's basically the very patient type of research that involves going in and spending time with people and observing how they live in their community and how they make sense of using the Internet, which is the part we care about, as part of an existence that obviously extends beyond technology: who they are, what their identity is, what they do, who their family is, what their community is, how they identify. And so we bring that wide-angle lens to how we think about challenges. So we had built this big, really helpful AI tool that was used by platforms and publishers around the Internet to try to have conversations that serve their communities. But last year we thought to ourselves: maybe we could help enable large conversations that happen not just online between people, but between policymakers and their constituents. And we ended up doing something, which we've just completed and which I'm excited to talk about, in Kentucky, in this town called Bowling Green. We kind of understood that they wanted to have a town-wide conversation, but nobody participated; there was no public input when people were given space to contribute.

Speaker 1: In fact, the only place you obviously see a town hall in America is, you know, the ones CNN or Fox News use for the election, right?
The concept of the town hall is more or less a cliché. So how do you use technology to actually have what the town hall would have been in premodernity?

Speaker 2: Okay. So we went to meet with the judge-executive of this county, which is like the mayor of the county, who's called Doug Gorman. Incredible. I'm going to call him Doug. This town is doubling in size over the next twenty years. It's going from about one hundred and forty-eight thousand; I might not be getting those numbers right, it might be doubling even more. But it's basically like adding another Bowling Green to the county, which sounds great: growth and jobs and development. And at the same time, there's a mixed bag of emotions in the community. So there's a big farming constituency, and they are concerned about land use and the irreversible decision to let people develop on your land. Lifelong residents are concerned about preserving their culture and their buildings. And there's a lot of concern about migration. And it's interesting that in Kentucky and Bowling Green, the concern is not about migration from foreigners but from people from Nashville, Tennessee. Really, they're like, we don't want to become Nashville, extended up here. So these are all valid concerns that they have, and they're going to have all this growth. And actually, the words from Doug capture it best. He said to them: do we want this growth to happen to us, or do we want this growth to happen for us? If we want it to happen for us, we have to come together and set a vision for Bowling Green. Which is really amazing. But then you ask him: how's it going with, you know, town halls? And he's like, yeah, we get about eight or nine people. And so we're like, maybe we can help, because we have a lot of experience with tech and large-scale conversations. But it always starts, product development always starts, with fieldwork.
So we went to Bowling Green to spend time with people, people who are really marginalized from politics, like the elderly, students, recent migrants, et cetera, people who've just left incarceration, to figure out what would stop them from participating. And so we did things like: we took a couple to a tree-planting meeting. We took this college graduate to a city planning commission get-together about whether someone should be allowed to add to the back of their pub. We took all of them, so you can observe what is holding them back. And the things that they said resonated so much with me. They don't think they have the expertise, they don't think they matter, they don't think that if they use their voice it will be listened to. And it's all stuff we can relate to. Who hasn't been in a large meeting where you're like: I don't really understand what we're talking about, or I don't understand my role in this discussion, or, if I use my voice, is my boss going to hear me, is it going to go anywhere? So we can all relate to it at that level. The fact is that people really do show up for their local community, whether it's their refugee community or their LGBTQ community or the sports club or the church. But then when you come to the town conversation, the civic one, it's: oh, I don't know, I don't look or sound the part. You know, the college student we took to the city planning commission thing, he was like, I can't go to that, I'm not a lawyer. And of course, when he went, there were no lawyers. But that's what we have in our heads: I'm not the expert.

Speaker 1: Can you tell me a little bit more about Perspective API?
I know it's developed by Jigsaw, and it's now used by Wikipedia and The Wall Street Journal. Essentially, what it does is moderate comment sections, using AI to understand the impact any given comment might have on other commenters or participants and to score that impact. But I'm curious: is the Bowling Green project an extension of Jigsaw's work on Perspective?

Speaker 2: It's hard to even remember that, you know, nine or ten years ago, all these publishers were closing their comment spaces because they couldn't even have a civil conversation. And so Perspective was, you know, an AI that helps with managing online conversations. It rates all different types of things, everything from ad hominem attacks to whether something is constructive, and it scores them so publishers can decide what kind of conversation they want to have. But in the case of policymakers and, you know, the people they represent, they need to be in conversation too. There's this quote that I love, which is that conversation is the soul of democracy. What gives democracies legitimacy is our ability to have a free and open exchange of ideas. And so the thing that we wanted to help Doug with, which kind of informed what we developed, which we call sensemaking, was: he wants to tell his people, come and be in a conversation with me, and people weren't showing up.

Speaker 1: So it wasn't "there's an existing conversation happening; how do we make it more inclusive?" It was "how do we create the environment for a conversation to happen?"

Speaker 2: Yes, and they need to believe that he will hear their voice if they express it. People who study this say there's only one thing that's worse than not asking people for their opinion...

Speaker 1: Asking people and then ignoring them, because that's truly demotivating.

Speaker 2: So we realized that he would need to be able to tell them that their voice would be heard.
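For a concrete sense of what Perspective's scoring looks like from a publisher's side, here is a minimal sketch of a call to the public Comment Analyzer endpoint. The endpoint, attribute names, and response shape are Perspective's documented ones; the API key, comment text, and moderation threshold are placeholders.

```python
import requests

API_KEY = "YOUR_PERSPECTIVE_API_KEY"  # placeholder
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def score_comment(text: str) -> dict:
    """Ask Perspective to score one comment on a couple of attributes."""
    payload = {
        "comment": {"text": text},
        # TOXICITY and INSULT are two of Perspective's production attributes.
        "requestedAttributes": {"TOXICITY": {}, "INSULT": {}},
    }
    resp = requests.post(URL, json=payload, timeout=10)
    resp.raise_for_status()
    scores = resp.json()["attributeScores"]
    return {name: s["summaryScore"]["value"] for name, s in scores.items()}

# A publisher might hold high-scoring comments for human review.
if score_comment("You're an idiot and your plan is garbage.")["TOXICITY"] > 0.8:
    print("hold for moderation")
```

The design point Green keeps returning to is that Perspective only scores; each publisher decides what kind of conversation to have with those scores.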
Speaker 2: So we ended up working with a platform called Polis to solicit input, and it was just open-ended, open-ended answers, so it was really in your own words and on your own terms. People were free to say what they wanted, they were free to do it in their own time, in their bedroom, you know, and anonymously. So a lot of what we had heard in the ethnography, we were trying to account for. My team sat down with two sisters who had recently relocated to Bowling Green from Afghanistan, in their home, over tea, with the headscarves and everything, and asked them: what would it take for you to come and be part of this conversation? And they had no idea what civic participation even referred to; the idea of attending a town hall, never mind being asked to speak, was complete anathema to them. So this conversation that we set up, which is called What Could Bowling Green Be, which we did with this incredible partner on the ground called Innovation Engine, was designed to be: you are free to say what you want, where you want, whenever you want. And it was a month-long conversation, and we did it through this Polis platform, and the idea was: speak and you'll be heard. And the part that we used AI for was making sense of what everybody said, in a way that could be shared back to them, so they would see their voices in the conversation, and could be shared back with the policymaker, so he could take action.

Speaker 1: After the break, the largest town hall in US history and how Jigsaw made it happen.

Speaker 1: Welcome back to TechStuff. Before the break, we were talking with Yasmin Green, CEO of Jigsaw, about how the initial pieces of an ambitious project using AI to create a more inclusive and active town hall came together in the town of Bowling Green, Kentucky.
So, taking a few steps back, how does this come about? Like, how do you choose a community to partner with? And I guess, what are the stakes of working with a government versus a purely online arena? Like, how much pressure, and also opportunity, does that add?

Speaker 2: Kentucky's actually a really pioneering state, and I know a bit about it because I've now spent so much time there. Across the state, even outside Bowling Green, they're doing a lot of civic tech work, and they're very forward-thinking. And Bowling Green in particular was known because they had done something several years ago where they did public participation with the local newspaper. And we had heard that they were putting together this big vision for twenty fifty.

Speaker 1: So: what do we want Bowling Green to be? Yeah, so you'd heard about this; you knew that they were forward-thinking.

Speaker 2: Yes. And we were like, maybe we can actually come and give them something really useful; maybe we can co-develop something with them. And so they said, okay, let's do this month-long conversation to hear what people have to say. And then they actually did this super local, ground-game marketing campaign.

Speaker 1: Were you involved with the marketing campaign?

Speaker 2: Yeah, we partnered with them, but it was local creative agencies that came up with the creative.

Speaker 1: So, basically: get people to be aware, think it was fun, evangelize, interact with the platform.

Speaker 2: Yeah, exactly. And the main thing was just to be local. So things were translated into nine languages. They put flyers in international supermarkets, they went to megachurches, they went to basketball games.

Speaker 1: So you created awareness, and then were there, like, QR codes that people scanned on their phones that would take them into the conversational platform? Like, what was the...
Speaker 2: Yeah, basically, yes, there were QR codes. You go to What Could BG Be; in fact, you can still visit it. What Could Bowling Green Be is the website, whatcouldbgbe.com, and you can see the report. But people would log in, and then you just get a question prompt: what do you want for Bowling Green in the future? And then, depending on that, you put your answers in.

Speaker 1: Only one single prompt?

Speaker 2: There's one question, and you could keep putting in more of what you want. You probably have lots; it's pretty open. You might have ideas for education or infrastructure, you know, restaurants. But then there's voting, so you vote on other people's input. And that was a really important signal, to see where, if anywhere, we agree. And we put so much effort into trying to make it, a, local, and, b, accessible, and still, at the beginning of the month-long conversation, we were like: is anyone going to turn up to this? Because maybe we do all this effort and it's eight people.

Speaker 1: Did you picture a tumbleweed?

Speaker 2: Right, right. So the baseline's eight, okay. So we were like, thirty will be good. Thirty will be good. And then we were like, if we could get one thousand or two thousand... And it was eight...

Speaker 1: Thousand. Eight thousand respondents.

Speaker 2: This kind of AI-enabled town hall was a thousand times larger than the regular town hall, and basically the equivalent of one in every ten residents participated. The folks on the ground say it meant that people were discussing this topic even outside of the actual platform, because there was that much participation; it kind of passed the tipping point. So it was really nice to see the numbers go up.
They did go up; they did spike a bit towards the end. And it was actually the largest digital town hall in US history, the most that anyone's come together, and that's us partnering with one town in Kentucky. So imagine what's possible at the state level, or across states, if you bring people together to share their views. Although I was the one advocating beforehand for more of these types of civic conversations, I think the first big apprehension was: is anyone going to show up? The second was: well, if this is a scaled version of an in-person town hall, we might be in trouble, because the eight people who come to an in-person town hall are typically detractors, right? The ambivalent people don't show up, and people who quite like the idea but could also see why not, that's not the profile that gets you motivated to get off the couch, drive all the way there, and wait. It's usually people who really don't want the thing to happen. So I think local leaders were like, well, we're just going to get eight thousand of those, because they didn't want it to scale in that sense. And that was why it was so important to do such a broad marketing campaign across all these different constituencies. But there was a big sigh of relief from local leaders when they looked at the feedback, and a lot of it was stuff they anticipated: in a growing town, you need more infrastructure, you need more roads. Lots of really interesting thoughts on education and on restaurants, and interesting, kind of unexpected, proposals on a number of fronts. But the one that makes me giggle is in the area of food and restaurants. One person submitted an idea to pass a law mandating that fast-food restaurants include ketchup sachets in their takeaway bags.
So, you know, that's how you know that people are really expressing their views. It got mixed views but didn't have much support.

Speaker 1: How did people express support for one another's ideas? Like, I go on the platform and answer: I want more free ketchup sachets. Now, I don't live in Bowling Green; I read that it was geofenced, otherwise it would have been me. But is that then available for other people to comment on? Okay, so is it like a message board? It's not just one-way?

Speaker 2: So there's routing. You put your idea in, and then they route ideas to you to vote on. And you actually get a much higher number of votes than of actual submissions, because it's much easier to vote, and you're kind of in there, and it's a little addictive. And I think that's one of the really, really cool things about this conversation: it was designed to advance one very large conversation. It wasn't designed to have, like, eight thousand one-on-ones; that's not what we were trying to do. We were trying to have one large conversation. And that's interesting, because there isn't the option to reply, the way the Polis platform is set up. You have an idea, and you can vote on other people's ideas, and if you really don't like the ketchup idea and you actually think it should be mustard, put your idea in there. But you're not going to get into some, you know, back-and-forth with the person who suggested ketchup, who is, you know, from the wrong part of town, and everyone's like that in that part of town. There's no point going down that spiral of ad hominem attacks, et cetera, that happens a lot on social media.
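To make the mechanics Green describes concrete, here is a toy sketch of that Polis-style design: open-ended submissions, agree/disagree votes routed to participants, and deliberately no reply threads. All names are hypothetical; this illustrates the general structure, not Polis's actual code.

```python
import random
from collections import defaultdict

class Conversation:
    """Toy Polis-style conversation: statements and votes, no replies."""

    def __init__(self):
        self.statements = {}            # statement id -> text
        self.votes = defaultdict(dict)  # statement id -> {participant: +1/-1}

    def submit(self, text: str) -> int:
        """Add an open-ended idea; there is no reply mechanism by design."""
        sid = len(self.statements)
        self.statements[sid] = text
        return sid

    def next_statement(self, participant: str):
        """Route the participant a statement they haven't voted on yet."""
        unseen = [s for s in self.statements if participant not in self.votes[s]]
        return random.choice(unseen) if unseen else None

    def vote(self, participant: str, sid: int, agree: bool) -> None:
        self.votes[sid][participant] = 1 if agree else -1

    def agreement(self, sid: int) -> float:
        """Share of voters who agreed with this statement."""
        vs = list(self.votes[sid].values())
        return sum(v > 0 for v in vs) / len(vs) if vs else 0.0
```

Because voting is far cheaper than writing, a model like this naturally collects many more votes than submissions, which is exactly the signal used to ask where, if anywhere, the town agrees.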
Speaker 2: So the conversation's designed differently from the get-go: you put in your ideas, you vote, and if you want to keep going after you've given your thumbs-up or thumbs-down, you're welcome to put your own idea in there. And then we had to take stock: did any of these ideas resonate with other people? So there were four thousand policy proposals, and over half of them had near-universal support.

Speaker 1: Near-universal support for over half. So, in other words, what you take from it is that the community knows what's good for it.

Speaker 2: Well, yeah. And that, you know, there are so many opportunities for us to hear only from the people who are the angriest or the least well informed.

Speaker 1: Did you know the voter registration of the participants? I mean, was this...

Speaker 2: No, but it's a purplish place, so it's quite a mix. And you saw some of the things that you see, you know, nationally, but most of the things that local people care about are, you know: what is going to happen to their K-through-12 education? It turns out they want more vocational skills. You know, they want their university, which is the pride of the place, Western Kentucky University, to be more integrated into their economy and their workforce development. There are a lot of things that they want. And the perspective of not just Doug, actually, but even private-sector leaders was: that's great. Now we can go and lead and build and make changes knowing that we have broad public support. Otherwise, if you're just basing your sentiment-gauging on public calls and social media, you might think that no one wants anything to happen, or that you might have some supporters but everyone else is an avid detractor. So I think those are two really important lessons.
One: people really will show up; they really do want to shape their future. And two: there is more that unites us than divides us, which is kind of imperative for us to keep in our minds and cultivate at this moment.

Speaker 1: And what was Jigsaw's role, or technology deployment? Because the marketing campaign was handled by local agencies, and the Polis platform is not a Jigsaw platform, right? So what was the layer that you guys added?

Speaker 2: It was the making sense of the conversation. If you can get lots of people to contribute, how can you guarantee that their voice will be heard? And so it really is using, in this case, Google's AI, Gemini, using AI for what it does best, which is to take large amounts of information that are expressed in a very local and personal way, understand it, organize it, sort it, prioritize it, and then give it back to people. So, you know, if you go to whatcouldbgbe.com, you'll see the report, and it's like Google Maps. We think of it as Google Maps: you start very high level, like looking at the states of America, that's the first thing you see, and then you're like, actually, I'm kind of interested in New York, and then you're interested in your neighborhood, and then you're like, oh, can I see my front door? And that's how it is with this map of opinions: not geographical terrain, but opinion terrain. You start at the top and ask, what do people in Bowling Green care about? And then it has those themes, you know, infrastructure, education, et cetera. And then you're drawn in; you're like, actually, what I really care about is education, and then you go through, you just keep digging. Most people have specific areas of interest. But the truth is, people want to know where they sit relative to their neighbors, and so people want to actually find where they are.
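As a rough illustration of that sense-making layer, here is a sketch that hands a batch of free-text submissions to Gemini and asks for a hierarchy of themes, using the google-generativeai Python client. The model name, prompt, and submissions are illustrative; Jigsaw's actual pipeline is certainly more elaborate.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_GEMINI_API_KEY")  # placeholder
model = genai.GenerativeModel("gemini-1.5-flash")  # illustrative model choice

# Hypothetical participant ideas; the real input was thousands of submissions.
submissions = [
    "We need sidewalks connecting the new neighborhoods to downtown.",
    "WKU should partner with local employers on vocational training.",
    "Preserve the farmland on the edge of the county from development.",
]

prompt = (
    "You are summarizing a civic conversation. Group the numbered "
    "submissions below into themes and sub-themes, and under each theme "
    "list the numbers of the submissions it draws on.\n\n"
    + "\n".join(f"{i}. {text}" for i, text in enumerate(submissions))
)

print(model.generate_content(prompt).text)
```

Keeping the submission numbers in the output is what makes the Google Maps-style drill-down, and the grounding discussed below, possible.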
Speaker 2: Because there's this plot graph on there that has a distribution of all the proposals and how much agreement they got. And so you can see: am I the fringe opinion holder, even though me and my two friends all agree with each other? Or, actually, is something that I've been scared to say out loud something that a lot of other people believe? So you can go in there and see that, which is really powerful. And of course, most people agreed with most things. So I think it does restore our sense of, you know... it's a funhouse of mirrors sometimes, social media, where you're like, oh my goodness, everyone's extreme; it's just me here that doesn't want to have an extreme stance on things.

Speaker 1: What are the key policy ideas that have emerged? And will you measure success based on whether or not they're enacted, or are you, in a sense, now passing the baton?

Speaker 2: It's definitely in the court of their leaders, and it was their process to begin with. So I think they genuinely do want people's input, because the process preceded our involvement. But one example: they had wanted to do a riverside development, building up the area along the riverside, and when they had tried to get public input, very few people participated, and they were detractors. But if you look at the feedback on that one topic in the report, seventy-five-plus percent of people support it. So I think they're now going to accelerate that. What we did, in terms of really trying to gauge people's sense of whether it mattered for them, was we surveyed them on whether they felt their voice mattered. We asked them whether they felt they understood other people's perspectives better than they did before. And the third was whether they believed their input would lead to a better outcome. On all of those, it was eighty percent-plus satisfaction.
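The plot graph Green mentions is essentially a distribution of agreement rates across all the proposals. A tiny sketch of the two questions it answers, with hypothetical agreement values and an illustrative cutoff for "near universal":

```python
# Agreement rate per proposal, as tallied from the votes (hypothetical data).
agreement = {
    "more sidewalks downtown": 0.93,
    "riverside development": 0.78,
    "ketchup sachet mandate": 0.12,
}

NEAR_UNIVERSAL = 0.90  # illustrative cutoff for "near universal support"
consensus = [p for p, a in agreement.items() if a >= NEAR_UNIVERSAL]
print(f"{len(consensus)} of {len(agreement)} proposals have near-universal support")

def percentile(proposal: str) -> float:
    """Where a proposal sits in the agreement distribution (low = fringe)."""
    a = agreement[proposal]
    return sum(x <= a for x in agreement.values()) / len(agreement)

print(f"riverside development percentile: {percentile('riverside development'):.2f}")
```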
Speaker 1: And were you surprised by the outcome? And how do you measure the success of a project like this?

Speaker 2: I think it was that people wanted to come out, and they did. There are, like, two parts of my brain. There's the part that asks: did we achieve our, you know, product development goals, or whatever engagement goals? And then there's another part of me that asks: can I retain my hope for society thriving as we get more atomized? And on both of those, I felt really good. I felt good because, obviously, the technology worked. Also, you know, the sense on the team is that we can push to do more. And so now we're exploring what other types of conversations we could support in this space. Maybe it's a different scale, you know; maybe it's a different nature of conversation. Maybe we're using different capabilities of Gemini to do more to bring people together, to have them understand each other more, and then having it count for them. So that's the kind of fun, exploration phase that we're in now.

Speaker 1: There's another thing that came out of the research, I think, called grounding. Can you talk a bit about grounding?

Speaker 2: Grounding is the sister to hallucination, which is a funny term that has widespread adoption to describe when AI models make stuff up.

Speaker 1: Because, of course, the very bad scenario would be that the AI was making up policy proposals that no one asked for.

Speaker 2: Oh my goodness, imagine that in such a high-stakes context, where people are trying to talk to their policymaker. And there were some funny things. In our early testing, we were finding it would kind of conflate two topics that seem like they're about the same thing, or it would overgeneralize from a single comment.
There'd 599 00:28:57,640 --> 00:28:59,520 Speaker 2: be, like, one post about, you know, there should not 600 00:28:59,560 --> 00:29:04,800 Speaker 2: be any sweet treats offered in kindergarten. And 601 00:29:04,840 --> 00:29:08,000 Speaker 2: then the summarizing AI would be like, the people of 602 00:29:08,080 --> 00:29:10,520 Speaker 2: Bowling Green do not want to have cakes 603 00:29:10,560 --> 00:29:12,800 Speaker 2: and chocolates and candy, and they would rather have 604 00:29:13,000 --> 00:29:16,000 Speaker 2: something else entirely. And it's just like, wait, that wasn't said, 605 00:29:16,280 --> 00:29:18,480 Speaker 2: that's out of line. So part of making 606 00:29:18,520 --> 00:29:21,240 Speaker 2: sure that we understood the ways that the AI might 607 00:29:21,480 --> 00:29:24,479 Speaker 2: misfire was doing a ton of testing with human reviewers 608 00:29:24,960 --> 00:29:27,600 Speaker 2: and on previous data sets. And then we did 609 00:29:27,760 --> 00:29:30,480 Speaker 2: kind of a small private trial a 610 00:29:30,520 --> 00:29:33,640 Speaker 2: few months earlier with Bowling Green, with the local partners 611 00:29:33,720 --> 00:29:35,960 Speaker 2: reviewing the outputs and just seeing how it might go wrong, 612 00:29:36,360 --> 00:29:38,040 Speaker 2: and then some tweaks we made to 613 00:29:38,080 --> 00:29:41,480 Speaker 2: make sure that it behaved appropriately. So grounding is: every 614 00:29:41,520 --> 00:29:46,720 Speaker 2: output that the AI produces is appended 615 00:29:46,760 --> 00:29:51,360 Speaker 2: with citations. For everything that the AI says was a 616 00:29:51,400 --> 00:29:55,320 Speaker 2: theme in the report, you can say, you know, show 617 00:29:55,360 --> 00:29:58,080 Speaker 2: me the receipts, and it will take you to the underlying 618 00:29:58,360 --> 00:30:01,320 Speaker 2: policy proposals, the actual words of the people 619 00:30:01,360 --> 00:30:03,640 Speaker 2: of Bowling Green, so you 620 00:30:03,720 --> 00:30:05,000 Speaker 2: actually don't have to trust the AI. 621 00:30:05,280 --> 00:30:09,280 Speaker 1: Super interesting. Could grounding be used beyond this context, do 622 00:30:09,280 --> 00:30:09,600 Speaker 1: you think? 623 00:30:09,720 --> 00:30:12,400 Speaker 2: Yeah, yeah, and it is; it's used broadly. Have you 624 00:30:12,440 --> 00:30:15,400 Speaker 2: ever used NotebookLM? Yes. So they use grounding. So 625 00:30:15,440 --> 00:30:17,880 Speaker 2: when they summarize, you can 626 00:30:17,920 --> 00:30:21,240 Speaker 2: give it, like, one hundred reports or whatever backlog of documents, 627 00:30:21,280 --> 00:30:23,200 Speaker 2: of course, if you're doing research, and it'll tell you 628 00:30:23,240 --> 00:30:25,320 Speaker 2: the themes, and then it gives you, like, footnotes, 629 00:30:25,680 --> 00:30:28,520 Speaker 2: so that you can see the source material and 630 00:30:28,560 --> 00:30:29,920 Speaker 2: have confidence that it's accurate.
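[Editor's note: Mechanically, grounding as described here is a contract: every generated claim carries pointers back to verbatim source text, and a claim whose pointers don't resolve is treated as ungrounded and discarded. A minimal sketch of that contract, with hypothetical comment IDs and no actual model call.]

```python
from dataclasses import dataclass

# Hypothetical source comments, standing in for residents' verbatim submissions.
SOURCES = {
    "c101": "We should finally build out the riverside trail.",
    "c102": "Downtown needs more free parking on weekends.",
}

@dataclass
class GroundedClaim:
    text: str             # an AI-written theme from the report
    citations: list[str]  # IDs of the source comments said to support it

def show_receipts(claim: GroundedClaim) -> list[str]:
    """Resolve a claim's citations to verbatim source text.

    A claim whose citations don't resolve is rejected as ungrounded,
    which is what keeps hallucinated themes out of the report."""
    receipts = [SOURCES[c] for c in claim.citations if c in SOURCES]
    if not receipts:
        raise ValueError(f"ungrounded claim, discard: {claim.text!r}")
    return receipts

claim = GroundedClaim("Residents support riverside development.", ["c101"])
for quote in show_receipts(claim):
    print("receipt:", quote)
```

["Show me the receipts" is then just a lookup, which is why the reader doesn't have to trust the summary itself.]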
631 00:30:31,000 --> 00:30:34,760 Speaker 1: What's the wider hope with a project like Bowling Green? 632 00:30:34,840 --> 00:30:37,160 Speaker 1: I mean, is there an 633 00:30:37,160 --> 00:30:39,720 Speaker 1: ideal community size? Is it that this will be perfect for 634 00:30:39,800 --> 00:30:42,360 Speaker 1: growing towns of around one hundred thousand people, 635 00:30:42,400 --> 00:30:44,600 Speaker 1: because you have enough people in the community, you're 636 00:30:44,600 --> 00:30:47,160 Speaker 1: invested in the future because it's growing, and it's not so 637 00:30:47,320 --> 00:30:49,480 Speaker 1: large that you get so many proposals that it's 638 00:30:49,520 --> 00:30:52,560 Speaker 1: impossible to sort, and therefore maybe there are another one 639 00:30:52,600 --> 00:30:56,720 Speaker 1: hundred Bowling Greens in the US? Or is it more like, wow, 640 00:30:56,760 --> 00:30:59,920 Speaker 1: the greatest problem in US politics today is that participation 641 00:31:00,080 --> 00:31:03,800 Speaker 1: is low and conversation is toxic, and therefore 642 00:31:03,840 --> 00:31:06,400 Speaker 1: this could be a much wider solution? How do 643 00:31:06,480 --> 00:31:09,000 Speaker 1: you choose between those two? Where do you 644 00:31:09,480 --> 00:31:09,720 Speaker 1: land? 645 00:31:09,800 --> 00:31:12,240 Speaker 2: For me, the more conversation, the better. I think the reverse is 646 00:31:12,280 --> 00:31:14,800 Speaker 2: also true: the less conversation, the worse we 647 00:31:14,920 --> 00:31:17,680 Speaker 2: end up understanding each other. And that was kind 648 00:31:17,720 --> 00:31:20,880 Speaker 2: of where we had taken Perspective over the last 649 00:31:20,920 --> 00:31:24,000 Speaker 2: couple of years: actually not just helping publishers score 650 00:31:24,040 --> 00:31:26,800 Speaker 2: things that they might want to see less of, like toxicity, 651 00:31:27,080 --> 00:31:29,680 Speaker 2: but actually doing the bridging attributes as well, which are 652 00:31:29,800 --> 00:31:32,920 Speaker 2: the things that kind of help people who disagree stay 653 00:31:32,960 --> 00:31:35,720 Speaker 2: in the conversation together.
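[Editor's note: Green is referring to Jigsaw's Perspective API, which scores comments for attributes such as TOXICITY; Jigsaw has also described experimental "bridging" attributes along the lines she mentions. A hedged sketch of a request follows: the endpoint and TOXICITY attribute are real, but the experimental attribute name below is an assumption for illustration, and availability depends on your API key.]

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder; obtain a real key from the Perspective API
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def score_comment(text: str) -> dict[str, float]:
    """Score one comment for toxicity plus an assumed bridging-style attribute."""
    body = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {
            "TOXICITY": {},
            "REASONING_EXPERIMENTAL": {},  # assumed bridging attribute name
        },
    }
    request = urllib.request.Request(
        URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        scores = json.load(response)["attributeScores"]
    # summaryScore.value is a probability-like score in [0, 1].
    return {name: s["summaryScore"]["value"] for name, s in scores.items()}

print(score_comment("I disagree, but I see why the trail matters to you."))
```

[With a bridging score, a publisher can rank up comments that keep disagreeing people talking, rather than only ranking down toxic ones, which is the shift Green describes.]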
654 00:31:35,840 --> 00:31:38,640 Speaker 2: So I would like to see more of it. And I think we've 655 00:31:38,720 --> 00:31:40,720 Speaker 2: also shown that the appetite is there from people and that 656 00:31:40,840 --> 00:31:42,840 Speaker 2: it can go really well. I think the judge 657 00:31:42,880 --> 00:31:46,360 Speaker 2: executive, Doug Gorman, in our case, was really, really brave 658 00:31:46,840 --> 00:31:49,320 Speaker 2: to try and do that. Why brave? Because maybe 659 00:31:49,400 --> 00:31:51,800 Speaker 2: nobody shows up, or they show up and, 660 00:31:52,000 --> 00:31:56,720 Speaker 2: you know, it's cacophonous. But now, seeing what Doug's 661 00:31:56,760 --> 00:31:59,480 Speaker 2: done... actually, Doug came to... so Google has a developer 662 00:31:59,520 --> 00:32:02,280 Speaker 2: conference called I/O that just happened, and Google 663 00:32:02,280 --> 00:32:05,040 Speaker 2: invited Doug to go and participate in the local leaders track, 664 00:32:05,320 --> 00:32:07,360 Speaker 2: and he went and told all these other mayors 665 00:32:07,400 --> 00:32:09,440 Speaker 2: and local leaders, and they really lit up. 666 00:32:09,480 --> 00:32:12,040 Speaker 2: And so from what I understand, I think there is 667 00:32:12,080 --> 00:32:13,880 Speaker 2: a lot of appetite to do this, and I think 668 00:32:13,960 --> 00:32:16,760 Speaker 2: him going first and having the Bowling Green kind of 669 00:32:17,160 --> 00:32:21,200 Speaker 2: proof of concept will inspire others to adopt it, too. 670 00:32:21,760 --> 00:32:23,760 Speaker 1: Is that part of your work, or, in a sense, 671 00:32:23,800 --> 00:32:26,160 Speaker 1: have you lit the fire and it's for other people to 672 00:32:27,480 --> 00:32:30,080 Speaker 1: kindle it? Bad metaphor, but I mean, 673 00:32:30,120 --> 00:32:31,520 Speaker 1: are you going to do the same experiment again in 674 00:32:31,520 --> 00:32:34,040 Speaker 1: another community or a larger community? Or do you feel 675 00:32:34,040 --> 00:32:36,760 Speaker 1: like you have the proof point you need? 676 00:32:36,800 --> 00:32:38,560 Speaker 2: I think it's, you know, this question of what is 677 00:32:38,600 --> 00:32:41,640 Speaker 2: Jigsaw's highest and best use. Now 678 00:32:41,640 --> 00:32:44,120 Speaker 2: that we've done the proof of concept, if there's 679 00:32:44,200 --> 00:32:46,600 Speaker 2: another place that we'd want to do it, I think 680 00:32:46,800 --> 00:32:48,480 Speaker 2: we would be open to that if it felt that 681 00:32:48,560 --> 00:32:53,200 Speaker 2: it was materially additive, if it were, 682 00:32:53,560 --> 00:32:55,840 Speaker 2: you know, maybe at the state level, or, you know, 683 00:32:55,880 --> 00:32:58,200 Speaker 2: some other kind of complex conversation that we felt 684 00:32:58,200 --> 00:33:02,640 Speaker 2: we could advance. I think there will be others, 685 00:33:02,920 --> 00:33:07,440 Speaker 2: kind of implementers; I think this will 686 00:33:07,480 --> 00:33:11,520 Speaker 2: become a kind of commercial opportunity for others to 687 00:33:11,520 --> 00:33:13,120 Speaker 2: help with. I don't think, you know, that 688 00:33:13,120 --> 00:33:15,240 Speaker 2: the best use of Jigsaw is to do many, many more 689 00:33:15,240 --> 00:33:16,719 Speaker 2: of the same thing in towns. 690 00:33:17,160 --> 00:33:20,000 Speaker 1: So, in other words, a polling firm could take 691 00:33:20,000 --> 00:33:22,800 Speaker 1: it over as a business. Audrey Tang, who I 692 00:33:22,840 --> 00:33:25,120 Speaker 1: know you, you know, spend time with, 693 00:33:25,960 --> 00:33:29,120 Speaker 1: recently, in an interview with Nick Thompson, spoke about 694 00:33:30,160 --> 00:33:33,560 Speaker 1: the Bowling Green project in the context of how, 695 00:33:34,160 --> 00:33:38,120 Speaker 1: in California, post fires, Newsom is sort of using this 696 00:33:38,240 --> 00:33:44,960 Speaker 1: tech-enabled participatory democratic technology to understand what people want 697 00:33:44,960 --> 00:33:49,120 Speaker 1: from the rebuild of the burnt areas of LA. Where 698 00:33:49,120 --> 00:33:52,240 Speaker 1: does this fit in with the kind of larger 699 00:33:52,360 --> 00:33:56,120 Speaker 1: tech-enabled digital democracy trend? 700 00:33:56,600 --> 00:33:59,040 Speaker 2: I think they're pioneering there, because 701 00:33:59,080 --> 00:34:01,720 Speaker 2: they're doing it at the state level, and California is a massive state.
702 00:34:02,800 --> 00:34:05,080 Speaker 2: And interestingly, when we did it in Bowling Green, which 703 00:34:05,160 --> 00:34:08,000 Speaker 2: was, you know, Bowling Green and Warren County, much, much smaller, 704 00:34:08,440 --> 00:34:11,680 Speaker 2: it was a super open question. In the case of California, they're asking 705 00:34:11,680 --> 00:34:14,480 Speaker 2: a really narrow question. I mean, it's big in itself, 706 00:34:14,480 --> 00:34:16,560 Speaker 2: but it's a much more focused question, which is, you know, 707 00:34:16,800 --> 00:34:19,000 Speaker 2: where do we go from 708 00:34:19,040 --> 00:34:23,120 Speaker 2: here after the wildfires, specifically. So I think that's really instructive: 709 00:34:23,160 --> 00:34:25,160 Speaker 2: maybe once you go to the state level, it 710 00:34:25,200 --> 00:34:27,839 Speaker 2: needs to be that focused. And they've already got quite a bit of engagement. 711 00:34:27,840 --> 00:34:30,120 Speaker 1: And is she going to talk 712 00:34:30,120 --> 00:34:31,719 Speaker 1: to them about how they're going to make sense of 713 00:34:31,760 --> 00:34:33,920 Speaker 1: it? Because I don't know what they'll do. 714 00:34:33,440 --> 00:34:36,360 Speaker 2: Well, yeah, we're in touch with them. And Audrey, who's, you know, 715 00:34:36,360 --> 00:34:38,759 Speaker 2: the former, the kind of inaugural, 716 00:34:38,800 --> 00:34:41,839 Speaker 2: I guess, digital minister for Taiwan... 717 00:34:41,880 --> 00:34:44,520 Speaker 2: she's just a global matchmaker. So she's advising 718 00:34:44,880 --> 00:34:47,960 Speaker 2: California as well, and she's advised us a lot. She's 719 00:34:48,280 --> 00:34:49,920 Speaker 2: been very instructive for our work. 720 00:34:50,480 --> 00:34:53,640 Speaker 1: Two questions to close, I guess. The first: obviously, it's 721 00:34:53,680 --> 00:34:57,880 Speaker 1: a moment where Google is, you know, the subject 722 00:34:57,920 --> 00:35:01,560 Speaker 1: of the Justice Department and Chrome, all this kind of stuff; 723 00:35:01,840 --> 00:35:05,120 Speaker 1: in other words, a target of federal investigation. When 724 00:35:05,120 --> 00:35:08,319 Speaker 1: you work with government, are there people 725 00:35:08,320 --> 00:35:10,600 Speaker 1: who say this is a conflict of interest, or that Jigsaw 726 00:35:10,680 --> 00:35:14,799 Speaker 1: being in bed with local and state governments is somehow advancing 727 00:35:14,840 --> 00:35:17,600 Speaker 1: Google's corporate motives elsewhere, and that's a problem? 728 00:35:18,360 --> 00:35:22,920 Speaker 2: In this case, we didn't actually have any commercial relationship 729 00:35:23,000 --> 00:35:26,160 Speaker 2: with the government. We were actually partnered with the group 730 00:35:26,200 --> 00:35:28,560 Speaker 2: I told you about, Innovation Engine, and they did 731 00:35:28,600 --> 00:35:31,040 Speaker 2: the kind of execution. But we did care a lot 732 00:35:31,080 --> 00:35:35,759 Speaker 2: about making the tech work for the policymaker and 733 00:35:35,800 --> 00:35:38,200 Speaker 2: the people, because it seems like such an 734 00:35:38,239 --> 00:35:40,400 Speaker 2: important thing for us to get right. And if we 735 00:35:40,440 --> 00:35:43,919 Speaker 2: want policymakers to make good policy, even about tech, 736 00:35:43,960 --> 00:35:45,560 Speaker 2: they should kind of understand it and see how it 737 00:35:45,560 --> 00:35:47,640 Speaker 2: shows up for them.
And in this case, I think 738 00:35:47,680 --> 00:35:49,640 Speaker 2: Bowling Green was so attractive because it was so 739 00:35:49,800 --> 00:35:54,440 Speaker 2: mixed in terms of politics. So, yeah, we've always kind 740 00:35:54,480 --> 00:35:57,640 Speaker 2: of been in and around governments, and 741 00:35:57,680 --> 00:36:02,000 Speaker 2: actually, still at this point, we haven't directly partnered with 742 00:36:02,120 --> 00:36:06,640 Speaker 2: the government. But it felt really gratifying to feel like 743 00:36:06,719 --> 00:36:09,320 Speaker 2: you are helping that dynamic work better. 744 00:36:09,760 --> 00:36:12,040 Speaker 1: What do you still have to achieve at Jigsaw, and 745 00:36:12,360 --> 00:36:15,279 Speaker 1: what kind of future signature projects can you tease for 746 00:36:15,360 --> 00:36:16,239 Speaker 1: us today? 747 00:36:17,160 --> 00:36:21,960 Speaker 2: We're noticing a change in the culture at Jigsaw internally 748 00:36:22,360 --> 00:36:26,720 Speaker 2: that actually mirrors what's happening externally. Things are moving 749 00:36:26,800 --> 00:36:30,640 Speaker 2: so fast, like, the technology is moving so fast, and 750 00:36:30,719 --> 00:36:33,680 Speaker 2: we are inside this incredible tech company, and so I 751 00:36:33,680 --> 00:36:36,279 Speaker 2: think sometimes things appear to us before they 752 00:36:36,360 --> 00:36:38,399 Speaker 2: appear to others. I think we could help a lot 753 00:36:38,440 --> 00:36:40,959 Speaker 2: by kind of sharing as we're going. So I think 754 00:36:41,320 --> 00:36:43,880 Speaker 2: the change in the coming year that I'd like to 755 00:36:43,880 --> 00:36:48,320 Speaker 2: see is more rapid iteration, sharing more frequently, 756 00:36:48,360 --> 00:36:50,399 Speaker 2: and being more public about what we're doing. 757 00:36:50,920 --> 00:36:54,560 Speaker 1: Well, you know where to come. Thank you. Thank you. 758 00:36:54,760 --> 00:37:16,719 Speaker 1: That's it for Tech Stuff. I'm Oz Woloshyn. This episode was 759 00:37:16,719 --> 00:37:21,239 Speaker 1: produced by Eliza Dennis and Adriana Tapia. It was executive 760 00:37:21,239 --> 00:37:24,680 Speaker 1: produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope, 761 00:37:25,120 --> 00:37:29,400 Speaker 1: and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, 762 00:37:29,440 --> 00:37:32,880 Speaker 1: and Kyle Murdoch wrote our theme song. Join us on Friday for 763 00:37:32,920 --> 00:37:34,960 Speaker 1: the Week in Tech, when we'll run through the tech 764 00:37:35,000 --> 00:37:39,200 Speaker 1: headlines you may have missed. Please rate, review, and reach 765 00:37:39,239 --> 00:37:53,160 Speaker 1: out to us at tech stuff podcast at gmail dot com.