1 00:00:04,840 --> 00:00:08,360 Speaker 1: Welcome, everybody, to It Could Happen Here, um, a podcast about, 2 00:00:08,880 --> 00:00:10,720 Speaker 1: I don't know, how things are kind of 3 00:00:10,800 --> 00:00:13,240 Speaker 1: kind of crumbling and how we can maybe 4 00:00:13,240 --> 00:00:16,560 Speaker 1: put stuff back together. And today I am 5 00:00:16,560 --> 00:00:20,720 Speaker 1: excited to talk with a senior, let's see, what is, 6 00:00:20,880 --> 00:00:23,200 Speaker 1: what is the actual term? 7 00:00:23,239 --> 00:00:32,800 Speaker 1: I saw it: a senior programs strategist at Wikimedia, Alex Stinson. Hello, greetings. Hi, 8 00:00:33,080 --> 00:00:35,800 Speaker 1: it's so good to be here. I'm very excited about 9 00:00:35,800 --> 00:00:38,400 Speaker 1: our talk today because, I mean, this should 10 00:00:38,440 --> 00:00:41,960 Speaker 1: surprise nobody, that I used to be 11 00:00:42,040 --> 00:00:46,040 Speaker 1: a Wikipedia editor back in the day. Not shocking 12 00:00:46,080 --> 00:00:49,720 Speaker 1: at all, if you know me. Um, but yeah, 13 00:00:49,800 --> 00:00:53,479 Speaker 1: we're gonna be talking about, kind of, Wikipedia itself, 14 00:00:53,560 --> 00:00:57,960 Speaker 1: and then also, uh, climate misinformation and disinformation, and how 15 00:00:58,000 --> 00:01:01,880 Speaker 1: we can maybe create a better understanding of climate change 16 00:01:01,880 --> 00:01:04,280 Speaker 1: and its effects across, kind of, the world, and how 17 00:01:04,319 --> 00:01:07,039 Speaker 1: digital information works. Those are all kind of topics we 18 00:01:07,040 --> 00:01:11,800 Speaker 1: talk about often enough, but never within the actual context 19 00:01:11,840 --> 00:01:15,560 Speaker 1: of, like, Wikipedia as an entity. Um, so I guess 20 00:01:15,640 --> 00:01:18,360 Speaker 1: let's just start there, with Wikipedia. And, like, 21 00:01:18,520 --> 00:01:21,200 Speaker 1: for those who don't, maybe people, like, use the website 22 00:01:21,240 --> 00:01:22,600 Speaker 1: but they're not quite sure what it is, like, how 23 00:01:22,640 --> 00:01:25,440 Speaker 1: do you actually describe what Wikipedia is? Because it 24 00:01:25,560 --> 00:01:29,520 Speaker 1: is, like, an interesting, kind of amorphous entity. It's so 25 00:01:29,600 --> 00:01:32,840 Speaker 1: many things. Um, I think most people are used to 26 00:01:33,040 --> 00:01:35,760 Speaker 1: thinking about Wikipedia as, like, the fact checking device. Like, 27 00:01:35,920 --> 00:01:38,360 Speaker 1: I have a bar argument with my friends and I 28 00:01:38,400 --> 00:01:41,240 Speaker 1: pull out my phone and, yeah, yeah, I, like, 29 00:01:41,319 --> 00:01:44,520 Speaker 1: website at them, right. Um, it's a lot of things. 30 00:01:44,600 --> 00:01:49,800 Speaker 1: It's three hundred language Wikipedias, actually, it's not just one. Each 31 00:01:49,840 --> 00:01:53,640 Speaker 1: of these Wikipedias has its own editorial community. Um, at 32 00:01:53,760 --> 00:01:57,440 Speaker 1: last I checked, it's like sixty million articles across the languages. 33 00:01:57,560 --> 00:02:00,960 Speaker 1: It's really a lot of different content, um, 34 00:02:01,120 --> 00:02:05,000 Speaker 1: and a topic can be on each of those Wikipedias, right. Um, 35 00:02:05,040 --> 00:02:08,239 Speaker 1: and this is important as we start talking about disinformation:
36 00:02:08,320 --> 00:02:10,960 Speaker 1: it's like, each Wikipedia, because it's edited by people in 37 00:02:11,000 --> 00:02:15,519 Speaker 1: that language, and it's written by that language community, um, 38 00:02:15,600 --> 00:02:19,640 Speaker 1: you know, each article is different and it has different perspectives. Um, 39 00:02:19,800 --> 00:02:24,400 Speaker 1: two hundred eighty thousand volunteers editing every month. So this 40 00:02:24,520 --> 00:02:27,720 Speaker 1: is a lot of people, right. But the bulk of 41 00:02:27,760 --> 00:02:31,320 Speaker 1: that's happening on English Wikipedia and some of the larger 42 00:02:31,440 --> 00:02:35,320 Speaker 1: languages that are spoken across multiple cultural contexts. And then 43 00:02:35,360 --> 00:02:38,880 Speaker 1: there's also a lot of other content sitting behind Wikipedia. 44 00:02:38,960 --> 00:02:42,720 Speaker 1: So there's a media repository, um, that we 45 00:02:42,840 --> 00:02:47,960 Speaker 1: call Wikimedia Commons, and there's a database called Wikidata, 46 00:02:48,040 --> 00:02:50,720 Speaker 1: which kind of powers those little knowledge graphs on the 47 00:02:50,800 --> 00:02:53,359 Speaker 1: right side of Google and a whole bunch of other 48 00:02:53,400 --> 00:02:56,920 Speaker 1: parts of the Internet. Wikidata shows up in Amazon 49 00:02:56,960 --> 00:03:00,520 Speaker 1: Alexa and all kinds of other places, right. And 50 00:03:00,560 --> 00:03:03,799 Speaker 1: so we're not just, like, one website. It's 51 00:03:03,840 --> 00:03:09,079 Speaker 1: many websites, lots of knowledge, lots of platforms, lots of contexts, um, 52 00:03:09,360 --> 00:03:13,480 Speaker 1: and we'll come back to that. Yeah, one really interesting 53 00:03:13,520 --> 00:03:15,120 Speaker 1: part of it is, like, I don't know, my 54 00:03:15,200 --> 00:03:20,160 Speaker 1: personal kind of social leanings, I generally kind of like 55 00:03:20,240 --> 00:03:25,200 Speaker 1: things that are more decentralized in general. Um, other 56 00:03:25,240 --> 00:03:28,080 Speaker 1: hosts on the podcast are generally kind of on, like, 57 00:03:28,200 --> 00:03:32,959 Speaker 1: the progressive left libertarian spectrum. Um, and one thing I 58 00:03:33,200 --> 00:03:36,800 Speaker 1: do really appreciate about Wikipedia is, it's more like, 59 00:03:37,120 --> 00:03:39,720 Speaker 1: it's not, I don't think it's, like, open source, 60 00:03:40,080 --> 00:03:44,920 Speaker 1: but the way it has decentralized editing and all 61 00:03:44,920 --> 00:03:47,280 Speaker 1: that kind of stuff, it's just a really interesting model 62 00:03:47,280 --> 00:03:49,720 Speaker 1: of, like, what if a lot more stuff worked 63 00:03:49,760 --> 00:03:52,200 Speaker 1: this way? And I'm not sure how 64 00:03:52,280 --> 00:03:54,800 Speaker 1: much of, like, a decentralization focus there is, like, consciously, 65 00:03:55,400 --> 00:03:57,840 Speaker 1: at people at, like, the foundation, and people who try 66 00:03:57,880 --> 00:04:01,200 Speaker 1: to, like, actually run it, behind the scenes stuff. Yeah. 67 00:04:01,280 --> 00:04:04,480 Speaker 1: So Wikipedia grows out of the, like, open source movement 68 00:04:04,600 --> 00:04:06,680 Speaker 1: and the kind of early days of the Internet, right, 69 00:04:06,720 --> 00:04:09,480 Speaker 1: this idea that, like, knowledge wants to be free, technology 70 00:04:09,480 --> 00:04:12,160 Speaker 1: wants to be free, software wants to be free.
Um, 71 00:04:12,280 --> 00:04:15,960 Speaker 1: let's use the legal infrastructure to, like, create freedom, 72 00:04:16,160 --> 00:04:19,000 Speaker 1: right, uh, in that sense. And then there's also the 73 00:04:19,080 --> 00:04:21,719 Speaker 1: free as in, like, anyone can edit, and then the 74 00:04:21,839 --> 00:04:24,120 Speaker 1: free to, like, do whatever you want out there in 75 00:04:24,160 --> 00:04:27,120 Speaker 1: the world. Um, people are like, free as 76 00:04:27,160 --> 00:04:30,159 Speaker 1: in beer and free as in speech, right, uh. And 77 00:04:30,520 --> 00:04:33,599 Speaker 1: those things are, they're always 78 00:04:33,600 --> 00:04:36,000 Speaker 1: in tension, uh, and they're kind of working. And as you 79 00:04:36,000 --> 00:04:38,520 Speaker 1: can imagine, especially when you get outside of kind of 80 00:04:38,600 --> 00:04:44,400 Speaker 1: multicultural Internet spaces like English Wikipedia, um, it can get challenging. 81 00:04:44,480 --> 00:04:48,360 Speaker 1: Like, if you're in Croatia and everyone is speaking Croatian, 82 00:04:48,680 --> 00:04:53,000 Speaker 1: there's a very small bubble in which to create that Wikipedia, right. Um, 83 00:04:53,040 --> 00:04:55,880 Speaker 1: and so it's interesting in that sense. Um, I think 84 00:04:55,920 --> 00:04:59,240 Speaker 1: there's also another part of Wikipedia that a lot of 85 00:04:59,240 --> 00:05:01,839 Speaker 1: people don't see, which is the movement behind it. So 86 00:05:01,920 --> 00:05:05,280 Speaker 1: there's the editorial communities, people show up and make edits, um, 87 00:05:05,320 --> 00:05:07,800 Speaker 1: but because there's this ideology that you're talking about, this, 88 00:05:07,920 --> 00:05:10,920 Speaker 1: like, decentralized, we need to share knowledge or culture 89 00:05:10,960 --> 00:05:14,520 Speaker 1: or language on the Internet, there's also a whole social 90 00:05:14,880 --> 00:05:18,640 Speaker 1: movement sitting behind the scenes. Uh, and there's a 91 00:05:18,640 --> 00:05:23,000 Speaker 1: recent podcast, The Wikipedia Story, that kind of 92 00:05:23,080 --> 00:05:26,800 Speaker 1: captured the essence of that, um. And it's 93 00:05:26,839 --> 00:05:29,680 Speaker 1: a lot of people like myself. So I started editing 94 00:05:30,200 --> 00:05:35,320 Speaker 1: in high school. Yeah, yeah, one of those, like, oh, 95 00:05:35,360 --> 00:05:37,360 Speaker 1: I know how to click the edit button and I 96 00:05:37,400 --> 00:05:39,280 Speaker 1: figured out how to use the Internet, and that kind 97 00:05:39,279 --> 00:05:42,240 Speaker 1: of thing. But there's a lot of people for whom, like, 98 00:05:42,440 --> 00:05:45,280 Speaker 1: the intuitiveness of clicking an edit button on a piece 99 00:05:45,279 --> 00:05:49,040 Speaker 1: of open source software to create content is just not there. 100 00:05:49,360 --> 00:05:51,400 Speaker 1: It's not clear, right. And so you have to organize 101 00:05:51,400 --> 00:05:53,839 Speaker 1: and invite people in, and so we have a whole 102 00:05:53,880 --> 00:05:56,839 Speaker 1: movement that does that too.
There's about a hundred 103 00:05:57,560 --> 00:06:02,000 Speaker 1: and fifty organizations around the world that organize events, work 104 00:06:02,040 --> 00:06:07,000 Speaker 1: with libraries and museums and educational institutions. And so there's 105 00:06:07,000 --> 00:06:11,160 Speaker 1: always this, um, kind of interesting dynamic where our values, 106 00:06:11,320 --> 00:06:15,200 Speaker 1: which is this, like, open software platform stuff, have also 107 00:06:15,360 --> 00:06:18,799 Speaker 1: lived in our practice, in our outreach, like creating change 108 00:06:18,839 --> 00:06:22,359 Speaker 1: through society by sharing knowledge and education. Um, and so 109 00:06:22,600 --> 00:06:25,320 Speaker 1: I think, yeah, it's an 110 00:06:25,320 --> 00:06:28,200 Speaker 1: interesting dynamic. Yeah, I think that does create a really, 111 00:06:29,000 --> 00:06:32,200 Speaker 1: oftentimes, beautiful reflection. It can have some dark 112 00:06:32,160 --> 00:06:34,320 Speaker 1: sides every once in a while, but it 113 00:06:34,360 --> 00:06:36,719 Speaker 1: is really nice to have, like, kind of the ideology 114 00:06:36,839 --> 00:06:39,800 Speaker 1: driving it being reflected in the actions of operating it 115 00:06:39,880 --> 00:06:42,560 Speaker 1: and spreading it and that kind of thing. Um, so 116 00:06:42,640 --> 00:06:45,560 Speaker 1: this is something we kind of briefly touched on already, 117 00:06:45,640 --> 00:06:47,920 Speaker 1: but I think I'd like to move on to kind 118 00:06:47,920 --> 00:06:52,800 Speaker 1: of how climate change and broader, like, social 119 00:06:52,839 --> 00:06:57,080 Speaker 1: issues are covered on Wikipedia. Because you already mentioned, like, 120 00:06:57,560 --> 00:07:00,800 Speaker 1: it's, because there's not one Wikipedia, there's many, based 121 00:07:00,839 --> 00:07:03,360 Speaker 1: on different languages and places, it feels like, to me, 122 00:07:03,720 --> 00:07:06,520 Speaker 1: whenever social issues kind of get covered on Wikipedia, it's 123 00:07:06,520 --> 00:07:08,799 Speaker 1: going to be in some part, like, a local reflection 124 00:07:08,880 --> 00:07:11,720 Speaker 1: of whatever is in that area. You know, if 125 00:07:11,800 --> 00:07:15,080 Speaker 1: there's, like, a white liberal writing articles in New York, 126 00:07:15,440 --> 00:07:18,640 Speaker 1: it's gonna be different than someone, you know, halfway across 127 00:07:18,680 --> 00:07:22,200 Speaker 1: the world writing them in, you know, a much smaller country, 128 00:07:22,320 --> 00:07:24,800 Speaker 1: let's say, like, Belarus, which is under what I would 129 00:07:24,800 --> 00:07:28,680 Speaker 1: call a dictatorship. Um, so that's gonna change kind of 130 00:07:28,680 --> 00:07:32,040 Speaker 1: the nature of what people are making, because of that 131 00:07:32,120 --> 00:07:34,400 Speaker 1: kind of divide. So how does that kind of 132 00:07:34,400 --> 00:07:38,080 Speaker 1: crop up, and are there any, like, solutions to that? 133 00:07:38,480 --> 00:07:41,240 Speaker 1: Or, because of the decentralized thing, it's like, 134 00:07:41,400 --> 00:07:45,560 Speaker 1: how much can we, like, impose, like, 135 00:07:45,640 --> 00:07:47,920 Speaker 1: I'm not in Belarus, how much can I impose 136 00:07:47,960 --> 00:07:51,960 Speaker 1: what I want their Wikipedia to look like?
Yeah, there's 137 00:07:52,160 --> 00:07:55,520 Speaker 1: kind of two or three dynamics you're touching on here. 138 00:07:55,600 --> 00:07:59,640 Speaker 1: The first is, because there's kind of an attention bias, 139 00:07:59,680 --> 00:08:02,640 Speaker 1: like, something comes up in the news, and our Wikipedia 140 00:08:02,720 --> 00:08:06,560 Speaker 1: community, like, people are, within minutes of breaking news stories, 141 00:08:06,720 --> 00:08:10,160 Speaker 1: usually, like, editing the page, working to improve it, right. 142 00:08:10,600 --> 00:08:13,080 Speaker 1: Um, so if things show up in the, you know, 143 00:08:13,400 --> 00:08:18,440 Speaker 1: European and American press, it's very likely, especially on something like English 144 00:08:18,480 --> 00:08:21,080 Speaker 1: Wikipedia, that editors will pick up on it and immediately cover it. 145 00:08:21,280 --> 00:08:25,360 Speaker 1: And because there are multiple perspectives in that press, usually, 146 00:08:25,880 --> 00:08:30,760 Speaker 1: um, kind of the ideological, uh, kind of multi-sidedness, 147 00:08:30,840 --> 00:08:33,320 Speaker 1: like, works itself out, because there's a lot of 148 00:08:33,360 --> 00:08:35,120 Speaker 1: eyes and a lot of people who know how to 149 00:08:35,240 --> 00:08:41,480 Speaker 1: edit there, right. But in a kind of cultural, linguistic, 150 00:08:41,800 --> 00:08:45,840 Speaker 1: geographic context where there's, like, one set of stories and 151 00:08:45,840 --> 00:08:49,640 Speaker 1: there's not a lot of diversity, um, uh, this happens. 152 00:08:49,640 --> 00:08:52,120 Speaker 1: And I'm going to refer to the Croatian Wikipedia, 153 00:08:52,160 --> 00:08:55,480 Speaker 1: because we actually had an external researcher look at 154 00:08:55,480 --> 00:08:58,440 Speaker 1: Croatian Wikipedia, because part of it has been kind of 155 00:08:58,880 --> 00:09:02,720 Speaker 1: captured by folks with kind of very ideological leanings, 156 00:09:03,040 --> 00:09:06,120 Speaker 1: in a way that's excluding others, and this is not good, right. 157 00:09:06,360 --> 00:09:09,600 Speaker 1: It creates a very one sided information environment, and it 158 00:09:09,679 --> 00:09:14,760 Speaker 1: really reflects, um, kind of the news dynamic going on there. 159 00:09:14,800 --> 00:09:17,480 Speaker 1: So when, like, breaking news happens, or when a topic, 160 00:09:17,559 --> 00:09:20,480 Speaker 1: like a social issue, or, not necessarily, like, climate change 161 00:09:20,480 --> 00:09:22,640 Speaker 1: is not a social issue, right, this is a global, 162 00:09:23,080 --> 00:09:29,199 Speaker 1: like, life threatening issue, um, when something becomes politicized, 163 00:09:29,360 --> 00:09:32,760 Speaker 1: it's very easy, especially in smaller language Wikipedias, for a 164 00:09:32,920 --> 00:09:38,640 Speaker 1: few people to kind of swing the whole perspective on that. Um, 165 00:09:38,720 --> 00:09:41,800 Speaker 1: so yeah, there's this breaking news issue, and this is 166 00:09:41,840 --> 00:09:44,920 Speaker 1: where our kind of organized communities are really important. So 167 00:09:45,480 --> 00:09:47,280 Speaker 1: the example we want to point out of this 168 00:09:47,440 --> 00:09:51,199 Speaker 1: working well, um, is in medicine. So our medical 169 00:09:51,240 --> 00:09:55,800 Speaker 1: community, during the Ebola outbreaks a few years back,
um, 170 00:09:55,840 --> 00:10:00,360 Speaker 1: in West Africa, were able to organize, both in English 171 00:10:00,480 --> 00:10:04,600 Speaker 1: and in languages that were accessible for local communities, high 172 00:10:04,679 --> 00:10:07,880 Speaker 1: quality coverage of the medical content, because it, like, has 173 00:10:07,920 --> 00:10:12,199 Speaker 1: impact on people's lives. And so they recruited translators, they 174 00:10:12,240 --> 00:10:15,440 Speaker 1: thought about, like, what's a simple way to communicate the story, 175 00:10:15,960 --> 00:10:20,040 Speaker 1: um, in that context, and, like, what do the workers 176 00:10:20,160 --> 00:10:22,880 Speaker 1: or the advocates or whoever on the ground who's 177 00:10:22,920 --> 00:10:27,600 Speaker 1: working with that crisis, what knowledge do they need, right? Um, 178 00:10:27,640 --> 00:10:32,079 Speaker 1: and you see, like, other open technology movements do stuff 179 00:10:32,080 --> 00:10:35,360 Speaker 1: like this, like Humanitarian OpenStreetMap has a similar kind 180 00:10:35,400 --> 00:10:39,800 Speaker 1: of way of organizing. They're like, hey, there's a crisis happening, um, 181 00:10:39,920 --> 00:10:42,720 Speaker 1: let's pull people together from different parts of the world 182 00:10:42,720 --> 00:10:45,800 Speaker 1: who have the right knowledge or skills, and, like, address 183 00:10:45,880 --> 00:10:50,000 Speaker 1: the knowledge gap. Um, so you can solve it. 184 00:10:50,000 --> 00:10:53,240 Speaker 1: It's just, it's complicated. Um, and, you know, we've been 185 00:10:53,280 --> 00:10:56,080 Speaker 1: trying to address as a movement what we call the 186 00:10:56,120 --> 00:10:59,559 Speaker 1: gender gap. So there's both fewer women editors and less women's 187 00:10:59,559 --> 00:11:04,560 Speaker 1: content on many of the wikis, and, like, it's taken 188 00:11:04,640 --> 00:11:07,960 Speaker 1: years, and it's very hard to organize, and even when 189 00:11:08,040 --> 00:11:12,200 Speaker 1: there's investment in it, um, it's challenging to 190 00:11:12,320 --> 00:11:16,040 Speaker 1: make substantial progress, because there might be contextual issues around 191 00:11:16,040 --> 00:11:19,079 Speaker 1: it too. And so you can't just, like, drop in 192 00:11:19,160 --> 00:11:24,439 Speaker 1: on a Central Asian language with a, like, Western perspective 193 00:11:24,920 --> 00:11:28,160 Speaker 1: and expect to, like, change the culture of the wiki overnight. 194 00:11:28,320 --> 00:11:31,600 Speaker 1: You have to engage with it consistently and be persistent 195 00:11:31,800 --> 00:11:34,080 Speaker 1: and work on it over and over and over again. 196 00:11:34,559 --> 00:11:37,120 Speaker 1: We are going to take a short break to hear 197 00:11:37,600 --> 00:11:41,840 Speaker 1: a message from our lovely, lovely advertisers, unless it's ExxonMobil again, 198 00:11:42,240 --> 00:11:55,560 Speaker 1: but we will be back shortly. Okay, and we're back. Um, 199 00:11:55,720 --> 00:11:59,760 Speaker 1: one thing that we cover decently, part of my 200 00:12:00,080 --> 00:12:04,520 Speaker 1: job and Robert Evans' job, is disinformation 201 00:12:04,559 --> 00:12:09,000 Speaker 1: and misinformation and how that type of stuff spreads online.
Um, 202 00:12:09,040 --> 00:12:12,800 Speaker 1: particularly, you know, usually kind of linked to, like, political extremism, 203 00:12:12,920 --> 00:12:16,000 Speaker 1: um, or conspiracy theories, or, you know, in that general 204 00:12:16,160 --> 00:12:19,840 Speaker 1: kind of bubble, um. And so what type of, 205 00:12:19,960 --> 00:12:25,720 Speaker 1: kind of, climate misinformation has really been festering on various, 206 00:12:25,800 --> 00:12:28,200 Speaker 1: you know, Wikipedias across the world? Really, because, like, we 207 00:12:28,240 --> 00:12:29,960 Speaker 1: would just be talking about, like, these topics and how 208 00:12:30,240 --> 00:12:31,920 Speaker 1: and, like, why it happens, but, like, 209 00:12:32,080 --> 00:12:35,600 Speaker 1: what are the main types of misinformation or disinformation that 210 00:12:35,800 --> 00:12:40,160 Speaker 1: are much more, like, prevalent? Yeah, um, so the first 211 00:12:40,280 --> 00:12:45,600 Speaker 1: is just kind of neglect of, uh, content that's happening 212 00:12:45,760 --> 00:12:49,040 Speaker 1: across the various things related to climate. Um, we've 213 00:12:49,080 --> 00:12:53,679 Speaker 1: identified on English Wikipedia over three thousand seven hundred articles 214 00:12:53,720 --> 00:12:57,320 Speaker 1: that are directly related to climate change. Uh, we don't 215 00:12:57,360 --> 00:13:01,160 Speaker 1: have a very big editorial community in English on that topic 216 00:13:01,280 --> 00:13:04,400 Speaker 1: that's, like, interested, fluent in the science and fluent in 217 00:13:04,480 --> 00:13:06,079 Speaker 1: the other stuff. And then you go out to the 218 00:13:06,120 --> 00:13:09,679 Speaker 1: other languages, and, like, some of the languages have, like, 219 00:13:09,920 --> 00:13:14,160 Speaker 1: three thousand of them, some of them have, like, two hundred, right. Um, 220 00:13:14,240 --> 00:13:18,880 Speaker 1: and, um, some of that 221 00:13:18,920 --> 00:13:22,080 Speaker 1: content was, like, translated several years ago, right, five 222 00:13:22,160 --> 00:13:24,080 Speaker 1: or ten years ago, and, yeah, you know, like, 223 00:13:24,480 --> 00:13:29,360 Speaker 1: the climate rhetoric has really changed, and, like, numbers and 224 00:13:29,360 --> 00:13:32,160 Speaker 1: statistics, all that stuff, gets updated every year. And, yeah, 225 00:13:32,200 --> 00:13:33,760 Speaker 1: there's a lot to keep up 226 00:13:33,800 --> 00:13:37,440 Speaker 1: with, and, like, reading the IPCC report or looking at 227 00:13:37,520 --> 00:13:40,040 Speaker 1: any of the consensus science, there's, like, a lot of 228 00:13:40,120 --> 00:13:44,040 Speaker 1: change. You have to be fluent in, like, science communication, 229 00:13:44,200 --> 00:13:46,160 Speaker 1: you have to understand, like, where to look for the 230 00:13:46,240 --> 00:13:50,560 Speaker 1: information, um. And it's interesting, my partner is a Spanish 231 00:13:50,600 --> 00:13:54,560 Speaker 1: language speaker, and she was in a kind of workshop 232 00:13:54,600 --> 00:14:00,440 Speaker 1: for journalists in Argentina for climate communication, and the 233 00:14:00,440 --> 00:14:03,640 Speaker 1: workshop was like, oh, you should cite the Guardian, right, 234 00:14:03,720 --> 00:14:07,400 Speaker 1: just to, to kind of understand this climate stuff.
235 00:14:07,440 --> 00:14:10,960 Speaker 1: So in a lot of these local language contexts, there aren't 236 00:14:10,960 --> 00:14:13,839 Speaker 1: even good sources, and the sources they do have are 237 00:14:13,880 --> 00:14:17,280 Speaker 1: often citing, like, the dominant narrative that's going on in, 238 00:14:17,400 --> 00:14:21,680 Speaker 1: like, the Anglophone news cycle, right, because there's not a 239 00:14:21,680 --> 00:14:24,360 Speaker 1: lot of climate communication going on. And so there's just 240 00:14:24,480 --> 00:14:28,400 Speaker 1: a lot of complexity involved in updating that much content 241 00:14:29,080 --> 00:14:32,480 Speaker 1: all the time. Um, and so the bulk of the 242 00:14:32,520 --> 00:14:35,160 Speaker 1: stuff that kind of creeps in is, like, this neglect, right. 243 00:14:35,240 --> 00:14:38,680 Speaker 1: It's like, some dominant idea in the narrative just hasn't 244 00:14:38,680 --> 00:14:42,040 Speaker 1: been updated, and, like, we need someone to update it. Um, 245 00:14:42,080 --> 00:14:45,520 Speaker 1: and that's, like, an organizing problem, right. That's, uh, like, 246 00:14:45,600 --> 00:14:47,960 Speaker 1: we need more people who are science literate, who speak 247 00:14:48,000 --> 00:14:51,040 Speaker 1: the local language, who understand how to edit Wikipedia. Um, 248 00:14:51,080 --> 00:14:53,680 Speaker 1: and that's trainable, like, we can do that. Yeah. The 249 00:14:53,720 --> 00:14:56,920 Speaker 1: reason that matters, the neglect matters, is it stops people 250 00:14:56,960 --> 00:15:00,160 Speaker 1: from making decisions about climate change, because they don't have, 251 00:15:00,680 --> 00:15:04,240 Speaker 1: like, an accurate sense of what we need to do, right, 252 00:15:04,640 --> 00:15:09,320 Speaker 1: which is cut the fossil fuels, increase resilience through 253 00:15:09,400 --> 00:15:15,480 Speaker 1: adaptation, like, actual political change, right. Um, and so 254 00:15:15,600 --> 00:15:19,240 Speaker 1: that's just, it's a problem. Um, the other stuff 255 00:15:19,280 --> 00:15:23,160 Speaker 1: is a bit more, it's a little bit more complicated. Um, 256 00:15:23,240 --> 00:15:28,080 Speaker 1: one of the things that happens is that, as you know, 257 00:15:28,320 --> 00:15:32,360 Speaker 1: there's quite a manipulation of narrative that has happened around 258 00:15:32,440 --> 00:15:36,640 Speaker 1: climate change. There's this really great podcast by Amy Westervelt 259 00:15:37,440 --> 00:15:41,400 Speaker 1: about how the fossil fuel industry, like, got its message 260 00:15:41,480 --> 00:15:44,200 Speaker 1: into schools over the last thirty years in the US, 261 00:15:44,680 --> 00:15:47,920 Speaker 1: and, like, that narrative is just so prevalent. And so 262 00:15:48,040 --> 00:15:51,200 Speaker 1: one of the things about Wikipedia is that we try to 263 00:15:51,240 --> 00:15:57,160 Speaker 1: do a balance of positions. If there are reputable sources, 264 00:15:57,480 --> 00:16:00,680 Speaker 1: kind of, describing or analyzing a topic, and this is 265 00:16:01,040 --> 00:16:06,040 Speaker 1: back to your polarization question too, if there are reputable sources 266 00:16:06,240 --> 00:16:08,480 Speaker 1: describing a topic, we try to give them equal weight 267 00:16:08,800 --> 00:16:13,280 Speaker 1: and balance across the article.
The problem with climate is 268 00:16:13,320 --> 00:16:17,360 Speaker 1: that some of the narratives that look like reputable sources 269 00:16:17,560 --> 00:16:22,640 Speaker 1: are just pumped out of fossil fuel industry funded think tanks, right, 270 00:16:23,080 --> 00:16:27,120 Speaker 1: and these things are not truthful narratives, right. Um, and 271 00:16:27,200 --> 00:16:31,240 Speaker 1: so the BBC ran an article, uh, two weeks ago, 272 00:16:32,080 --> 00:16:35,239 Speaker 1: on kind of climate denial in some of these smaller, 273 00:16:36,160 --> 00:16:40,600 Speaker 1: smaller language Wikipedias, and what they found was a 274 00:16:40,640 --> 00:16:43,920 Speaker 1: lot of these narratives being given equal weight with the 275 00:16:43,960 --> 00:16:48,400 Speaker 1: climate science. Um, and our community, 276 00:16:48,480 --> 00:16:51,960 Speaker 1: after that BBC article came out, started looking across all 277 00:16:52,000 --> 00:16:55,520 Speaker 1: the language Wikipedia articles, just the main climate change page, 278 00:16:55,920 --> 00:17:00,640 Speaker 1: and they found thirty one Wikipedias that had some of 279 00:17:00,680 --> 00:17:07,040 Speaker 1: that, like, equal weight of bad climate science. Interesting, yeah. Um, 280 00:17:07,160 --> 00:17:10,040 Speaker 1: and, you know, the BBC article only found, like, five 281 00:17:10,160 --> 00:17:13,040 Speaker 1: or ten, right, we found a lot more. Yeah, 282 00:17:13,200 --> 00:17:15,760 Speaker 1: yeah, yeah. And so it's, like, a 283 00:17:15,800 --> 00:17:20,880 Speaker 1: really, like, these narratives just seep in, and, you know, again, 284 00:17:20,920 --> 00:17:24,800 Speaker 1: I'm gonna go back to the Croatian example: like, if 285 00:17:25,000 --> 00:17:29,240 Speaker 1: your media environment has been locked down by a certain 286 00:17:29,280 --> 00:17:34,000 Speaker 1: political rhetoric too, those narratives might have traveled from, like, 287 00:17:34,040 --> 00:17:40,800 Speaker 1: the Anglosphere into these other spaces and then gotten stuck, right, 288 00:17:40,920 --> 00:17:43,920 Speaker 1: and it just, like, keeps getting recycled, and so that 289 00:17:44,040 --> 00:17:48,600 Speaker 1: causes delay. Um, and I was listening to your podcast 290 00:17:48,720 --> 00:17:54,160 Speaker 1: recently about soft climate denial. Like, this is what's happening 291 00:17:54,200 --> 00:17:59,240 Speaker 1: in other language environments, right, is people are rehearsing this misinformation. 292 00:18:00,119 --> 00:18:04,440 Speaker 1: It seems like a valid position because it's been rehearsed 293 00:18:04,600 --> 00:18:08,199 Speaker 1: so many times by folks. Some people who are 294 00:18:08,280 --> 00:18:13,360 Speaker 1: championing that position are, like, doing so unknowingly, and then, 295 00:18:13,359 --> 00:18:16,600 Speaker 1: in the process, it's kind of disconnected entirely from the 296 00:18:16,760 --> 00:18:20,160 Speaker 1: source of the information, and that is just, it's 297 00:18:20,280 --> 00:18:25,480 Speaker 1: really bad.
One interesting thing that I thought of when 298 00:18:25,480 --> 00:18:28,400 Speaker 1: you were bringing up sourcing, how sourcing itself becomes 299 00:18:28,400 --> 00:18:30,840 Speaker 1: an issue: like, in the States, there's kind of, like, 300 00:18:31,160 --> 00:18:34,080 Speaker 1: a joke, like, when people use just 301 00:18:34,200 --> 00:18:36,960 Speaker 1: Wikipedia as, like, as a source, to be like, they 302 00:18:37,040 --> 00:18:39,359 Speaker 1: just link the article. But, like, that is 303 00:18:39,400 --> 00:18:41,760 Speaker 1: the default for so many people when they begin 304 00:18:41,800 --> 00:18:43,960 Speaker 1: a research project, is like, okay, what's the, 305 00:18:44,040 --> 00:18:46,199 Speaker 1: what does Wikipedia have on it? What's the 306 00:18:46,200 --> 00:18:50,280 Speaker 1: sources Wikipedia uses, um, and kind of branch off from there? 307 00:18:50,320 --> 00:18:53,000 Speaker 1: It's a very common thing. So I'm not sure, 308 00:18:53,080 --> 00:18:55,280 Speaker 1: like, how different internet culture will be in 309 00:18:55,320 --> 00:18:59,239 Speaker 1: other countries. But if they, if 310 00:18:59,240 --> 00:19:03,000 Speaker 1: they don't have, like, the base sourcing necessary to create, 311 00:19:03,400 --> 00:19:07,040 Speaker 1: like, a decent home-page article, then just sourcing from 312 00:19:07,040 --> 00:19:10,240 Speaker 1: Wikipedia in the first place becomes so much harder. Um, 313 00:19:10,280 --> 00:19:12,639 Speaker 1: because, like you were saying, like, just use the 314 00:19:12,680 --> 00:19:15,640 Speaker 1: Guardian is, like, one of the things, like, that's 315 00:19:15,680 --> 00:19:18,240 Speaker 1: not horrible advice, but if it's only just from one thing, 316 00:19:18,600 --> 00:19:20,880 Speaker 1: then that's going to change the entire nature of, 317 00:19:21,359 --> 00:19:25,400 Speaker 1: like, coverage and information on specific topics. Yeah, yeah. 318 00:19:25,400 --> 00:19:27,359 Speaker 1: It's just a really interesting kind of thing that I 319 00:19:27,640 --> 00:19:31,480 Speaker 1: never thought of before, is how different countries' Wikipedias, or, 320 00:19:32,160 --> 00:19:35,919 Speaker 1: like, language Wikipedias, will have, like, different sources. 321 00:19:36,119 --> 00:19:39,720 Speaker 1: So then getting information from the page is just 322 00:19:39,760 --> 00:19:42,679 Speaker 1: going to be so different. And, like, yeah, like, 323 00:19:42,760 --> 00:19:45,720 Speaker 1: the whole, like, tier of sourcing is 324 00:19:45,760 --> 00:19:49,639 Speaker 1: just completely changed. Yeah. And I think, like, you know, 325 00:19:49,720 --> 00:19:53,760 Speaker 1: in medicine, most medical practitioners expect most of the 326 00:19:53,800 --> 00:19:56,520 Speaker 1: medical literature to be in a handful of languages, like 327 00:19:56,600 --> 00:19:59,040 Speaker 1: English and Chinese and that kind of stuff, right, and, 328 00:19:59,440 --> 00:20:02,320 Speaker 1: like, part of your professional work and part of, like, 329 00:20:02,440 --> 00:20:05,480 Speaker 1: saving people's lives is being able to use those sources.
330 00:20:05,520 --> 00:20:08,920 Speaker 1: And so if a medical Wikipedia article has a translation 331 00:20:09,080 --> 00:20:13,200 Speaker 1: from, like, an English article into another language, and you're 332 00:20:13,240 --> 00:20:16,760 Speaker 1: distributing that to medical practitioners, and they find the citation 333 00:20:17,000 --> 00:20:19,200 Speaker 1: and it's in English and they can go follow the source, 334 00:20:19,560 --> 00:20:22,119 Speaker 1: like, that's not such a big deal. But in 335 00:20:22,119 --> 00:20:26,480 Speaker 1: a topic like climate, um, where the vast majority of 336 00:20:26,480 --> 00:20:29,280 Speaker 1: the people that have to make decisions on this information 337 00:20:29,800 --> 00:20:32,879 Speaker 1: do not have access to other languages, maybe their access 338 00:20:32,880 --> 00:20:37,080 Speaker 1: to English is through, like, machine translation, Google or something 339 00:20:37,200 --> 00:20:41,760 Speaker 1: like that, like, not having sources in your local language, um, 340 00:20:42,880 --> 00:20:45,919 Speaker 1: or just having the sources that were translated from an 341 00:20:45,920 --> 00:20:49,280 Speaker 1: English Wikipedia article, which happens a lot on these smaller 342 00:20:49,359 --> 00:20:54,320 Speaker 1: language Wikipedias, is kind of, like, not helpful for climate 343 00:20:54,359 --> 00:20:58,240 Speaker 1: decision making. Um, and this is where it's, um, it's 344 00:20:58,280 --> 00:21:01,200 Speaker 1: easy, for example, in a lot of these, like, Eastern 345 00:21:01,280 --> 00:21:06,159 Speaker 1: European languages or Central Asian, uh, languages, for, like, a 346 00:21:06,280 --> 00:21:11,840 Speaker 1: politically spun news site opinion about something to kind of 347 00:21:11,880 --> 00:21:15,280 Speaker 1: creep in at the same level of, of, kind of, 348 00:21:15,720 --> 00:21:21,840 Speaker 1: uh, validity as scientific research, as the, 349 00:21:21,840 --> 00:21:26,960 Speaker 1: you know, consensus understanding of the climate crisis. So 350 00:21:27,080 --> 00:21:29,280 Speaker 1: how might, I know we talked about, like, 351 00:21:29,600 --> 00:21:33,560 Speaker 1: training for, like, journalists and people to start editing Wikipedias 352 00:21:33,760 --> 00:21:35,560 Speaker 1: in their language, but, like, how do we kind 353 00:21:35,560 --> 00:21:39,600 Speaker 1: of improve climate communication overall, with open access to information 354 00:21:39,800 --> 00:21:45,000 Speaker 1: and, you know, creating more linguistic, um, diversity and stuff? Yeah, well, 355 00:21:45,040 --> 00:21:48,760 Speaker 1: I think there's, like, a couple of opportunities, um, in 356 00:21:48,800 --> 00:21:51,920 Speaker 1: this, and then there's some other misinformation I also 357 00:21:51,920 --> 00:21:54,560 Speaker 1: want to talk about too, um, but I think that 358 00:21:54,640 --> 00:22:00,920 Speaker 1: this, the sourcing one, is a particularly challenging one. Um, 359 00:22:00,960 --> 00:22:06,359 Speaker 1: we need, like, more basic, science based climate communication in 360 00:22:06,440 --> 00:22:10,480 Speaker 1: more languages.
And I'm not saying, like, just the, 361 00:22:10,560 --> 00:22:13,440 Speaker 1: like, more languages like the big UN languages, 362 00:22:13,480 --> 00:22:16,600 Speaker 1: the ones that are kind of colonial, cross-cultural languages 363 00:22:16,640 --> 00:22:20,320 Speaker 1: like Spanish or French or Arabic, or, you know, all 364 00:22:20,359 --> 00:22:23,920 Speaker 1: these languages that have been used across cultures. We also 365 00:22:23,960 --> 00:22:26,639 Speaker 1: need it in local languages, um, and we need it to 366 00:22:26,640 --> 00:22:29,560 Speaker 1: be evidence based, and we need it to be audience based, right. 367 00:22:29,680 --> 00:22:33,960 Speaker 1: So if someone is, like, searching online in Swahili 368 00:22:34,440 --> 00:22:37,760 Speaker 1: about how, like, drought is happening in Kenya, right, or 369 00:22:37,840 --> 00:22:41,600 Speaker 1: Tanzania, or, you know, there's suddenly flooding, or, 370 00:22:41,800 --> 00:22:43,879 Speaker 1: like, I need to deal with X, Y, and Z 371 00:22:44,480 --> 00:22:49,440 Speaker 1: adaptation to the climate crisis, um, which is, by the way, 372 00:22:50,560 --> 00:22:53,360 Speaker 1: what all of the global South is doing right now, right, 373 00:22:53,440 --> 00:22:56,359 Speaker 1: like, the global South is having to adapt to this 374 00:22:56,520 --> 00:23:00,479 Speaker 1: crisis that polluting countries have made, and we're not 375 00:23:00,520 --> 00:23:04,000 Speaker 1: actually giving them the resources for the problem 376 00:23:04,040 --> 00:23:06,800 Speaker 1: that we've caused. Yeah. Well, it's not even, like, giving 377 00:23:06,840 --> 00:23:09,440 Speaker 1: the research. We're not even, like, the people who are, like, 378 00:23:09,640 --> 00:23:13,280 Speaker 1: we want to adapt our society, we're not resourcing the 379 00:23:13,359 --> 00:23:16,560 Speaker 1: folks on the ground who have the agency, who have 380 00:23:16,600 --> 00:23:18,560 Speaker 1: the understanding, who know how to do the research in 381 00:23:18,560 --> 00:23:20,760 Speaker 1: the context, who know how to do the communication in 382 00:23:20,760 --> 00:23:25,320 Speaker 1: the context, right. We're not even, like, bolstering their 383 00:23:25,359 --> 00:23:29,119 Speaker 1: request for help, right. Like, the UN 384 00:23:29,160 --> 00:23:33,159 Speaker 1: Climate Conference kind of failed on this adaptation funding, right. 385 00:23:33,520 --> 00:23:36,840 Speaker 1: And, uh, this is, you know, this is where, like, 386 00:23:36,840 --> 00:23:39,960 Speaker 1: a platform like Wikipedia, and, like, kind of approaching this 387 00:23:40,080 --> 00:23:43,360 Speaker 1: from a knowledge activist perspective, where you're like, there are 388 00:23:43,400 --> 00:23:48,600 Speaker 1: people who need this knowledge to, like, understand what's 389 00:23:48,600 --> 00:23:51,880 Speaker 1: happening around them so they can make decisions. That's, 390 00:23:52,000 --> 00:23:57,080 Speaker 1: like, you know, yeah, we need those kinds of information.
391 00:23:57,160 --> 00:24:00,600 Speaker 1: We need open source knowledge, not just Wikipedia, but on 392 00:24:00,600 --> 00:24:04,880 Speaker 1: all of the platforms, um. And, you know, you 393 00:24:05,040 --> 00:24:07,760 Speaker 1: all do open source investigation, and you're used to, like, 394 00:24:07,840 --> 00:24:11,320 Speaker 1: open source software communities, and I listened to a couple of 395 00:24:11,320 --> 00:24:14,000 Speaker 1: your podcasts, and you're kind of constantly speaking back to 396 00:24:14,040 --> 00:24:18,560 Speaker 1: those open communities that come out of, like, Anglophone 397 00:24:19,080 --> 00:24:24,000 Speaker 1: software spaces, um. And, like, we need to acknowledge that, 398 00:24:24,440 --> 00:24:27,000 Speaker 1: like, we figured out how to do knowledge, but we 399 00:24:27,119 --> 00:24:30,240 Speaker 1: haven't given out all those tools. We haven't transferred the knowledge 400 00:24:30,240 --> 00:24:33,200 Speaker 1: on how to do it. We haven't adapted those tools 401 00:24:33,480 --> 00:24:36,280 Speaker 1: to other parts of the world and other languages. Um, 402 00:24:36,440 --> 00:24:41,760 Speaker 1: and so just, like, starting to look for these other communities, 403 00:24:41,920 --> 00:24:45,480 Speaker 1: asking for the people, like, who's ready to organize, like, 404 00:24:45,720 --> 00:24:49,679 Speaker 1: giving them money to go do it, right? These things 405 00:24:49,720 --> 00:24:53,400 Speaker 1: are, like, really practical, um. And I think we're 406 00:24:53,440 --> 00:24:56,800 Speaker 1: often not listening, or we're not looking 407 00:24:56,840 --> 00:25:00,320 Speaker 1: for that solution. And remember, like, most of the people 408 00:25:00,359 --> 00:25:03,480 Speaker 1: having to adapt, um, are in the global South and 409 00:25:03,520 --> 00:25:07,639 Speaker 1: speak other languages. Like, we need to be there in 410 00:25:07,680 --> 00:25:10,600 Speaker 1: that language if we want the climate crisis to, like, 411 00:25:11,920 --> 00:25:19,960 Speaker 1: resolve itself without, you know, destroying people's lives. Yeah, absolutely, um. Yeah, 412 00:25:20,000 --> 00:25:22,480 Speaker 1: that's the thing we try to bring up, that 413 00:25:22,440 --> 00:25:25,840 Speaker 1: the people who are going to be initially worst affected are 414 00:25:25,840 --> 00:25:28,520 Speaker 1: the people who are already kind of not in the 415 00:25:28,560 --> 00:25:32,359 Speaker 1: greatest situation in the first place. That's, like, how, 416 00:25:32,880 --> 00:25:35,280 Speaker 1: like, the areas that are 417 00:25:35,320 --> 00:25:37,560 Speaker 1: gonna experience the most flooding, the most extreme weather events, 418 00:25:37,600 --> 00:25:39,680 Speaker 1: all this kind of stuff, it's 419 00:25:39,680 --> 00:25:42,280 Speaker 1: not starting with somewhere like New York City. It's starting 420 00:25:42,280 --> 00:25:44,840 Speaker 1: with areas that are already dealing with a lot of, 421 00:25:44,840 --> 00:25:48,000 Speaker 1: like, local issues, and now this is just something else 422 00:25:48,040 --> 00:25:51,080 Speaker 1: on top. And, yeah, fixing all of that is, uh, 423 00:25:52,440 --> 00:25:54,720 Speaker 1: I mean, fixing all of it's impossible. We can only 424 00:25:54,760 --> 00:25:59,320 Speaker 1: take, like, small adaptive steps to, like, mitigate some of 425 00:25:59,359 --> 00:26:01,920 Speaker 1: the worst effects.
And, yeah, I mean, that's 426 00:26:01,920 --> 00:26:04,840 Speaker 1: stuff that comes up a bunch. But you mentioned you 427 00:26:04,840 --> 00:26:08,320 Speaker 1: wanted to at least briefly mention, um, some other forms 428 00:26:08,320 --> 00:26:12,640 Speaker 1: of disinformation. Yeah, so we've also witnessed a couple 429 00:26:12,680 --> 00:26:16,280 Speaker 1: of times, um, where something will hit, like, breaking news, 430 00:26:16,520 --> 00:26:19,680 Speaker 1: or become a political position in a context, and then, 431 00:26:19,880 --> 00:26:22,520 Speaker 1: like, we will see bad actors show up on Wikipedia 432 00:26:22,600 --> 00:26:25,520 Speaker 1: and try to manipulate it. Um, I have two examples 433 00:26:25,520 --> 00:26:31,399 Speaker 1: of those. The first is, about a year ago, uh, 434 00:26:31,480 --> 00:26:36,000 Speaker 1: we found a group of accounts editing about some of 435 00:26:36,040 --> 00:26:42,760 Speaker 1: the inter-Amazonian highways that the Bolsonaro presidency is building 436 00:26:42,960 --> 00:26:46,840 Speaker 1: through the Amazon, um, where they were trying to 437 00:26:46,880 --> 00:26:52,640 Speaker 1: remove the environmental and Indigenous peoples', uh, impact assessments from 438 00:26:52,680 --> 00:26:57,160 Speaker 1: the Wikipedia articles. Uh, and so, like, basic human rights stuff, 439 00:26:57,480 --> 00:27:03,359 Speaker 1: basic, you know, healthy environment things that the government is, 440 00:27:03,400 --> 00:27:07,560 Speaker 1: like, expected to follow through on, were being, like, manipulated 441 00:27:07,640 --> 00:27:13,440 Speaker 1: out of the articles for a more, like, pro economic 442 00:27:13,520 --> 00:27:19,000 Speaker 1: growth narrative, um. And so, you know, the shift 443 00:27:19,040 --> 00:27:23,760 Speaker 1: towards this, like, very extreme right, like, 444 00:27:23,960 --> 00:27:29,160 Speaker 1: economic growth only version of reality, um, does play out 445 00:27:29,200 --> 00:27:31,320 Speaker 1: on the wiki. Now, we were lucky that this 446 00:27:31,440 --> 00:27:34,199 Speaker 1: was fairly, like, fairly easy to see once we 447 00:27:34,240 --> 00:27:38,320 Speaker 1: found it, but we had to coordinate across, um, English, 448 00:27:38,359 --> 00:27:43,080 Speaker 1: Spanish, and Portuguese to, like, address the problem. So 449 00:27:43,160 --> 00:27:45,480 Speaker 1: we need, like, multilingual communities who are kind of 450 00:27:45,520 --> 00:27:48,640 Speaker 1: coordinating and talking to each other to address that. Um, 451 00:27:48,680 --> 00:27:52,080 Speaker 1: the other thing we've seen is, like, so, did you, 452 00:27:52,880 --> 00:27:56,399 Speaker 1: I don't know how well you follow the climate movement, um, 453 00:27:56,440 --> 00:27:59,760 Speaker 1: but did you see when Disha Ravi got arrested in 454 00:28:00,160 --> 00:28:05,120 Speaker 1: India, by chance? I don't think so. So she's 455 00:28:05,160 --> 00:28:07,800 Speaker 1: a youth climate activist that was part of Fridays For 456 00:28:07,920 --> 00:28:11,320 Speaker 1: Future India, which is, like, a kind of sister 457 00:28:11,359 --> 00:28:15,240 Speaker 1: group of the group that formed in Europe around Greta Thunberg, right. 458 00:28:15,840 --> 00:28:23,640 Speaker 1: Um, and, uh, she, uh, um, her Gmail account got 459 00:28:23,640 --> 00:28:28,520 Speaker 1: attached to a Google Doc,
uh, just seen active on 460 00:28:28,520 --> 00:28:32,040 Speaker 1: a Google Doc that was about sharing social media about 461 00:28:32,040 --> 00:28:35,120 Speaker 1: the farmers' protests in India, which have been, 462 00:28:35,400 --> 00:28:39,320 Speaker 1: like, a real political sticking point issue. And I had 463 00:28:39,400 --> 00:28:43,320 Speaker 1: written, so I'm both a volunteer and a professional who 464 00:28:43,400 --> 00:28:46,120 Speaker 1: organizes the community, and in my volunteer time, I had 465 00:28:46,120 --> 00:28:50,560 Speaker 1: written the biography of Disha Ravi, like, months before the 466 00:28:50,600 --> 00:28:54,040 Speaker 1: Indian government kind of identified her with this social media 467 00:28:54,080 --> 00:28:58,880 Speaker 1: toolkit. And, um, when she got arrested for something 468 00:28:58,960 --> 00:29:04,120 Speaker 1: that's, like, just basic social organizing tactics, social media, um, 469 00:29:05,440 --> 00:29:10,640 Speaker 1: the kind of Hindu nationalist social media environment, like, zoomed 470 00:29:10,640 --> 00:29:14,200 Speaker 1: in on her Wikipedia article and on all these other 471 00:29:14,440 --> 00:29:18,520 Speaker 1: social media presences she had, and they tried to silence it, 472 00:29:19,120 --> 00:29:21,240 Speaker 1: um, be like, okay, we need to delete this article. 473 00:29:21,720 --> 00:29:25,360 Speaker 1: And, uh, fortunately, like, a group of us were watching 474 00:29:25,400 --> 00:29:27,680 Speaker 1: the page, and we caught it, and were able to 475 00:29:27,720 --> 00:29:31,040 Speaker 1: stop that. But there's the kind of 476 00:29:31,040 --> 00:29:35,320 Speaker 1: flash mob situation that happens a lot now in social media, 477 00:29:35,360 --> 00:29:40,000 Speaker 1: where it's, like, this thing has been polarized, now 478 00:29:40,040 --> 00:29:42,760 Speaker 1: we need to go attack it, um. And so you 479 00:29:42,800 --> 00:29:46,240 Speaker 1: can imagine, like, English Wikipedia has a healthy immune system 480 00:29:46,360 --> 00:29:48,680 Speaker 1: for this kind of stuff, it, like, sees it. 481 00:29:48,720 --> 00:29:50,440 Speaker 1: It has enough people that it can 482 00:29:50,640 --> 00:29:54,200 Speaker 1: do that. Yeah. Yeah, but you can imagine, on a 483 00:29:54,320 --> 00:29:58,280 Speaker 1: smaller wiki, that the narrative could shift and stay permanently 484 00:29:58,280 --> 00:30:02,960 Speaker 1: shifted quite quickly. Yeah, um, if that happened. And so 485 00:30:03,040 --> 00:30:05,400 Speaker 1: that's another concern, right. So there's, like, the subtle, 486 00:30:05,440 --> 00:30:08,360 Speaker 1: like, a few accounts just, like, quietly removing things, and 487 00:30:08,400 --> 00:30:12,520 Speaker 1: then, like, the active political, um, kind of intervention 488 00:30:12,680 --> 00:30:15,040 Speaker 1: that happens. And in terms of, like, disinformation, do you 489 00:30:15,040 --> 00:30:17,600 Speaker 1: see Wikipedia as being kind of susceptible to, like, 490 00:30:17,720 --> 00:30:23,560 Speaker 1: intentional disinformation campaigns, of people slowly kind of editing the 491 00:30:23,680 --> 00:30:27,600 Speaker 1: ideology of articles to push kind of some agenda, 492 00:30:27,720 --> 00:30:29,920 Speaker 1: whether that be, like, individually, or, like, you know, 493 00:30:30,200 --> 00:30:33,040 Speaker 1: more of, like, a crowd operation, um, or even, like, 494 00:30:33,440 --> 00:30:36,720 Speaker 1: run by, like, people with political power, um?
Like, do 495 00:30:36,720 --> 00:30:38,720 Speaker 1: you, how much of a risk do you see 496 00:30:38,760 --> 00:30:40,720 Speaker 1: of that with this kind of open source idea, 497 00:30:41,040 --> 00:30:45,560 Speaker 1: of, like, intentional slow dissemination of disinformation on, like, 498 00:30:45,800 --> 00:30:50,320 Speaker 1: important articles and stuff? Well, so I think I might 499 00:30:50,400 --> 00:30:55,200 Speaker 1: reframe your question a little bit, like, uh, all open 500 00:30:55,320 --> 00:30:59,240 Speaker 1: source kind of knowledge spaces are susceptible to that, right. Um, 501 00:30:59,280 --> 00:31:02,960 Speaker 1: the question is to, like, what degree, and how 502 00:31:03,000 --> 00:31:07,200 Speaker 1: harmful is it going to be, right? Um, like, is 503 00:31:07,240 --> 00:31:10,520 Speaker 1: it, like, very open to this, and will 504 00:31:10,560 --> 00:31:14,000 Speaker 1: it cause a lot of problems? Um, the bigger language 505 00:31:14,000 --> 00:31:17,520 Speaker 1: Wikipedias have healthy immune systems, where we have a 506 00:31:17,520 --> 00:31:20,680 Speaker 1: combination of kind of AI driven bots 507 00:31:20,720 --> 00:31:22,880 Speaker 1: that flag bad edits, and then we have a lot 508 00:31:22,920 --> 00:31:25,800 Speaker 1: of community patrolling happening. And even in some of the 509 00:31:25,880 --> 00:31:29,640 Speaker 1: smaller communities that have, like, medium sized editor communities, like 510 00:31:29,720 --> 00:31:33,920 Speaker 1: Swedish Wikipedia, it doesn't take a lot for that local 511 00:31:34,000 --> 00:31:37,800 Speaker 1: language community to patrol the pages and, like, be like, 512 00:31:37,880 --> 00:31:41,240 Speaker 1: oh, okay, um, these changes are kind of weird, I 513 00:31:41,240 --> 00:31:44,200 Speaker 1: can roll it back, um, like, this doesn't seem like 514 00:31:44,280 --> 00:31:48,160 Speaker 1: it fits our culture of Wikipedia. The problem is when 515 00:31:48,400 --> 00:31:52,560 Speaker 1: a language Wikipedia has very few editors and they're not 516 00:31:52,880 --> 00:31:57,080 Speaker 1: active all the time. Um, and so this is 517 00:31:57,320 --> 00:32:00,640 Speaker 1: where we need kind of more eyes on the content, 518 00:32:00,840 --> 00:32:03,640 Speaker 1: right, because it's very easy for, like, a really 519 00:32:03,720 --> 00:32:06,880 Speaker 1: small language community to kind of have a little bit 520 00:32:06,880 --> 00:32:10,239 Speaker 1: of content but never see it maintained. Um, and 521 00:32:10,280 --> 00:32:14,160 Speaker 1: this is where, like, where our communities are 522 00:32:14,200 --> 00:32:16,440 Speaker 1: forming around these languages, like a lot of the West 523 00:32:16,480 --> 00:32:20,160 Speaker 1: African languages, for example, our communities are kind 524 00:32:20,160 --> 00:32:23,840 Speaker 1: of organizing, and we, like, invest in those communities 525 00:32:23,880 --> 00:32:27,680 Speaker 1: existing, and, like, figuring out the governance, and training people 526 00:32:27,720 --> 00:32:29,760 Speaker 1: how to edit, and getting access to the kind of 527 00:32:29,760 --> 00:32:33,480 Speaker 1: technical skills to do this. Um,
and, you know, we 528 00:32:33,880 --> 00:32:37,160 Speaker 1: have kind of systems that we're hoping, over the next 529 00:32:37,200 --> 00:32:40,560 Speaker 1: few years, invest in that resilience, right, like building a 530 00:32:40,560 --> 00:32:43,200 Speaker 1: code of conduct, making it easier for communities to see 531 00:32:43,200 --> 00:32:49,080 Speaker 1: this kind of stuff. But it is three hundred languages, right, um. Yeah, 532 00:32:49,200 --> 00:32:53,640 Speaker 1: and it is a volunteer built system, and you do 533 00:32:53,800 --> 00:32:58,160 Speaker 1: need a healthy editorial community in order to keep a 534 00:32:58,240 --> 00:33:02,040 Speaker 1: wiki from, like, drifting too much. Um, so, a 535 00:33:02,080 --> 00:33:04,680 Speaker 1: good example, I'm going to keep referencing Croatian, because 536 00:33:04,680 --> 00:33:07,960 Speaker 1: it's the one we've done research on: like, it was 537 00:33:08,040 --> 00:33:13,120 Speaker 1: possible for a few people to push people who are 538 00:33:13,160 --> 00:33:17,520 Speaker 1: more in consensus with the global position on various topics 539 00:33:18,120 --> 00:33:23,760 Speaker 1: out of the wiki. Um, and that's just, like, we 540 00:33:23,760 --> 00:33:28,360 Speaker 1: have to find a balance between, like, local language, uh, 541 00:33:28,520 --> 00:33:30,840 Speaker 1: and this is my personal opinion, right, we need to 542 00:33:30,880 --> 00:33:34,760 Speaker 1: find a balance between kind of local language sovereignty on 543 00:33:34,840 --> 00:33:41,200 Speaker 1: this stuff, and also not, like, radicalizing its topical environment. 544 00:33:41,200 --> 00:33:43,840 Speaker 1: And we see this particularly on impactful topics, right, 545 00:33:43,880 --> 00:33:48,320 Speaker 1: like ones that directly affect, like, politics, or, in the 546 00:33:48,360 --> 00:33:53,040 Speaker 1: case of the climate crisis, like, people's livelihoods and ability to function 547 00:33:53,440 --> 00:33:57,280 Speaker 1: in society, right. Um, and we just, like, we need 548 00:33:57,320 --> 00:33:59,760 Speaker 1: to be cautious about that. But, you know, Wikipedia 549 00:33:59,800 --> 00:34:02,880 Speaker 1: is a common resource, and I think this is really important. 550 00:34:02,920 --> 00:34:06,720 Speaker 1: Like, the way Wikipedia works is, you know, the Wikimedia 551 00:34:06,760 --> 00:34:10,120 Speaker 1: Foundation provides the servers, we fund our communities, we support them, 552 00:34:10,160 --> 00:34:14,200 Speaker 1: we help them work through governance issues, but 553 00:34:14,680 --> 00:34:17,680 Speaker 1: we need editorial communities to maintain it. That's what 554 00:34:17,719 --> 00:34:23,120 Speaker 1: those two hundred eighty thousand people are doing as volunteers: they're 555 00:34:23,160 --> 00:34:27,400 Speaker 1: building an editorial practice that makes the content work. Um, 556 00:34:27,440 --> 00:34:31,000 Speaker 1: and we need that, um. And so we 557 00:34:31,040 --> 00:34:34,360 Speaker 1: need, you know, like minded communities, like the people from 558 00:34:34,400 --> 00:34:36,480 Speaker 1: your podcast, who are like, oh, we need the 559 00:34:36,520 --> 00:34:39,359 Speaker 1: Internet to be reliable and have accurate information, 560 00:34:39,440 --> 00:34:42,000 Speaker 1: to show up a lot. Um, because if we don't do that, 561 00:34:42,280 --> 00:34:56,040 Speaker 1: it's really, like, it's the common resource.
We 562 00:34:56,040 --> 00:34:59,799 Speaker 1: have a decent international listening base as well, um. 563 00:35:00,080 --> 00:35:03,160 Speaker 1: I'm thinking, like, what would you recommend to people, 564 00:35:03,600 --> 00:35:06,799 Speaker 1: you know, in different countries, or even people inside, 565 00:35:06,920 --> 00:35:10,200 Speaker 1: kind of, like, uh, you know, the States, America, Canada, 566 00:35:10,320 --> 00:35:13,439 Speaker 1: the UK, who are, like, multilingual? Would you at least 567 00:35:13,480 --> 00:35:17,480 Speaker 1: encourage them to browse other language Wikipedias and maybe start 568 00:35:17,520 --> 00:35:22,000 Speaker 1: making edits when they see this type of misinformation popping up? Yeah, 569 00:35:22,160 --> 00:35:26,239 Speaker 1: so I have two kind of perspectives on this one. Um, 570 00:35:26,760 --> 00:35:29,880 Speaker 1: look for a local organized community. So we have 571 00:35:29,920 --> 00:35:34,880 Speaker 1: what's called Wikimedia affiliates. These are a hundred and fifty organizations around the world. 572 00:35:35,280 --> 00:35:39,000 Speaker 1: They regularly run events, especially now that we're leaving COVID, 573 00:35:39,560 --> 00:35:43,080 Speaker 1: increasingly more in person events. They train folks. Like, look 574 00:35:43,120 --> 00:35:47,120 Speaker 1: for them in your context, and if you need help finding one, 575 00:35:47,239 --> 00:35:49,400 Speaker 1: you know, find me on Twitter and I can connect 576 00:35:49,440 --> 00:35:53,840 Speaker 1: you with those communities. Um, and the other part is 577 00:35:54,000 --> 00:35:56,439 Speaker 1: small edits. So I think a lot of people look 578 00:35:56,440 --> 00:36:01,640 Speaker 1: at Wikipedia and they think about, like, a traditional publishing firm, right, like, oh, 579 00:36:01,920 --> 00:36:03,759 Speaker 1: you know, I have to write the whole article. 580 00:36:03,880 --> 00:36:05,520 Speaker 1: Yeah, yeah, yeah, I have to be a master. 581 00:36:05,719 --> 00:36:08,400 Speaker 1: And the secret sauce to all of this is, 582 00:36:08,440 --> 00:36:11,839 Speaker 1: like, most people start with one citation, one comma, one 583 00:36:11,880 --> 00:36:15,200 Speaker 1: typo fix, and they do a handful of those 584 00:36:15,239 --> 00:36:17,799 Speaker 1: a month, and then they keep coming back, and as 585 00:36:17,840 --> 00:36:19,920 Speaker 1: you do those small edits, you start reading the content 586 00:36:19,920 --> 00:36:23,160 Speaker 1: more carefully and fixing the things you can fix. And 587 00:36:23,280 --> 00:36:26,840 Speaker 1: so I recommend going in to, like, add one citation. 588 00:36:27,080 --> 00:36:29,919 Speaker 1: Like, if you go and add one citation today, that, 589 00:36:30,040 --> 00:36:33,080 Speaker 1: like, makes life better, or you fix the communication of 590 00:36:33,160 --> 00:36:36,719 Speaker 1: a sentence. Um, the other part of it is, you know, 591 00:36:37,160 --> 00:36:40,160 Speaker 1: I said there's these organized groups, uh. For the climate 592 00:36:40,200 --> 00:36:43,520 Speaker 1: in particular, I run this campaign called Wiki for Human Rights, 593 00:36:44,080 --> 00:36:48,520 Speaker 1: which is focused on, um,
it's a theme that 594 00:36:48,560 --> 00:36:50,719 Speaker 1: we kind of identified with UN Human Rights on 595 00:36:50,760 --> 00:36:53,520 Speaker 1: the right to a healthy environment, which is this new 596 00:36:54,000 --> 00:36:57,320 Speaker 1: human right that has been acknowledged by the Human Rights Council. 597 00:36:57,800 --> 00:37:02,440 Speaker 1: And we're organizing kind of writing contests and edit-a-thons 598 00:37:02,719 --> 00:37:05,160 Speaker 1: and kind of trainings for communities to go and look 599 00:37:05,200 --> 00:37:07,840 Speaker 1: for the human dimension of the climate crisis. So I 600 00:37:07,880 --> 00:37:10,759 Speaker 1: think when we think about climate communication, a lot of 601 00:37:10,800 --> 00:37:13,759 Speaker 1: people are like, science, right? They're like, oh, this is, 602 00:37:13,880 --> 00:37:17,920 Speaker 1: you know, about how weather systems work and how the 603 00:37:17,960 --> 00:37:23,040 Speaker 1: atmosphere forms and that stuff. And the content that's 604 00:37:23,080 --> 00:37:26,120 Speaker 1: more impactful is this, like, human-inflected stuff: like, how 605 00:37:26,239 --> 00:37:30,600 Speaker 1: does the climate crisis affect you as an individual, 606 00:37:30,760 --> 00:37:34,319 Speaker 1: and agriculture, and the cities you live in, and the 607 00:37:34,400 --> 00:37:39,520 Speaker 1: clothing you buy, and the manufactured goods, or the mine around 608 00:37:39,560 --> 00:37:43,560 Speaker 1: the corner that's producing water pollution that's gonna harm your 609 00:37:43,640 --> 00:37:47,239 Speaker 1: children for the next thirty years, right? Um. And 610 00:37:47,280 --> 00:37:50,600 Speaker 1: that is the kind of stuff that we're encouraging communities 611 00:37:50,600 --> 00:37:52,960 Speaker 1: to pay attention to. It is more the, like, 612 00:37:53,120 --> 00:37:59,080 Speaker 1: justice- and human-rights-oriented perspective on these topics. And 613 00:37:59,200 --> 00:38:02,239 Speaker 1: your cat is very cute. Yeah, every once in a 614 00:38:02,239 --> 00:38:07,480 Speaker 1: while they love to take the camera. Um, 615 00:38:07,600 --> 00:38:10,880 Speaker 1: and so yeah. So if you follow me on Twitter, 616 00:38:11,000 --> 00:38:13,280 Speaker 1: I can hook you up with that campaign 617 00:38:13,440 --> 00:38:17,279 Speaker 1: as well. Yeah. Yeah. Where can people find you 618 00:38:17,520 --> 00:38:20,360 Speaker 1: online, and learn more about, you know, the 619 00:38:20,440 --> 00:38:24,400 Speaker 1: various kind of topics we've discussed today? So, um, 620 00:38:24,800 --> 00:38:28,920 Speaker 1: if you're interested in climate change stuff on Wikipedia, English 621 00:38:28,960 --> 00:38:32,960 Speaker 1: Wikipedia has a wonderful WikiProject Climate Change that has 622 00:38:33,000 --> 00:38:35,360 Speaker 1: a little tab at the top. So if you search WikiProject 623 00:38:35,400 --> 00:38:38,840 Speaker 1: Climate Change on Google, you'll find there's a 624 00:38:38,880 --> 00:38:41,160 Speaker 1: tab at the top that says get started with easy 625 00:38:41,320 --> 00:38:43,680 Speaker 1: edits, and that kind of can get you oriented to, 626 00:38:43,840 --> 00:38:47,520 Speaker 1: like, where can you affect English Wikipedia on this? And, 627 00:38:47,800 --> 00:38:49,759 Speaker 1: you know, once you find a gap on English, it's 628 00:38:49,800 --> 00:38:53,480 Speaker 1: easy to find it on other languages. Um.
For, 629 00:38:53,600 --> 00:38:55,640 Speaker 1: kind of, learning about Wiki for Human Rights, you can 630 00:38:55,680 --> 00:38:58,760 Speaker 1: search for that, um, and/or follow me on Twitter. 631 00:38:59,080 --> 00:39:04,400 Speaker 1: Um, S A D A D S, sadads, on Twitter. Um. Uh, 632 00:39:04,640 --> 00:39:08,160 Speaker 1: we also have a group called Wikimedians for Sustainable Development 633 00:39:08,719 --> 00:39:11,520 Speaker 1: who's kind of communicating on Twitter, which is the group 634 00:39:11,560 --> 00:39:15,640 Speaker 1: that's really focused on sustainability topics more generally. Um. And, 635 00:39:15,800 --> 00:39:19,760 Speaker 1: you know, the other way to look is: find something 636 00:39:19,800 --> 00:39:22,799 Speaker 1: you've been reading about the climate crisis or sustainability 637 00:39:22,840 --> 00:39:26,879 Speaker 1: issues in the news. Look it up on Wikipedia. See 638 00:39:26,880 --> 00:39:30,000 Speaker 1: if it's missing. Um. If it's not there, click the edit 639 00:39:30,000 --> 00:39:33,560 Speaker 1: button, add a sentence. Right? Um. A good example of this: 640 00:39:34,200 --> 00:39:38,480 Speaker 1: I learned about a park in, uh, the center of 641 00:39:38,600 --> 00:39:42,400 Speaker 1: Nairobi that's being protested by environmental activists because some of 642 00:39:42,400 --> 00:39:47,839 Speaker 1: the big trees were being cut down. Uhuru Park, right? Uh, 643 00:39:48,000 --> 00:39:50,839 Speaker 1: this came by on my Twitter feed. Like, I'm not 644 00:39:51,000 --> 00:39:55,120 Speaker 1: connected to this at the moment, right? Um. But because 645 00:39:55,160 --> 00:39:57,160 Speaker 1: I had news sources, I had three or four news 646 00:39:57,160 --> 00:39:59,799 Speaker 1: sources, I could say really simply: in twenty twenty-one, 647 00:39:59,840 --> 00:40:03,520 Speaker 1: the park came under scrutiny for a renovation that included removing 648 00:40:03,520 --> 00:40:08,640 Speaker 1: old trees. That's a climate action, right? Uh. And I 649 00:40:08,680 --> 00:40:12,240 Speaker 1: think, you know, I am constantly overwhelmed by the climate 650 00:40:12,280 --> 00:40:15,759 Speaker 1: crisis, as a lot of people are. Yeah, yeah. And, 651 00:40:16,320 --> 00:40:23,680 Speaker 1: like, just being able to tell that little story, like, hey, um, 652 00:40:23,880 --> 00:40:28,160 Speaker 1: the decisions people are making are not productive here, right? Um, 653 00:40:28,200 --> 00:40:32,160 Speaker 1: just gathering that story is important. And what's important 654 00:40:32,200 --> 00:40:35,359 Speaker 1: is Wikipedia plays the role of institutional memory on this, right? I feel like, 655 00:40:35,480 --> 00:40:37,759 Speaker 1: you know, a lot of activist work 656 00:40:37,840 --> 00:40:41,680 Speaker 1: is very temporal. It's very, like, in that moment, right? Um. 657 00:40:41,760 --> 00:40:45,000 Speaker 1: And if it doesn't get documented on Wikipedia, the local 658 00:40:45,040 --> 00:40:49,520 Speaker 1: news sources are gonna get lost in the winds of time. Um.
659 00:40:49,840 --> 00:40:52,600 Speaker 1: And so I think, you know, if you do 660 00:40:52,680 --> 00:40:56,720 Speaker 1: your little activist motion, like, a sentence describing what happened 661 00:40:57,560 --> 00:41:01,160 Speaker 1: in a moment where resistance was happening is, like, a 662 00:41:01,280 --> 00:41:05,760 Speaker 1: huge step forward, right? Because it connects the environmental crisis, 663 00:41:05,760 --> 00:41:09,960 Speaker 1: climate crisis, human rights issues to, like, daily lives. Like, 664 00:41:10,040 --> 00:41:13,680 Speaker 1: people look up this park, probably on Google, because they 665 00:41:13,719 --> 00:41:16,279 Speaker 1: want to go there, right? Or they read about it 666 00:41:16,400 --> 00:41:19,000 Speaker 1: because people are like, when was it created? What was 667 00:41:19,080 --> 00:41:21,920 Speaker 1: that protest that happened there the other day? And if 668 00:41:21,960 --> 00:41:25,240 Speaker 1: that source isn't there, um, then it doesn't really exist 669 00:41:25,280 --> 00:41:27,439 Speaker 1: in their minds. Yeah, it doesn't exist in their minds. 670 00:41:27,600 --> 00:41:30,520 Speaker 1: And I think that's, like, one of the big issues 671 00:41:30,520 --> 00:41:33,960 Speaker 1: with the climate crisis, and, you know, amplified even worse in 672 00:41:34,040 --> 00:41:38,640 Speaker 1: other languages, right: that people aren't making that connection. 673 00:41:38,920 --> 00:41:41,960 Speaker 1: They aren't seeing it around them, and they're not, you know, 674 00:41:42,320 --> 00:41:49,040 Speaker 1: kind of connecting action to how we address it. That, uh, 675 00:41:49,120 --> 00:41:50,839 Speaker 1: that is a really good point. 676 00:41:50,880 --> 00:41:53,840 Speaker 1: And yeah, I mean, I will encourage everybody to 677 00:41:53,880 --> 00:41:56,240 Speaker 1: start making small edits. That's what I did for 678 00:41:56,640 --> 00:41:59,240 Speaker 1: a long time before I moved into, like, open source, 679 00:41:59,719 --> 00:42:03,680 Speaker 1: um, journalism and reporting. It's a great way to get started, 680 00:42:03,719 --> 00:42:05,920 Speaker 1: and it's a great way to just start 681 00:42:05,960 --> 00:42:08,520 Speaker 1: disseminating small bits of information, because the only thing that 682 00:42:08,560 --> 00:42:11,640 Speaker 1: we can really do as people is small steps. We 683 00:42:11,719 --> 00:42:14,759 Speaker 1: can have, like, an adaptive goal in mind, but you 684 00:42:14,800 --> 00:42:18,279 Speaker 1: need to take small steps to get there. And that 685 00:42:18,440 --> 00:42:21,399 Speaker 1: is a really great way to start influencing the way 686 00:42:21,440 --> 00:42:27,160 Speaker 1: people think about climate and our situation. Um. Yeah, and 687 00:42:27,160 --> 00:42:29,560 Speaker 1: I think, too, you know, your podcast kind 688 00:42:29,560 --> 00:42:31,600 Speaker 1: of appeals to folks who are interested in, like, finding 689 00:42:31,640 --> 00:42:35,080 Speaker 1: the truth and reality, right? And that's, like, 690 00:42:35,800 --> 00:42:40,080 Speaker 1: that investigation is what a Wikipedia article is. It 691 00:42:40,320 --> 00:42:44,600 Speaker 1: is, like, one, ten, a hundred editors out there in the 692 00:42:44,640 --> 00:42:46,920 Speaker 1: world trying to go, like, what the heck is this 693 00:42:47,040 --> 00:42:51,200 Speaker 1: topic about? Right? How do I compile my notes?
Uh, 694 00:42:51,239 --> 00:42:54,040 Speaker 1: in a way that helps other people? And I think, 695 00:42:54,040 --> 00:42:57,799 Speaker 1: in the face of the climate crisis, Dr. Ayana Elizabeth Johnson says, like, 696 00:42:57,880 --> 00:42:59,759 Speaker 1: find the thing you're good at, find the thing you're 697 00:42:59,760 --> 00:43:02,360 Speaker 1: passionate about, and find the thing that, like, 698 00:43:02,520 --> 00:43:05,680 Speaker 1: makes you feel good and is rewarding. 699 00:43:05,920 --> 00:43:08,960 Speaker 1: And find the thing that actually, like, helps affect the 700 00:43:09,000 --> 00:43:12,799 Speaker 1: climate crisis. Right? And a small edit on Wikipedia meets 701 00:43:12,840 --> 00:43:16,520 Speaker 1: your kind of knowledge needs. It's very satisfying, because people 702 00:43:16,560 --> 00:43:20,799 Speaker 1: will read it, and it is incremental change in the 703 00:43:20,880 --> 00:43:26,160 Speaker 1: right direction. Right? People will make decisions on it. Uh. Yeah, 704 00:43:26,239 --> 00:43:28,040 Speaker 1: I mean, and I guess, uh, I think that 705 00:43:28,120 --> 00:43:31,120 Speaker 1: probably closes this up today. Is there 706 00:43:31,239 --> 00:43:34,280 Speaker 1: anything else to add? Um, I guess one more plug 707 00:43:34,360 --> 00:43:37,520 Speaker 1: for your Twitter so we can get more eyeballs 708 00:43:37,560 --> 00:43:40,840 Speaker 1: on you, um, and the work that you're doing. Yeah. 709 00:43:41,000 --> 00:43:45,480 Speaker 1: Um, so, at S A D A D S. It's 710 00:43:45,560 --> 00:43:49,600 Speaker 1: my long-term handle on the Internet, and you 711 00:43:49,640 --> 00:43:53,280 Speaker 1: can find me all over the place. Uh, and I tweet about 712 00:43:53,320 --> 00:43:56,480 Speaker 1: Wikipedia and the climate crisis. And we'll link 713 00:43:56,560 --> 00:44:00,640 Speaker 1: the Wikipedia WikiProject Climate Change page in the description 714 00:44:01,000 --> 00:44:03,480 Speaker 1: for people to find. Thank you so much for taking 715 00:44:03,520 --> 00:44:07,239 Speaker 1: time to talk to us all about these topics. Um, 716 00:44:07,320 --> 00:44:10,719 Speaker 1: I'm really, really grateful to have this 717 00:44:10,800 --> 00:44:15,200 Speaker 1: type of knowledge readily accessible to more people, also, you know, 718 00:44:15,440 --> 00:44:20,400 Speaker 1: in the spirit of Wikipedia. Thank you so much. Um, 719 00:44:20,440 --> 00:44:23,600 Speaker 1: you can follow us by subscribing to the feed and 720 00:44:23,680 --> 00:44:26,439 Speaker 1: on Twitter and Instagram at Happen Here Pod and Cool 721 00:44:26,520 --> 00:44:33,160 Speaker 1: Zone Media. See you on the other side, everybody. It 722 00:44:33,239 --> 00:44:35,480 Speaker 1: Could Happen Here is a production of Cool Zone Media. 723 00:44:35,719 --> 00:44:38,400 Speaker 1: For more podcasts from Cool Zone Media, visit our website 724 00:44:38,400 --> 00:44:40,560 Speaker 1: coolzonemedia.com, or check us out on 725 00:44:40,600 --> 00:44:43,120 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 726 00:44:43,160 --> 00:44:45,960 Speaker 1: listen to podcasts. You can find sources for It Could 727 00:44:45,960 --> 00:44:48,959 Speaker 1: Happen Here, updated monthly, at coolzonemedia.com 728 00:44:49,000 --> 00:44:50,920 Speaker 1: slash sources. Thanks for listening.