Speaker 1: Hey, this is Annie and Samantha.
Speaker 2: And welcome to Stuff Mom Never Told You, a production of iHeartRadio.
Speaker 3: And today we are having what I would call a wonderful fantasy crossover for me specifically, because my TikTok world is coming onto the Sminty world. That is correct, and I'm so excited, because we have a bioethicist and expert in the field of public health.
Speaker 1: And okay, first of all, that's just one of...
Speaker 3: The many hats that they wear, so I'm just gonna leave it at that. But I'm so excited. Evan, welcome to Sminty, welcome to the show.
Speaker 4: Hi, thanks for having me.
Speaker 1: Can you introduce yourself to our listeners?
Speaker 4: I can. My name is Evan, and I go by Evan the Bioethicist on TikTok. I work in public health, but I also have a background in all things related to misinformation, disinformation, and conspiracy theories as they affect public health. And this includes things like biotechnology, AI, and media, so much.
Speaker 3: And you just did, like, a very succinct introduction, because as I was looking you up, because I know you from TikTok, the discovery of TikTok, because the listeners know I'm obsessed. But when I was reading through the articles that you have, like the education, your background, your career field, I'm like, my gosh, I think we need... well, you were talking about wanting to expand. I think you need a podcast, and if you want to, come on to ours more often.
Speaker 4: I don't know anything about the podcasting world. People have said I need a podcast. I don't know what that entails. I am open to learning, though.
Speaker 3: Hey, I love that you're stepping in with us, because I do feel like, again, this is one of those moments where I fangirl a little too much, because I'm like, I've seen them so often on my feed and they tell me great information. Now they're here and I'm just gonna pick their brains. Which I've already warned you about. I was like, be ready, questions.
Speaker 4: I'm open for it. I'm available.
Speaker 3: Good. Like I said, I'm so excited to talk about your work and your experiences and just your overall expertise, because there's a lot going on in the world.
Speaker 4: A lot.
Speaker 2: Yes, that is true. Yes, it definitely is.
Speaker 4: A lot of people have told me that my research, which I finished school about two years ago now, was very prescient, and I'm like, you know, yeah, I didn't realize how much. So, but...
Speaker 2: You know, yeah, well, here you are. Can you, for those of us that might not know, give us a brief explanation of what a bioethicist is and does?
Speaker 4: Absolutely. I know, for, you know, your listeners and my mom, I'll tell you, right? So, a bioethicist. There's this thing called medical humanities, and in the sciences there's, like, the science humanities, and so essentially those are folks who do the social science part of, like, hard sciences. I don't really love calling things hard and soft sciences, but you know. So a bioethicist is someone who has a background and training in things like decision making and policy. We learn about qualitative methodology when it comes to research. We talk a lot about things like consent when it comes to research subjects. So we talk a lot about ethical design and principles. That can range from everything like transplantation ethics and policy to clinical bioethics, which is helping people decide things around, like, religion, beliefs, whether people want to have a DNR or DNI. There's so much to it, and every bioethicist is very different. A lot of bioethicists combine their bioethics training with, like, a medical degree, like being an MD, or being a JD. So you'll see a lot of bioethicists who cosplay as lawyers and doctors.
But we're many things, which is why you don't hear about them as often, because oftentimes, you know, people know what a lawyer is, people know what a physician or a doctor is, and so they don't really know that the person that they know who's a lawyer or a doctor is a bioethicist, or practices bioethics, rather than practicing medicine or law.
Speaker 1: That makes sense. Thank you, because I'm not gonna lie...
Speaker 3: It wasn't until I again stumbled on your page that I was like, what, what is this? Oh.
Speaker 1: That makes sense. Like, it makes sense when you...
Speaker 3: Hear it after the fact, you're like, yeah, that's necessary.
Speaker 1: I like this.
Speaker 4: And to be fair, we've only started to endeavor bioethicist as a standalone profession, and having started to endeavor bioethics in public health, it's not super, super common that someone is like, I'm just a bioethicist, that is my training fully, that is what I'm interested in, and I work in public health. Those things are super rare. Now they're growing, they're ever growing, and there's more and more of us every year.
Speaker 3: But...
Speaker 4: Newer.
Speaker 1: So what made you choose this work exactly?
Speaker 4: Originally, historically, in undergrad I had been pre-med, and then somebody told me that I had to do rotations where I had to treat patients. And I never wanted to do that. I know that sounds kooky, but I had always wanted to have the training in medicine; I never wanted to treat people. I had thought I would go into things like forensics, and then I realized I actually really like the more expansive, or like population-health-based, work that comes with public health. But I still love the medical humanities. I still love talking to people about, like, what does it mean to think about xenotransplantation.
I love telling people about how important horseshoe crabs are to protecting us from having unsterile medications, especially cancer medications, which are injectables. Like, that fuels me, these sort of weird, quirky facts. But also the complexity that comes with, like, thought and decision making. A lot of people don't think about how does policy happen, how do decisions happen, what influences medicine? And I love talking about those things. I love learning about them. Recently, I know, Samantha, since you watch my TikTok, you know how much I've been delving into, like, disgust as a philosophical concept.
Speaker 1: So yeah, I'm not gonna lie. That one went over my head.
Speaker 3: I had to sit there, and like, as you were talking about these series, I had to go back, and I was...
Speaker 1: Like, wait, what, huh?
Speaker 3: Because that in itself as a question doesn't rise in my head.
Speaker 1: Does that make sense? I was like, oh, huh, huh. So there's so...
Speaker 3: Many. Okay, obviously, again with the fangirling, like, there are so many moments where I've had to take your video, which is a little longer than your typical ten-second or fifteen-second run-up, because you give so much information, and I have to go back and listen a few times, and each time I feel like I'm catching onto something different and a new lesson. So it does feel like I'm getting a good school learning when I'm listening to your stuff. But again, it raises new questions where I'm like, I haven't even thought of that one.
Speaker 4: And that's the idea. I mean, when I originally started to endeavor to create my page and really curate it with information, I was thinking critically around, like, how can this be an educational tool? How can people use this? Can people revisit what I've said?
And so I love that people say that they come back and watch a video over and over, because, yeah, a lot of these things are not things you're going to consume once and be like, oh, I got it, that's good. Reflection is part of it, right? Like, that's, like, you know.
Speaker 2: Yeah, yeah. I mean, obviously, if you're an expert in something, it makes sense that somebody isn't going to watch one video and be like, yes, understood. So, with everything that we've alluded to that is happening today, especially in the US perhaps, and as you mentioned, you have expertise in combating misinformation and disinformation when it comes to public health. That seems like a really huge task. Can you tell us what this type of work actually looks like?
Speaker 4: It looks like a lot of things, but one of the primary things I like to point out for people is that a lot of people think this is, like, a newer thing, and especially in medicine and health, misinformation and disinformation and conspiracy theories have always existed. I mean, I could go all the way back to the first, like, organized invention of vaccination with cowpox, right, where we were injecting people with a live virus to protect against smallpox. And there were a number of newspapers that put out, you know, sort of tongue-in-cheek cartoons of people turning into cows, right, and that influenced people into avoiding vaccination because they were afraid they would be turned into a cow. So the first thing is to know the history of it, and the second thing is to also know the difference between misinformation and disinformation. A lot of people use those interchangeably, and they're not the same. Misinformation is just somebody really not understanding the science of medicine. Any one of us has been guilty of creating it.
You know, if you've ever been to the doctor and they told you, you know, some big fancy word that you did not recognize, like gastroparesis, and you came home and you were like, I have gastro parents, and I don't know what that means, you've inadvertently created misinformation. Doctors, public health experts, all of those folks have always been dealing with having to sort of be corrective. And it's not malicious, it's just that people who aren't trained in science won't get everything. But disinformation is an intentional creation of information that is wrong, and it usually has monetary value. There's usually something that someone can get out of creating disinformation, and there's a way that they can monetize it, either through the platform or by driving people to purchase something; there's all different ways that they do it. And then they also target particular populations, so it has a science and a design to it, and a distribution tactic and technique. So those are the things that I think are the most pressing in my mind, at the very least, if we're going to start to unravel it.
Speaker 3: Yeah, and we are big over here on making sure we understand the conversation when it comes to misinformation and disinformation.
Speaker 1: So thank you for clarifying that.
Speaker 3: Also, you do talk about the fact that misinformation can happen even with people with good intentions.
Speaker 1: I know.
Speaker 3: I think I've probably been guilty of this as well, because you hear something that's very hyperbolic and you take it literally, because I'm a very literal person. So you can't tell me something and then not clarify immediately, because I'll be like, what? So when we do that, what is something that you think... how should we handle our own giving of misinformation?
Speaker 4: I always want to encourage people, especially people with a media platform of some sort, to really check the information and the facts that they're sharing, and also to look at what facts do I need to share or not need to share, or what's the context I'm providing. The best example I can give is, I know plenty of people on TikTok and other places that really want to keep people informed about bird flu. The issue is that they're not contextualizing bird flu by talking about the number of deaths, or the number of people infected so far, or the point of transmission. These are very important components to be leaving out. And then also, if you're really worried about something like communicable disease, to be making video after video after video about bird flu, but not making a video being really informative around the actual seasonal flu, which we have an incredible amount of hospitalizations, a record number of hospitalizations, this year when it comes to seasonal flu, I just sort of think it's really misleading. I understand why people want to be informative about bird flu, because there's a lot of unknowns. But if you're not an expert in public health, and don't understand who's tracking it, and don't understand what the threat level is from the perspective of someone working in public health, then you can really kind of be distracting more than you can be informative.
Speaker 1: Oh, oh yeah.
Speaker 3: With that, with the fact that the US is pulling out of, you know, the WHO, and the recent firings of so many people from public health and the CDC, a lot of people are really concerned about not getting correct or accurate information. What is some advice you would give us laypeople, who have to watch your videos five times to understand what's going on, in trying to find the most accurate and up-to-date health information?
Speaker 4: I always like to remind people that they do live in a multi-layered governance space. First of all, the way that it works when it comes to tracking disease for the CDC is that the CDC works on, like, a reporting-up model. And so what often happens, or the process, is that local, so like municipal and state, entities usually report up to the CDC, and the CDC gathers those numbers and then looks at trends and then does outward reporting nationally, right? So the good news is that these things are being reported and tracked depending on where you live. Now, the caveat I will say here is that not all municipalities are treated equally. But depending on where you live, your local health department is just as in the know as the CDC, if not more so, because they're doing the reporting up for the CDC, to, like, an EIS officer or other folks. So I would encourage people to actually be more exploratory with their local, again, like, city health department, or their county health department, or their state health department. For folks who feel like they can't trust those entities either, and I understand why, I would also look to nonpartisan organizations. I've pointed out places like the APHA and CSTE, which is the Council of State and Territorial Epidemiologists. Those are some organizations I think of off the top. There's an organization called NASTAD that focuses mainly on HIV and STIs. So these organizations are nonpartisan, but also they are private entities. They do government contracting, so they have a lot of good relationships with state, municipal, and federal public health experts. But you can also sign up for the WHO newsletter; you as an individual can interface with the WHO yourself. A lot of people don't realize that they can do that.
Speaker 1: Yeah, again, TikTok... well, I'm so, I'm so on TikTok. I gotta stop.
Speaker 4: But newsletters, always. I always want to encourage people to sign up for newsletters when it comes to public health and informing yourself.
Speaker 1: That's such a great idea.
Speaker 3: I forget about newsletters. Like, this is in the age of social media, and even though I grew up in the age of newsletters and actual newspapers, I still kind of forget.
Speaker 1: I'm like, oh yeah, that does exist.
Speaker 4: Look at this.
Speaker 3: I think there's a lot in this conversation. Again, I want to talk about some of the series that you've done, because you do a great job in talking about health and policy, you're doing a great job in talking about Black history. You just talked about disability and the Black Panthers, and, like, what, so much, so much great information. So listeners, if you haven't, you need to go watch. But I do want to kind of get into some of the negativity, and we're going to come to the positive too, but just because, you know, we do the bad news version and the good news. So historically and today, society and governments for the most part have ignored or erased women and marginalized people when it comes to public health, illness, sickness, and safety in general. What are some things you've seen during your career, or even in your studies, that are concerning in this area?
Speaker 4: The thing that concerns me the most is actually stigma. Stigma for certain populations and stigma for certain public health conditions. So the two that I would give as examples... the first stigma I would give is around, like, sex workers. I think there's a huge amount, I mean, no, I don't think, I know that there's a huge stigma around that being the way that people engage in their, you know, commerce exchange. I do like to remind everyone that all of us who do labor are exchanging our labor and our bodies for money.
We're just not exchanging a particular kind of labor for money. But the stigma that really ensnares and encircles sex workers is really insidious, because a lot of us buy into it. A lot of us are very paternalistic, and so the way that we talk about public health and sex workers still sort of demands controlling people's bodies, or imperiling the way that people make money to pay their bills, which they may be comfortable with and are consenting to and like doing. It's very easy to have a conversation with people around sex work and people not wanting to do it. It's very uncomfortable to have a conversation with people around sex work and how do we protect folks who do want to be doing that work, how do we interface with them in a way that's just morally benign. We want to really moralize these things when we talk about stigma. And then the other one is things like STIs, particularly STIs that are very easy to transmit and to end up with. So things like chlamydia, for example: it's a very treatable STI, it's very easy to transmit if you've had sexual contact with someone, but we've still really highly stigmatized someone who has a history of having caught or transmitted chlamydia. We don't have the same stigmatization for things like the common cold; we don't stigmatize someone who's given us COVID, even though they've, you know, done some real serious harm. But it's because we know the transmission point that we've really stigmatized STIs, and that leads into other things too, like more chronic conditions like herpes and HIV. When we stigmatize things, it means it's hard to treat or care for them. I would love to see us get to a point in public health where we ask people about the work that they do, and someone can reveal that they do sex work, and it has no moral bearing on how they're treated.
And I would really love to see a day and age where somebody can go into a clinical space and talk about their past history or their current experience with a chronic STI, or a chronic disease that they acquired through sexual transmission, and it also be morally benign, and we treat it the same way we treat someone living with chronic cancer or someone with, you know... they need a certain kind of care, but it doesn't have bearing on how they're going to be treated. Those are the things that are super concerning. And in the current politicized moment, the thing that really concerns me overall is how we politicize public health, and especially how we politicize marginalized folks or vulnerable populations on top of that, right? And the neglect of things like people with disability, and the weaponization of things like anti-trans rhetoric to try and tear down Section 504, for example, which is for disabilities; it's a protection for folks with disabilities. These things don't live in bubbles and isolation, and where a lot of these things all dump out is public health.
Speaker 2: Yeah. Well, coming to the positive, what are some of the more encouraging things that you have seen?
Speaker 4: In public health? Yeah, oh man, there's, like, so many. I love that you asked this, because I love talking about how amazing public health is. First of all, a lot of people have become more interested in public health as a profession. I love that. I've been seeing a lot more mixed methodology in public health research, and this is super important to talk about causation, especially around things like racial disparities in public health.
Historically, people have done research in public health where they primarily researched disparities but sort of stopped there, and so we'd often hear, like, oh, you know, maternal health, like maternal mortality statistics, are really bad for Black women, and they're, you know, different for this group and this group. But more and more researchers have started to realize that if they're going to talk about health disparities, especially racially, then they need to also point out that the cause isn't race itself, like biologically or inherently; it's racism in a system. And so that requires not just quantitative methodology but also qualitative methodology, interviewing, focus groups, surveying, on top of the quantitative work that you want to be doing. So I'm really excited to see what I think is going to be much more incisive and insightful research from folks. And then the other thing, too, is, like, biomedical breakthroughs, like the fact that you have injectables. I mean, recently it was just, like, National HIV Day, and we have injectables now for HIV care. That means someone can come in once every three months for an injectable and manage to maintain an undetectable viral load. I mean, that is such a feat in a generation, from back in the nineties where people were having to take sort of fistfuls of medication just to stay alive, and the side effects were very, very aggressive, painful, and debilitating. So there's amazing work happening in care and treatment that we're able to promote in public health.
Speaker 1: I love that. I feel like we need balance.
Speaker 3: We need to know the truth and the dark, but we need to know that things are happening whether or not we see them.
You know, you did bring up the whole thing with the disability acts and what's going down in public health, and I've seen more and more people coming out, like, this is a whole conversation, an ableist conversation, that's using trans people as a weapon against disabled people. And how it comes down to, as you and I were having a conversation about before, eugenics and what this looks like in general. What are some things that we need to be watching for as they're having this big conversation about this?
Speaker 4: I think the weaponization of different vulnerable populations against other groups is a big one, and not just in how we stigmatize certain groups. Like, Section 504 is such a good example, because it's being weaponized, and in an attempt to be cruel to trans people, we may inadvertently dismantle something that has been an incredible necessity for rights to access to a number of things, supports, resources, et cetera, for folks with disabilities. And so our transphobia has a huge cost, and that cost is going to be the right for someone with a disability to have assistive technology provided to them in school, or the right to access certain housing, all of these ways that Section 504 has been used since nineteen seventy seven. And so I think that's the thing I really want people to watch out for: that you're not understanding a population. Like, even if you don't agree with sex work, and I'm going to use that because it's the one that makes people the most activated, even if you don't agree morally with sex work, you have to understand that the way that we address the population that does sex work has a rippling effect on our rights, and can dismantle our rights, in an attempt to, what, punish someone because the way that they access some sort of financial gain, or the money they need to pay their bills, is far different than what we're willing to do.
I don't know. Like, I guess the question I would ask anyone is: what are you, on your physical self or in your life, willing to give up to punish someone else? And most people's answer is nothing, right? Would you give up, like, three of your fingers to be able to punish a trans person? No. And when you make it clear like that, I think then people sort of get it. But when it's sort of wrapped in the strange enigma of policy and legislation and lawsuits and case law, a lot of people are more than comfortable to sacrifice a good portion of themselves.
Speaker 3: Right, right. All in the name of whatever their morality is, which is a whole different conversation. I find it interesting, and this is just a sidebar, because I've been noticing on Bluesky people going back and forth in this conversation, being like, no, this is not an accident, not just trying to punish trans people. This is purposeful in wrapping everybody together in order to have the supremacist idea, and people are not realizing it and are giving too much credit to the opposition and saying that...
Speaker 1: They made a mistake.
Speaker 3: No, they did this on purpose, and we need to be very vigilant about the fact that this has always been a part of the plan. And when it comes to white supremacy, it is about eugenics. They really think that there is this perfect setup, which is hilarious in itself, because you're like, your narcissism is showing, and that's the diagnosis.
Speaker 4: Just so you know, we're not experiencing a perfect world with you in charge.
Speaker 3: Right, right, right. We are falling apart, and I don't know what to do, and I cry a lot.
Speaker 1: I'm just kidding. Uh, I'm not kidding.
Speaker 3: But so here in the state of Georgia, I'm sure you've been aware.
We've watched the government not only ignore the health officials and the people about reproductive care and rights, but literally dismantle almost any accountability and data by just disbanding things like the maternal mortality boards and councils, because they were telling too much truth and they did not like it. So for us here in the trenches, such as in the state of Georgia, can you talk to us about how to get the most accurate information about tactics like this, like how we need to be aware that this is one of the bigger tactics that they have used against those of us who are trying to get reproductive access?
Speaker 4: Well, I think the first thing is, first of all, you have some really, I mean, consummate experts. The first thing I want to tell people is that your elected officials are not experts in these things; they are not career professionals, right? I like to tell people all the time, you know, that I've worked in government, but I've never been a politician. And that's on purpose, because I want to be a civil servant; I want to care for the folks that I live in and around. So the first thing I would say to folks in Georgia is that you have consummate experts in Fulton County. I mean, I've worked with their population there, and they do all that they can to do the best work within the confines, the messy confines, of the ever-chaotic policy that people who want to politicize these things create. So I think the first thing is encouraging people... we sort of throw the baby out with the bathwater, right? We're like, okay, the state of Georgia and its elected officials have made all of these decisions and horrible choices and appointed these terrible leaders or whoever, but then we extend that hatred to anyone who works in these capacities. And a substantial portion of the people that work for you in government work for you.
They really do want to do the best that they can, even in these conditions. And so I think being able to be more precise with our hatreds is the first suggestion I have. Because I'm an A1 hater when it comes to a lot of politicians, you know; if they have zero haters, that's not true, because even in death, I plan to haunt them. But so, yeah, I say that because I see that a lot on TikTok, as people become very, like, generalizing. They're like, oh, all of pharma is bad, all of the federal government is bad. And it's like, no, there are a number of bioengineers who really want to cure cancer working in pharma. They're not setting the pricing, they're not jacking up, you know, the cost of, like, obscure, random medications that they've, you know, put a patent on. They're not doing that. Get angry with the C-suite, you know. And we need people who want to make medications to cure cancer. We need them. So trust... how to be precise with our hate, with our hateration in the dancery, is probably my number one.
Speaker 3: Wait, I need to note that: how to be precise with hateration in the dancery.
Speaker 1: Yeah, I need to write that down somewhere.
Speaker 3: You know.
Speaker 4: I don't want to tell people to stop being haters. I want you to, I want you to still do this, and focus your hatred right, and then to also know who's working on behalf of you, who really is, and how they're trying to work within that. The other thing I would say is that there are plenty, again, of, like, grassroots and larger-scale organizations that are collecting this information. So, like, I know in Philadelphia we have, like, the Pew Trusts, and there's, like, the Kaiser Family Foundation. These are organizations where the whole work that they do is still to produce these well-researched reports on the conditions of these things.
And so, look, I don't know specifically for Georgia, but you may have, and I would assume you do, like, a reproductive justice organization that is collecting that data in some way, shape, or form to draft reporting that you all can trust and consume, and that has peer review to it, et cetera. And it also may be that you have to become a little renegade and know where to ask for it. I will say this: maybe they dismantled your maternal mortality, what was it, your maternal mortality board, but that doesn't mean that your local health departments aren't doing annual reporting. It just means that you have to read, like, your mortality... what do we call them here in Philly, or in Pennsylvania? We have, like, an annual report on deaths, and cause of death is still going to be in there. So it may take more. Again, I hate that these suggestions always kind of suck, because I know that our attention spans are shorter, and so if something's not being handed to us directly, having to go on a little bit of a hunt is harder for people. But it's going to be so important that we start to stretch our attention spans and become a little more hunter-gatherer, unfortunately. But it's there. That's the thing that's important. It's there.
Speaker 3: Yeah, and I think that's a great reminder. I do forget. Sometimes, when I look at the big picture of all of the tragic things that are happening and all of the horrifying things, I forget that the local people are doing a lot. We definitely have a friend of the show, Park Cannon, who has been working so hard in the state of Georgia trying to get some of this information out. And I know there are local organizations like ARC Southeast here in Georgia, but it's sometimes like, ah, where's the focus. So thank you for reminding me that local is so important.
We have talked about that quite a bit, 586 00:33:40,760 --> 00:33:42,600 Speaker 3: but sometimes it's hard to focus. 587 00:33:42,320 --> 00:33:44,760 Speaker 4: And fund them too. That's the other thing, is these 588 00:33:44,880 --> 00:33:47,600 Speaker 4: organizations, because they're not getting federal grants, or because they've 589 00:33:47,640 --> 00:33:50,320 Speaker 4: given up, you know, state grants or whatever. The fact that they're 590 00:33:50,360 --> 00:33:53,160 Speaker 4: continuing to do this sort of reporting means that they 591 00:33:53,240 --> 00:33:56,520 Speaker 4: now are doing it with less. And so I 592 00:33:56,520 --> 00:33:59,960 Speaker 4: also want to encourage people to fund these local organizations 593 00:34:00,120 --> 00:34:01,880 Speaker 4: that are doing this work for you on the ground, 594 00:34:02,640 --> 00:34:04,360 Speaker 4: like, where you are and where you live. 595 00:34:05,240 --> 00:34:09,040 Speaker 3: No, that's definitely a definite, because we need these organizations. 596 00:34:09,840 --> 00:34:13,120 Speaker 3: And with kind of all of this, again, recently watching 597 00:34:13,640 --> 00:34:16,160 Speaker 3: a few of your episodes, you talked about social autopsy, 598 00:34:16,200 --> 00:34:19,640 Speaker 3: and I love that phrasing, and I was like, oh, oh, okay, 599 00:34:19,920 --> 00:34:22,600 Speaker 3: can you, because, like, the example you gave was people 600 00:34:23,120 --> 00:34:26,960 Speaker 3: manipulating content in order to cancel someone. So can you 601 00:34:27,040 --> 00:34:30,280 Speaker 3: kind of talk about what you mean about social autopsy? 602 00:34:30,280 --> 00:34:32,919 Speaker 1: And I know it's an actual term, but new 603 00:34:32,960 --> 00:34:33,160 Speaker 1: to me. 604 00:34:34,239 --> 00:34:37,600 Speaker 4: So social autopsy is one of my favorite qualitative methodologies. 605 00:34:37,640 --> 00:34:39,839 Speaker 4: You don't hear it so much. It was coined by, 606 00:34:40,320 --> 00:34:44,480 Speaker 4: I forget the name of the researcher, a sociologist. But 607 00:34:44,640 --> 00:34:47,520 Speaker 4: you see it in my favorite example, the book Heat Wave, 608 00:34:47,880 --> 00:34:49,880 Speaker 4: which is a social autopsy, and that's, I think, the 609 00:34:49,960 --> 00:34:54,080 Speaker 4: first time you see that word used on a larger scale. 610 00:34:54,200 --> 00:34:57,040 Speaker 4: But it was about the Chicago heat wave that had 611 00:34:57,080 --> 00:35:00,960 Speaker 4: really detrimental effects. It's a public health book. And 612 00:35:01,000 --> 00:35:03,320 Speaker 4: so social autopsy is when you see something has happened 613 00:35:03,960 --> 00:35:06,279 Speaker 4: and you decide that you are going to sort of 614 00:35:07,000 --> 00:35:10,680 Speaker 4: unpack what happened and go back in time looking at 615 00:35:10,760 --> 00:35:13,160 Speaker 4: the pieces that came together for this thing to happen. 616 00:35:13,719 --> 00:35:15,440 Speaker 4: I'm sure there's language for it. I think in the 617 00:35:15,480 --> 00:35:18,520 Speaker 4: political realm there's a similar term. I think they use 618 00:35:18,560 --> 00:35:21,719 Speaker 4: autopsy too, to look at, like, 619 00:35:21,920 --> 00:35:25,400 Speaker 4: when you've lost, like, an election, for example, you 620 00:35:25,440 --> 00:35:27,720 Speaker 4: look back and say, like, where did we go wrong? 621 00:35:27,840 --> 00:35:31,080 Speaker 4: What components were in place, right?
Because there's so many 622 00:35:31,080 --> 00:35:34,880 Speaker 4: moving parts that contributed to the perfect storm of this moment. 623 00:35:36,120 --> 00:35:38,600 Speaker 4: And so I encourage people to do a social autopsy 624 00:35:38,640 --> 00:35:44,520 Speaker 4: on things before being, like, influenced. And 625 00:35:44,560 --> 00:35:46,680 Speaker 4: the example, I know the video you're talking about 626 00:35:46,719 --> 00:35:49,600 Speaker 4: because it was a viral one, and to also point 627 00:35:49,600 --> 00:35:52,280 Speaker 4: out that, like, people are not happy with me. People 628 00:35:52,320 --> 00:35:54,799 Speaker 4: really want to personalize when I do things like that, 629 00:35:54,840 --> 00:35:59,920 Speaker 4: when I contextualize the timeline. And again, like, I'm not 630 00:36:00,080 --> 00:36:01,719 Speaker 4: pointing out, and I think I say it several times, 631 00:36:01,760 --> 00:36:03,560 Speaker 4: I'm not pointing out whether I think this person's 632 00:36:03,600 --> 00:36:05,000 Speaker 4: a good or bad person. I don't live in the 633 00:36:05,040 --> 00:36:07,640 Speaker 4: dichotomous world of good or bad. But I 634 00:36:07,680 --> 00:36:09,960 Speaker 4: want people to be careful when they make content, and 635 00:36:10,000 --> 00:36:12,440 Speaker 4: I want people to be careful when they consume it. 636 00:36:12,800 --> 00:36:15,439 Speaker 4: And so, backtracking: when you come across a video 637 00:36:15,480 --> 00:36:19,360 Speaker 4: where someone's, like, clipped someone, this is very, very infamous, 638 00:36:19,400 --> 00:36:21,560 Speaker 4: I see this all the time when it comes to creators, 639 00:36:22,400 --> 00:36:27,480 Speaker 4: where people clip a certain piece of a video, and 640 00:36:27,640 --> 00:36:33,239 Speaker 4: they clip it and they don't contextualize it, and they 641 00:36:33,280 --> 00:36:37,520 Speaker 4: don't tag the creator, or they leave out the watermark 642 00:36:37,600 --> 00:36:39,640 Speaker 4: so that you can go and look at it for yourself, 643 00:36:39,880 --> 00:36:42,000 Speaker 4: and they know that most people won't. I've had this 644 00:36:42,040 --> 00:36:44,480 Speaker 4: happen to me as well, which, you know, like, 645 00:36:44,760 --> 00:36:48,560 Speaker 4: it's been absurd. But I want to encourage people to 646 00:36:48,560 --> 00:36:53,120 Speaker 4: not just take people at the clip. Go and consume 647 00:36:53,239 --> 00:36:58,920 Speaker 4: the entire video, because in this podcast, for example, in 648 00:36:59,000 --> 00:37:02,160 Speaker 4: this exact moment, I could say, you know, well, everybody sucks, 649 00:37:02,640 --> 00:37:06,439 Speaker 4: and if you only clipped "everybody sucks," then you would 650 00:37:06,480 --> 00:37:08,320 Speaker 4: be like, Evan's anti-humanist. 651 00:37:08,880 --> 00:37:09,080 Speaker 3: You know. 652 00:37:11,960 --> 00:37:13,279 Speaker 1: I agree, though. Good advice. 653 00:37:13,280 --> 00:37:17,680 Speaker 4: If you put it, if you put it in context and realize 654 00:37:17,760 --> 00:37:20,480 Speaker 4: it was an example statement that I made, then you 655 00:37:20,560 --> 00:37:24,799 Speaker 4: realize that it's not that I'm anti-humanist. Again, I 656 00:37:24,840 --> 00:37:26,799 Speaker 4: want to encourage people.
I know it's so easy to do, 657 00:37:26,960 --> 00:37:30,280 Speaker 4: especially when we're on social media or when we're consuming 658 00:37:30,320 --> 00:37:33,239 Speaker 4: media, which is just endless scrolling, and we're sort of 659 00:37:33,280 --> 00:37:35,680 Speaker 4: doing it to be a little mindless. I want to 660 00:37:35,760 --> 00:37:38,120 Speaker 4: encourage people to stop and say, like, okay, well, if 661 00:37:38,200 --> 00:37:41,600 Speaker 4: this is catching my eye, I should take that extra 662 00:37:41,680 --> 00:37:44,839 Speaker 4: step and, like, search up the original video or click 663 00:37:44,880 --> 00:37:46,799 Speaker 4: on the original video and watch it end to end 664 00:37:47,440 --> 00:37:50,480 Speaker 4: and come to my own conclusions. Because there's a reason 665 00:37:50,560 --> 00:37:53,799 Speaker 4: why someone has clipped somebody the way they have, and 666 00:37:53,880 --> 00:37:55,680 Speaker 4: I think that it can be really cruel. It can 667 00:37:55,719 --> 00:37:58,600 Speaker 4: be accidental. I don't know. I'm not going to impose that. 668 00:37:58,800 --> 00:38:04,160 Speaker 3: But yeah, yeah, there's a lot, and to 669 00:38:04,200 --> 00:38:07,400 Speaker 3: take the time to actually do an autopsy, as is implied, 670 00:38:07,600 --> 00:38:09,080 Speaker 3: is a whole other matter. 671 00:38:09,160 --> 00:38:10,839 Speaker 4: Well, it's all there. I think that's the other thing too, 672 00:38:10,880 --> 00:38:13,000 Speaker 4: is using all the tools that are there. Like, is 673 00:38:13,040 --> 00:38:15,600 Speaker 4: this something someone said in response to this? I see 674 00:38:15,600 --> 00:38:17,399 Speaker 4: this often too when people are, 675 00:38:17,440 --> 00:38:20,479 Speaker 4: like, a number of creators recently have been complaining about 676 00:38:20,480 --> 00:38:23,040 Speaker 4: this, where people have been chastising them for things they 677 00:38:23,080 --> 00:38:26,440 Speaker 4: did not do, and it's because people didn't look at 678 00:38:26,440 --> 00:38:29,360 Speaker 4: things like timestamps. One of the things that's really important 679 00:38:29,360 --> 00:38:32,520 Speaker 4: to think about with, like, Instagram and TikTok, and it's 680 00:38:32,760 --> 00:38:37,320 Speaker 4: very, very popular now, is that your FYP, your newsfeed, 681 00:38:37,400 --> 00:38:41,520 Speaker 4: whatever it's called, is asynchronous, which means that if you're 682 00:38:41,560 --> 00:38:45,040 Speaker 4: scrolling and you've come across a video where a creator 683 00:38:45,120 --> 00:38:48,239 Speaker 4: is crying and saying, oh, you know, 684 00:38:48,920 --> 00:38:51,400 Speaker 4: someone's been talking mean about me, and, you know, 685 00:38:51,440 --> 00:38:57,040 Speaker 4: they've doxed me and have been critiquing how 686 00:38:57,080 --> 00:39:00,360 Speaker 4: I painted my living room walls. And then you scroll 687 00:39:00,400 --> 00:39:03,120 Speaker 4: for a little while and you come across another creator 688 00:39:03,480 --> 00:39:06,680 Speaker 4: who is talking about, like, oh, you know, I had 689 00:39:06,680 --> 00:39:08,400 Speaker 4: to hold somebody accountable the other day and it 690 00:39:08,440 --> 00:39:12,920 Speaker 4: was really difficult, but you know it was important to, 691 00:39:13,200 --> 00:39:16,919 Speaker 4: like, connect, and they're talking in it.
Those 692 00:39:17,000 --> 00:39:21,560 Speaker 4: videos aren't necessarily connected to each other, but we can 693 00:39:21,600 --> 00:39:24,040 Speaker 4: make a lot of assumptions based on thinking that those 694 00:39:24,040 --> 00:39:28,160 Speaker 4: people have a relationship, based on thinking that one 695 00:39:28,160 --> 00:39:30,719 Speaker 4: person's talking about the other because one video followed the other, 696 00:39:30,800 --> 00:39:33,200 Speaker 4: and that's not true. The other thing that's really tricky 697 00:39:33,239 --> 00:39:37,560 Speaker 4: on TikTok is that little, like, suggested search bar. And 698 00:39:37,640 --> 00:39:41,319 Speaker 4: it's so strange because it'll suggest things that are really 699 00:39:41,440 --> 00:39:43,880 Speaker 4: unrelated to the content, but you then think it is. 700 00:39:45,120 --> 00:39:46,880 Speaker 4: So it'll be like, you know, a man killed by 701 00:39:46,920 --> 00:39:49,640 Speaker 4: an elephant, and you're like, wait a minute, is this video 702 00:39:49,680 --> 00:39:51,319 Speaker 4: about a man that was killed by an elephant? And 703 00:39:51,360 --> 00:39:55,520 Speaker 4: you're like... So, to not be spoon-fed things, but 704 00:39:55,560 --> 00:40:00,480 Speaker 4: to actually be proactive: look at timestamps on videos, look 705 00:40:00,520 --> 00:40:03,359 Speaker 4: at the way someone's dressed. You can usually tell if 706 00:40:03,360 --> 00:40:05,799 Speaker 4: a creator made a bundle of videos and then, you know, 707 00:40:06,160 --> 00:40:08,880 Speaker 4: released them over the course of several hours or days. 708 00:40:09,600 --> 00:40:11,799 Speaker 4: Look at all of those things before you hop to 709 00:40:11,840 --> 00:40:13,960 Speaker 4: a judgment or relate things to each other. 710 00:40:17,160 --> 00:40:20,360 Speaker 3: Good advice. And then, of course, you've also done a 711 00:40:20,360 --> 00:40:24,239 Speaker 3: series about trusted resources. Can you give us a quick 712 00:40:24,320 --> 00:40:27,720 Speaker 3: rundown on how to vet a good trusted resource? 713 00:40:30,680 --> 00:40:36,640 Speaker 4: Online, it's now getting more and more difficult. I 714 00:40:36,680 --> 00:40:39,720 Speaker 4: always like to encourage people to first start with looking 715 00:40:39,800 --> 00:40:44,719 Speaker 4: for open access journal articles or research and reports. And 716 00:40:44,760 --> 00:40:47,000 Speaker 4: the reason why is not because I think that those 717 00:40:47,280 --> 00:40:51,080 Speaker 4: entities are, like, smarter than everybody else or whatever. It's because 718 00:40:51,120 --> 00:40:55,120 Speaker 4: the process is arduous to get something published in, like, 719 00:40:56,000 --> 00:41:00,799 Speaker 4: JAMA or, like, you know, the disability core literature, or 720 00:41:00,840 --> 00:41:06,040 Speaker 4: any of these journals, these, like, med or STEM- 721 00:41:06,080 --> 00:41:10,800 Speaker 4: style journals, or even on, like, a website like NASEM, 722 00:41:10,920 --> 00:41:14,760 Speaker 4: which is, like, the National something Sciences, Engineering, and Medicine. 723 00:41:15,080 --> 00:41:19,279 Speaker 4: These entities have a high standard for peer review. Does 724 00:41:19,280 --> 00:41:22,200 Speaker 4: this mean everything's perfect? You know, some stuff gets by.
725 00:41:22,360 --> 00:41:26,880 Speaker 4: And there's all sorts of reasons why; nepotism can 726 00:41:26,920 --> 00:41:31,000 Speaker 4: be one, cronyism can be one. But a high standard 727 00:41:31,000 --> 00:41:34,319 Speaker 4: of peer review. The next thing I will say is 728 00:41:34,320 --> 00:41:37,720 Speaker 4: that if you cannot find an open access article from 729 00:41:37,760 --> 00:41:42,600 Speaker 4: a reputable journal, to also look at the citations that people 730 00:41:42,640 --> 00:41:45,000 Speaker 4: have used. So I even say this with books, because 731 00:41:45,040 --> 00:41:48,719 Speaker 4: people don't realize that. I see people do 732 00:41:48,760 --> 00:41:49,960 Speaker 4: this all the time and they're like, oh, I learned 733 00:41:50,000 --> 00:41:51,680 Speaker 4: it in a book, and I'm like, yeah, well, books 734 00:41:51,760 --> 00:41:57,560 Speaker 4: are edited, but they're not peer reviewed. And 735 00:41:57,600 --> 00:41:59,400 Speaker 4: that's really hard to say, because there are some books 736 00:41:59,400 --> 00:42:03,320 Speaker 4: I love, but they are biased. They're designed with bias. 737 00:42:03,920 --> 00:42:06,640 Speaker 4: And so when you're reading a book that was a 738 00:42:06,680 --> 00:42:09,719 Speaker 4: research project, because oftentimes people will turn their research into 739 00:42:09,760 --> 00:42:13,160 Speaker 4: a book, I see that with, I think it's, Kathryn 740 00:42:13,320 --> 00:42:16,920 Speaker 4: Edin's $2.00 a Day, which is a good example. 741 00:42:17,080 --> 00:42:20,040 Speaker 4: She turns her ethnography, which is a research project, 742 00:42:20,560 --> 00:42:24,759 Speaker 4: into a book. But she also has citations in 743 00:42:24,800 --> 00:42:28,759 Speaker 4: her bibliography, right? So explore the citations, because that's going 744 00:42:28,840 --> 00:42:31,400 Speaker 4: to tell you a whole lot about how reputable something 745 00:42:31,480 --> 00:42:34,080 Speaker 4: is: did they cite their work correctly? And who 746 00:42:34,080 --> 00:42:36,160 Speaker 4: did they cite? If they start to cite a bunch 747 00:42:36,200 --> 00:42:39,920 Speaker 4: of gobbledegook, you know, then yeah, that starts to 748 00:42:39,920 --> 00:42:42,000 Speaker 4: give you a hint. I see this a lot on, 749 00:42:42,200 --> 00:42:48,640 Speaker 4: like, pseudo-intellectual or pseudoscientific materials and disinformation websites. They'll 750 00:42:48,880 --> 00:42:53,360 Speaker 4: have citations, and their citations are actually cyclical or circular, 751 00:42:54,000 --> 00:42:57,920 Speaker 4: so they'll cite themselves a lot, they'll cite other people 752 00:42:57,960 --> 00:43:01,200 Speaker 4: that have cited them, so they often do a lot of 753 00:43:01,239 --> 00:43:05,960 Speaker 4: back-and-forth citation that mimics scientific research, but it's 754 00:43:06,040 --> 00:43:08,640 Speaker 4: not real. And then the other thing I will say 755 00:43:08,680 --> 00:43:11,080 Speaker 4: is that, you know, if you can, look for things 756 00:43:11,120 --> 00:43:16,160 Speaker 4: that are posted to reputable institutions, so universities are always 757 00:43:16,160 --> 00:43:20,680 Speaker 4: a really good one. Universities love to post sourcing.
You 758 00:43:20,719 --> 00:43:23,200 Speaker 4: can always check with your local library, librarians. I 759 00:43:23,200 --> 00:43:26,080 Speaker 4: don't know why we've lost the love of librarians, but 760 00:43:26,160 --> 00:43:31,160 Speaker 4: librarians are actually really great sources to ask, because they're 761 00:43:31,200 --> 00:43:35,840 Speaker 4: taught library science, so they're taught to identify effective and 762 00:43:35,880 --> 00:43:39,520 Speaker 4: ineffective sourcing. And a lot of people don't realize this: 763 00:43:39,719 --> 00:43:41,799 Speaker 4: if there is a resource that's 764 00:43:41,840 --> 00:43:44,680 Speaker 4: behind a paywall, you can request it from your local 765 00:43:44,719 --> 00:43:46,560 Speaker 4: library and they can access it for free for you. 766 00:43:47,320 --> 00:43:50,279 Speaker 4: So if there's, like, a research paper that you 767 00:43:50,400 --> 00:43:53,279 Speaker 4: want to get a hold of, your local librarian can 768 00:43:53,480 --> 00:43:55,960 Speaker 4: get it for you and you can request it. There's 769 00:43:55,960 --> 00:43:58,279 Speaker 4: no cost to you, and I think there might be 770 00:43:58,320 --> 00:44:00,680 Speaker 4: a printing cost if you want it printed. I don't 771 00:44:00,719 --> 00:44:04,360 Speaker 4: know with each library, but yeah, library science. 772 00:44:04,400 --> 00:44:08,120 Speaker 4: We have to lean into our librarians. Please, 773 00:44:08,520 --> 00:44:11,960 Speaker 4: in twenty twenty five, let's rediscover the magic of librarians. 774 00:44:12,520 --> 00:44:14,680 Speaker 3: It's coming back around. I think it's trending, because we 775 00:44:14,719 --> 00:44:16,080 Speaker 3: love some libraries right here. 776 00:44:16,400 --> 00:44:19,239 Speaker 4: We love them, and library scientists, you. 777 00:44:19,200 --> 00:44:23,239 Speaker 1: know, they have a whole master's for that. What they have 778 00:44:23,280 --> 00:44:24,640 Speaker 1: to go through is intense. 779 00:44:24,840 --> 00:44:28,960 Speaker 4: If there's anybody who knows citation, it is them, like, ask. 780 00:44:28,800 --> 00:44:32,280 Speaker 1: Them, go-to research advice. That's why you're here. 781 00:44:34,800 --> 00:44:36,480 Speaker 2: Oh, that's awesome. 782 00:44:36,800 --> 00:44:37,840 Speaker 4: We do love libraries. 783 00:44:38,040 --> 00:44:38,480 Speaker 1: We're here. 784 00:44:38,520 --> 00:44:52,480 Speaker 2: We love libraries. So you already gave us some examples. 785 00:44:52,640 --> 00:44:55,400 Speaker 2: But going off that question, who are some people that 786 00:44:55,440 --> 00:44:59,120 Speaker 2: you would recommend for us and our listeners to follow 787 00:44:59,280 --> 00:44:59,960 Speaker 2: or seek out? 788 00:45:00,400 --> 00:45:04,000 Speaker 4: Well, it is Black History Month and I am a bioethicist, 789 00:45:04,000 --> 00:45:06,279 Speaker 4: so I would be remiss if I didn't shout out 790 00:45:06,280 --> 00:45:11,080 Speaker 4: some amazing, great bioethicists and people who do work that 791 00:45:11,160 --> 00:45:14,239 Speaker 4: I like to say is bioethical in theory. Right, some 792 00:45:14,280 --> 00:45:19,040 Speaker 4: of them are sociologists, but that's okay. We all hang out. 793 00:45:19,080 --> 00:45:21,880 Speaker 4: We're all girls, you know. One of my favorites is 794 00:45:21,880 --> 00:45:25,640 Speaker 4: doctor Keisha Ray. Doctor Keisha Ray is a bioethicist.
She is the 795 00:45:25,719 --> 00:45:30,000 Speaker 4: person who coined the concept of Black bioethics. She wrote, I 796 00:45:30,040 --> 00:45:33,360 Speaker 4: think her book is called Black Bioethics, The Black Bioethics 797 00:45:33,400 --> 00:45:36,520 Speaker 4: Reader, or something like that. The amazing thing about doctor 798 00:45:36,600 --> 00:45:40,440 Speaker 4: Keisha Ray's work, anything that she's published, is that it's actually published at 799 00:45:40,480 --> 00:45:45,600 Speaker 4: a very accessible reading level. So the book that 800 00:45:45,640 --> 00:45:48,239 Speaker 4: she has about Black bioethics is written at, like, a 801 00:45:48,280 --> 00:45:51,120 Speaker 4: sixth grade reading level. It's very consumable, and you can 802 00:45:51,200 --> 00:45:54,719 Speaker 4: learn very easily, and it's very interesting. She's also the 803 00:45:54,840 --> 00:45:58,640 Speaker 4: editor for the blog Bioethics Today, so I also want 804 00:45:58,640 --> 00:46:01,120 Speaker 4: to encourage people to explore Bioethics Today if you're curious 805 00:46:01,120 --> 00:46:04,600 Speaker 4: about bioethics. There's all sorts of bioethicists talking about all 806 00:46:04,600 --> 00:46:07,319 Speaker 4: sorts of things in that blog. It's so fascinating. I 807 00:46:07,360 --> 00:46:10,279 Speaker 4: love reading it. I have some of my favorites up there. 808 00:46:11,560 --> 00:46:18,600 Speaker 4: I've also written for them on occasion, shameless plug. The 809 00:46:18,719 --> 00:46:22,920 Speaker 4: other one is, of course, Dorothy Roberts. Doctor Dorothy Roberts 810 00:46:23,000 --> 00:46:26,560 Speaker 4: is a sociologist but also a bioethicist, and she's written 811 00:46:27,239 --> 00:46:29,799 Speaker 4: Killing the Black Body, and she talks a lot about 812 00:46:29,840 --> 00:46:35,480 Speaker 4: maternal mortality and Black maternal health. She is, like, one 813 00:46:35,520 --> 00:46:38,360 Speaker 4: of the chairs here at UPenn in Philadelphia, so 814 00:46:38,600 --> 00:46:41,680 Speaker 4: very, very close. One day I'll cross her path. 815 00:46:41,680 --> 00:46:45,800 Speaker 4: One day I'll be important enough to know her. Harriet Washington. 816 00:46:46,200 --> 00:46:48,839 Speaker 4: A lot of people know Harriet Washington's work because she 817 00:46:48,880 --> 00:46:53,080 Speaker 4: wrote Medical Apartheid, which sort of put Black bioethics conceptually 818 00:46:53,120 --> 00:46:56,200 Speaker 4: on the map, because it was such a best-selling 819 00:46:56,239 --> 00:46:59,320 Speaker 4: book and so many people cite it in, like, African 820 00:46:59,320 --> 00:47:03,600 Speaker 4: American history. But she also does an amazing series of 821 00:47:03,640 --> 00:47:07,360 Speaker 4: writing on things like consent, and so I want to 822 00:47:07,400 --> 00:47:10,360 Speaker 4: encourage people to look her up and to read her other books. 823 00:47:10,440 --> 00:47:14,000 Speaker 4: And she also has a book about environmental health and 824 00:47:14,520 --> 00:47:20,120 Speaker 4: Black disparities in health. So she's really great to 825 00:47:20,160 --> 00:47:24,520 Speaker 4: explore beyond Medical Apartheid, which you should read. And then 826 00:47:24,560 --> 00:47:26,840 Speaker 4: the last one I want to mention is Ruha Benjamin. 827 00:47:27,560 --> 00:47:31,760 Speaker 4: Well, the last two, really. Ruha Benjamin is a really 828 00:47:31,800 --> 00:47:35,360 Speaker 4: great sociologist who talks a lot about inequality in tech.
829 00:47:36,440 --> 00:47:40,320 Speaker 4: She talks a lot about things like algorithms and bias, 830 00:47:40,440 --> 00:47:46,360 Speaker 4: race-based tech and science. She talks a lot about, 831 00:47:46,480 --> 00:47:48,600 Speaker 4: I think she's been exploring more things like 832 00:47:48,640 --> 00:47:51,440 Speaker 4: AI and the disparity of AI, and things like labor 833 00:47:51,840 --> 00:47:56,360 Speaker 4: and labor disparities and labor oppression. And then there's 834 00:47:56,440 --> 00:47:58,480 Speaker 4: Alondra Nelson. And the reason why I bring up 835 00:47:58,520 --> 00:48:00,480 Speaker 4: Alondra Nelson is because I feel like I cite her all 836 00:48:00,480 --> 00:48:04,880 Speaker 4: the time in February, because she wrote the entire book 837 00:48:05,200 --> 00:48:08,280 Speaker 4: Body and Soul about the Black Panthers' health justice work, 838 00:48:09,160 --> 00:48:12,080 Speaker 4: and she just, I mean, end to end, the amount 839 00:48:12,200 --> 00:48:15,320 Speaker 4: of work that the Black Panthers endeavored in health justice 840 00:48:16,000 --> 00:48:17,960 Speaker 4: is not talked about enough, and it makes me so 841 00:48:18,080 --> 00:48:22,000 Speaker 4: ill, because it's like their consummate, like, magnum opus work. 842 00:48:22,719 --> 00:48:24,520 Speaker 4: And we just constantly bring up the fact that they, 843 00:48:24,600 --> 00:48:28,120 Speaker 4: like, marched around in, like, black berets and, like, guns, 844 00:48:28,160 --> 00:48:30,960 Speaker 4: which is fine, like that, you know, militancy is, you know, 845 00:48:31,320 --> 00:48:34,200 Speaker 4: this is not a judgment on that. But the longer 846 00:48:34,239 --> 00:48:37,440 Speaker 4: standing work has been their work in health justice. Some 847 00:48:37,520 --> 00:48:40,440 Speaker 4: of the clinics that they set up in the nineteen 848 00:48:40,480 --> 00:48:45,000 Speaker 4: seventies are still operating today, and they are the reason 849 00:48:45,040 --> 00:48:47,600 Speaker 4: why we have some of the more systemic public health 850 00:48:47,640 --> 00:48:51,239 Speaker 4: designs around things like testing for sickle cell or doing 851 00:48:51,360 --> 00:48:55,120 Speaker 4: lead testing in homes. They designed those systems, and the 852 00:48:55,280 --> 00:48:59,560 Speaker 4: US government federally adopted those designs later on. So Alondra 853 00:48:59,600 --> 00:49:04,279 Speaker 4: Nelson's book, she is a sociologist as well, she 854 00:49:04,560 --> 00:49:06,719 Speaker 4: put it all together, the history of the Black 855 00:49:06,719 --> 00:49:09,960 Speaker 4: Panthers in health justice. So explore her work for sure. 856 00:49:10,640 --> 00:49:11,640 Speaker 1: Trying to get all this down. 857 00:49:14,200 --> 00:49:16,399 Speaker 3: You see me, like, pulling up the names, like, yep, 858 00:49:16,440 --> 00:49:18,600 Speaker 3: gotta get that book, gotta get this book, and gotta 859 00:49:18,600 --> 00:49:22,200 Speaker 3: get this one. You're making us buy books. 860 00:49:22,120 --> 00:49:23,920 Speaker 4: It gets good. 861 00:49:26,920 --> 00:49:29,879 Speaker 3: And you know what, with all of this, again, 862 00:49:30,200 --> 00:49:32,799 Speaker 3: I told you at the very beginning, I'm so impressed 863 00:49:33,000 --> 00:49:35,680 Speaker 3: by the fact that you come into these really 864 00:49:35,760 --> 00:49:41,160 Speaker 3: deep and hard conversations and do it with such eloquence, 865 00:49:41,160 --> 00:49:44,239 Speaker 3: but not only that, but, like, calm.
The calm is what 866 00:49:44,280 --> 00:49:46,640 Speaker 3: I'm most impressed by. I honestly, like, you do it 867 00:49:46,719 --> 00:49:49,080 Speaker 3: as an educator. I'm sure you're like, yeah, this is 868 00:49:49,120 --> 00:49:51,440 Speaker 3: what we do. When I read things like this 869 00:49:51,480 --> 00:49:54,120 Speaker 3: and it feels like injustice, I go on a rampage, 870 00:49:54,480 --> 00:49:58,400 Speaker 3: posting, cursing, maybe my face turning red. And you do 871 00:49:58,440 --> 00:50:01,160 Speaker 3: it in such a way that's so educational, so it's approachable, 872 00:50:01,920 --> 00:50:04,600 Speaker 3: in a way that, even though it takes me a 873 00:50:04,640 --> 00:50:08,480 Speaker 3: few listens to really grasp it, it makes me 874 00:50:08,520 --> 00:50:11,520 Speaker 3: want to learn more. But with that, I know it 875 00:50:11,600 --> 00:50:14,360 Speaker 3: has to be stressful. I know this work is stressful. 876 00:50:14,400 --> 00:50:18,799 Speaker 1: I know this timing is really, really, really stressful. 877 00:50:19,960 --> 00:50:22,239 Speaker 3: So with all of that, we always have to ask: 878 00:50:22,680 --> 00:50:24,880 Speaker 3: what do you do when you need to take a break? 879 00:50:24,960 --> 00:50:27,759 Speaker 3: What is your, I'm not gonna say necessarily, like, 880 00:50:27,800 --> 00:50:28,399 Speaker 3: self care, but. 881 00:50:28,360 --> 00:50:30,440 Speaker 1: How do you detox when you do that? 882 00:50:30,600 --> 00:50:33,920 Speaker 4: I'm a lover and a traveler usually, but 883 00:50:33,960 --> 00:50:38,239 Speaker 4: I can't always afford that; it's highbrow stuff, you know. But 884 00:50:39,640 --> 00:50:43,120 Speaker 4: when I'm really trying to, like you said, detox and 885 00:50:43,200 --> 00:50:47,719 Speaker 4: relax, I really am a huge lover of graphic novels. 886 00:50:48,719 --> 00:50:54,520 Speaker 4: I love reading graphic novels. I love independent presses. 887 00:50:54,640 --> 00:50:58,520 Speaker 4: I love the illustrations. I love all sorts of, like, 888 00:50:58,600 --> 00:51:00,520 Speaker 4: sci-fi based ones. I mean, there's just so much. 889 00:51:00,560 --> 00:51:04,200 Speaker 4: The graphic novel world is so fascinating because, growing up 890 00:51:04,280 --> 00:51:08,160 Speaker 4: as, like, a little Black queer, like, a fab nerd, 891 00:51:09,160 --> 00:51:11,560 Speaker 4: one of the first places I could see myself, like, 892 00:51:11,680 --> 00:51:14,520 Speaker 4: reflected as, like, an interesting character was actually in comic 893 00:51:14,600 --> 00:51:19,200 Speaker 4: books, and maybe not specifically in all those ways. 894 00:51:19,239 --> 00:51:21,960 Speaker 4: But the first time I saw queer people as, like, 895 00:51:23,160 --> 00:51:27,880 Speaker 4: interesting, dynamic, not just villains and, like, coded villains, was, 896 00:51:27,920 --> 00:51:33,040 Speaker 4: like, in comic books, especially independent ones, and it's only 897 00:51:33,040 --> 00:51:35,439 Speaker 4: gotten better since then. I would say that a lot 898 00:51:35,480 --> 00:51:38,840 Speaker 4: of the comic book world, weirdly enough, is, like, twenty 899 00:51:38,920 --> 00:51:41,520 Speaker 4: years ahead of us when it comes to, like, perception 900 00:51:41,800 --> 00:51:46,120 Speaker 4: and how we display certain groups.
That's not to say 901 00:51:46,120 --> 00:51:48,319 Speaker 4: that there's not comic book artists and writers who are, 902 00:51:48,360 --> 00:51:54,560 Speaker 4: like, absolute jerks, but you can find your story, and 903 00:51:54,680 --> 00:51:57,440 Speaker 4: I could. I could find my story pretty early on 904 00:51:57,520 --> 00:51:59,920 Speaker 4: in comic books, and I just never stopped loving and 905 00:52:00,000 --> 00:52:03,080 Speaker 4: exploring them. So usually when I'm, like, done with 906 00:52:03,160 --> 00:52:08,960 Speaker 4: the day, I like to go through my very extensive 907 00:52:09,000 --> 00:52:13,800 Speaker 4: and embarrassingly, like, organized graphic novel and comic book collection 908 00:52:13,920 --> 00:52:17,840 Speaker 4: and, like, reread things. And then I also sometimes like 909 00:52:17,880 --> 00:52:20,120 Speaker 4: to read things that are more amorphous, or, like, more, 910 00:52:21,640 --> 00:52:25,080 Speaker 4: like, I love culture. I love learning about different groups 911 00:52:25,120 --> 00:52:29,120 Speaker 4: and populations and history. And so, like, right now I'm 912 00:52:29,160 --> 00:52:32,719 Speaker 4: currently reading a book about heaven and hell and the 913 00:52:32,760 --> 00:52:36,120 Speaker 4: way that heaven and hell have been imagined across the world, 914 00:52:37,120 --> 00:52:39,799 Speaker 4: and it says a lot about the human condition, 915 00:52:39,880 --> 00:52:45,440 Speaker 4: how we imagine heaven and hell, or paradise and purgatory, 916 00:52:45,600 --> 00:52:49,520 Speaker 4: and it's been very fascinating. Pretty quickly, you. 917 00:52:49,760 --> 00:52:52,640 Speaker 3: Fit into our world. Annie, see, I'm, like, over here chomping 918 00:52:52,640 --> 00:52:55,200 Speaker 3: at the bit, because we love a good comic, a graphic 919 00:52:55,680 --> 00:52:57,200 Speaker 3: novel. I love it. 920 00:52:57,560 --> 00:53:02,400 Speaker 2: We do. And Samantha knows a lot about religions, so it fits. 921 00:53:02,719 --> 00:53:05,640 Speaker 1: Like I said, once a month, every week, come on. The way that 922 00:53:05,600 --> 00:53:07,520 Speaker 4: we could sit here and talk about what I just 923 00:53:07,640 --> 00:53:10,280 Speaker 4: learned about, which was infernal cartography. 924 00:53:10,520 --> 00:53:16,680 Speaker 3: Please, please. Okay, no, we don't have time. Stop. Why? 925 00:53:16,920 --> 00:53:19,000 Speaker 3: So that means we get to have... 926 00:53:23,600 --> 00:53:26,839 Speaker 2: Yes, you should definitely come back. We would, oh gosh. 927 00:53:27,080 --> 00:53:28,160 Speaker 1: I would talk about it every. 928 00:53:28,080 --> 00:53:31,280 Speaker 4: Day, every day. 929 00:53:32,680 --> 00:53:37,680 Speaker 2: Yeah, we'll negotiate it later. But in the meantime, 930 00:53:37,920 --> 00:53:40,200 Speaker 2: where can the good listeners find you? 931 00:53:40,560 --> 00:53:44,440 Speaker 4: They can find me on TikTok. I am Evan the bioethicist, 932 00:53:44,440 --> 00:53:47,719 Speaker 4: E v n the bioethicist, on TikTok. I come up 933 00:53:47,800 --> 00:53:49,960 Speaker 4: right away if you google bioethics. I think I'm the 934 00:53:50,000 --> 00:53:54,360 Speaker 4: only person that uses that hashtag, sadness. And then the 935 00:53:54,400 --> 00:53:56,360 Speaker 4: other place that I've recently started to build up is 936 00:53:56,400 --> 00:53:58,960 Speaker 4: a Patreon. There is a free side of it, like 937 00:53:58,960 --> 00:54:01,880 Speaker 4: a free membership. I don't like paid barriers.
There is 938 00:54:01,920 --> 00:54:05,200 Speaker 4: a five-dollar supporter membership that people can get into 939 00:54:05,600 --> 00:54:10,839 Speaker 4: if they want to get, like, longer-form, more, like, 940 00:54:11,000 --> 00:54:13,600 Speaker 4: curated content that I'm working on now. It's not up 941 00:54:13,640 --> 00:54:15,399 Speaker 4: there yet, but I'm working on it now, and it will 942 00:54:15,400 --> 00:54:18,760 Speaker 4: be behind that five-dollars-a-month little minimum paywall, 943 00:54:18,760 --> 00:54:22,200 Speaker 4: and that's mainly to pay for trying to produce, like, 944 00:54:22,280 --> 00:54:25,320 Speaker 4: higher-quality content, more than me just talking at my phone. 945 00:54:26,239 --> 00:54:29,120 Speaker 4: If people aren't into TikTok, I just repost the TikToks 946 00:54:29,120 --> 00:54:31,799 Speaker 4: to the free membership section of Patreon. A lot of 947 00:54:31,800 --> 00:54:35,400 Speaker 4: people like that better, because sometimes people don't like, you know, 948 00:54:36,160 --> 00:54:38,840 Speaker 4: TikTok's a blessing and a curse, and when things go viral, 949 00:54:38,960 --> 00:54:42,240 Speaker 4: you can end up with a lot of really terrible 950 00:54:42,360 --> 00:54:45,440 Speaker 4: vibes in the comments section and what have you. So 951 00:54:46,360 --> 00:54:50,360 Speaker 4: a lot of my softer-hearted, I want to say, 952 00:54:50,480 --> 00:54:54,280 Speaker 4: followers, who want less of that static noise that TikTok 953 00:54:54,320 --> 00:54:56,960 Speaker 4: can produce, like to go to Patreon and interact with 954 00:54:57,000 --> 00:54:59,560 Speaker 4: Patreon instead, because those are people who are intentionally there, 955 00:54:59,680 --> 00:55:02,360 Speaker 4: you know. So find either one. I'm the same on 956 00:55:02,440 --> 00:55:03,439 Speaker 4: both platforms. 957 00:55:04,080 --> 00:55:07,000 Speaker 2: Oh yes, I'm going to check that out. Listeners, you 958 00:55:07,000 --> 00:55:09,799 Speaker 2: should check it out as well. Thank you so much, 959 00:55:09,880 --> 00:55:13,000 Speaker 2: Evan, for joining us today. It's been a delight, and yes, 960 00:55:13,160 --> 00:55:15,120 Speaker 2: come back, come back anytime. 961 00:55:15,680 --> 00:55:18,080 Speaker 4: I love it. I'm interested. We could talk about the 962 00:55:18,520 --> 00:55:21,880 Speaker 4: non-bioethics or professional things too. I'm into it. But 963 00:55:21,920 --> 00:55:22,799 Speaker 4: thanks for having me. 964 00:55:22,920 --> 00:55:24,920 Speaker 3: Yes, we'll just have an Evan corner, like, where you 965 00:55:24,960 --> 00:55:30,359 Speaker 3: come in every so often, a lot actually. I 966 00:55:30,400 --> 00:55:30,799 Speaker 3: love it. 967 00:55:32,200 --> 00:55:33,840 Speaker 2: But listeners, in the meantime, if you would like to 968 00:55:33,840 --> 00:55:36,000 Speaker 2: contact us, you can email us at hello 969 00:55:36,000 --> 00:55:38,439 Speaker 2: at stuff mom never told you dot com. You can find 970 00:55:38,480 --> 00:55:40,719 Speaker 2: us on Blue Sky at mom stuff podcast, or 971 00:55:40,760 --> 00:55:43,200 Speaker 2: on Instagram and TikTok at stuff mom never told you. 972 00:55:43,320 --> 00:55:45,000 Speaker 2: We're also on YouTube, we have a TeePublic store, 973 00:55:45,040 --> 00:55:46,560 Speaker 2: and we have a book you can get wherever you get 974 00:55:46,560 --> 00:55:49,400 Speaker 2: your books.
Thanks as always to our super producer Christina, our 975 00:55:49,440 --> 00:55:51,480 Speaker 2: executive producer Maya, and our contributor Joey. 976 00:55:51,719 --> 00:55:53,800 Speaker 1: Thank you, and thanks to you for listening to Stuff Mom 977 00:55:53,920 --> 00:55:54,279 Speaker 1: Never Told You. 978 00:55:54,320 --> 00:55:56,000 Speaker 2: It's a production of iHeartRadio. For more podcasts from 979 00:55:56,080 --> 00:55:57,680 Speaker 2: iHeartRadio, you can check out the iHeartRadio app, 980 00:55:57,680 --> 00:55:59,640 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows. 981 00:56:04,160 --> 00:56:04,279 Speaker 3: Yeah.