1 00:00:10,840 --> 00:00:14,520 Speaker 1: Welcome to the Therapy for Black Girls Podcast, a weekly 2 00:00:14,560 --> 00:00:19,320 Speaker 1: conversation about mental health, personal development, and all the small 3 00:00:19,360 --> 00:00:22,520 Speaker 1: decisions we can make to become the best possible versions 4 00:00:22,520 --> 00:00:26,640 Speaker 1: of ourselves. I'm your host, doctor Joy Harden Bradford, 5 00:00:27,000 --> 00:00:32,080 Speaker 1: a licensed psychologist in Atlanta, Georgia. For more information or 6 00:00:32,200 --> 00:00:35,600 Speaker 1: to find a therapist in your area, visit our website 7 00:00:35,720 --> 00:00:39,440 Speaker 1: at Therapy for Blackgirls dot com. While I hope you 8 00:00:39,479 --> 00:00:43,479 Speaker 1: love listening to and learning from the podcast, it is 9 00:00:43,520 --> 00:00:46,440 Speaker 1: not meant to be a substitute for a relationship with 10 00:00:46,479 --> 00:00:57,480 Speaker 1: a licensed mental health professional. Hey, y'all, thanks so much 11 00:00:57,520 --> 00:00:59,680 Speaker 1: for joining us for session four thirty eight of the 12 00:00:59,680 --> 00:01:02,760 Speaker 1: Therapy for Black Girls Podcast. We'll get right into our 13 00:01:02,800 --> 00:01:14,840 Speaker 1: conversation after a word from our sponsors. As we head into 14 00:01:14,840 --> 00:01:17,760 Speaker 1: the holidays, many of us will encounter old memories and 15 00:01:17,800 --> 00:01:21,119 Speaker 1: create new ones. You may spend time recounting stories from 16 00:01:21,120 --> 00:01:24,480 Speaker 1: your childhood, learning a new line dance, or even taking 17 00:01:24,560 --> 00:01:27,679 Speaker 1: orders in the kitchen on how to make a family recipe.
18 00:01:27,840 --> 00:01:30,600 Speaker 1: Memory keeping has long been a practice for humans across 19 00:01:30,600 --> 00:01:34,120 Speaker 1: the globe, but for Black people, the traditions look different, 20 00:01:34,680 --> 00:01:38,360 Speaker 1: and in the age of technology, the way memories are created, stored, 21 00:01:38,480 --> 00:01:41,560 Speaker 1: and used introduces a new set of questions around who 22 00:01:41,600 --> 00:01:44,800 Speaker 1: gets to call them their own. Today, I'm excited to 23 00:01:44,800 --> 00:01:48,160 Speaker 1: be joined by doctor Tanya Sutherland, currently a professor and 24 00:01:48,240 --> 00:01:51,800 Speaker 1: dean at UCLA. She has dedicated her research to unpacking 25 00:01:51,800 --> 00:01:54,960 Speaker 1: the uniqueness of black memory work, and in her book 26 00:01:55,320 --> 00:01:59,280 Speaker 1: Resurrecting the Black Body: Race and the Digital Afterlife, she 27 00:01:59,360 --> 00:02:03,280 Speaker 1: digs into how technology, history, and data longevity affect how 28 00:02:03,280 --> 00:02:08,320 Speaker 1: we practice archivism and how those practices impact our digital afterlives. 29 00:02:08,919 --> 00:02:12,320 Speaker 1: If something resonates with you while enjoying our conversation, please 30 00:02:12,320 --> 00:02:15,280 Speaker 1: share with us on social media using the hashtag TBG 31 00:02:15,440 --> 00:02:18,760 Speaker 1: in session, or join us over on our Patreon to 32 00:02:18,840 --> 00:02:21,320 Speaker 1: talk more about the episode. You can join us at 33 00:02:21,360 --> 00:02:25,200 Speaker 1: community dot therapy for Blackgirls dot com. Here's our conversation. 34 00:02:28,760 --> 00:02:31,120 Speaker 1: Thank you so much for joining us today, doctor Sutherland. 35 00:02:31,600 --> 00:02:34,959 Speaker 2: It is an absolute pleasure to be here, and it's Tanya. Please. 36 00:02:35,480 --> 00:02:37,920 Speaker 1: Oh, Tanya, got it? Got it?
So I wonder if 37 00:02:37,919 --> 00:02:40,480 Speaker 1: you can start by just giving us a definition of 38 00:02:40,520 --> 00:02:43,000 Speaker 1: your work. So what do you mean when you talk 39 00:02:43,040 --> 00:02:48,000 Speaker 1: about black memory work and black digital afterlives? 40 00:02:48,080 --> 00:02:51,440 Speaker 2: Wow, thank you. What an important question and a perfect place 41 00:02:51,520 --> 00:02:55,639 Speaker 2: to begin. When we talk about digital afterlives, we're really 42 00:02:55,760 --> 00:03:01,119 Speaker 2: talking about the data that's collected by us, for us, 43 00:03:01,600 --> 00:03:07,440 Speaker 2: about us. That includes anything from our social media, to 44 00:03:07,800 --> 00:03:13,880 Speaker 2: our email, to our communications such as text messages, instant messages, right, 45 00:03:14,080 --> 00:03:17,399 Speaker 2: all of this content that we're producing in our own 46 00:03:17,440 --> 00:03:21,880 Speaker 2: personal lives, and then actual data, bits and bytes. When 47 00:03:21,880 --> 00:03:26,200 Speaker 2: you take a photograph, the metadata from that gets collected 48 00:03:26,320 --> 00:03:28,600 Speaker 2: on your phone. So if you have an iPhone, right, 49 00:03:28,680 --> 00:03:32,359 Speaker 2: that information then goes to Apple. They're collecting that data 50 00:03:32,360 --> 00:03:35,960 Speaker 2: and that metadata. They're collecting the time of the photograph, 51 00:03:36,120 --> 00:03:40,520 Speaker 2: the location of the photograph, some other information 52 00:03:40,840 --> 00:03:45,240 Speaker 2: about the image itself, and increasingly with the use of AI, 53 00:03:46,320 --> 00:03:51,680 Speaker 2: that includes maybe some sort of description about what might 54 00:03:51,720 --> 00:03:54,240 Speaker 2: appear in the image. This is sort of a new 55 00:03:54,280 --> 00:03:58,280 Speaker 2: added layer of what is being collected and noted.
And 56 00:03:58,320 --> 00:04:01,760 Speaker 2: then when you pass away, the question becomes what is 57 00:04:01,800 --> 00:04:05,400 Speaker 2: done with all of this data? And so in many 58 00:04:05,440 --> 00:04:10,200 Speaker 2: ways we have what I have termed digital afterlife, where 59 00:04:10,240 --> 00:04:13,960 Speaker 2: all of this data and our digital detritus lives on 60 00:04:14,400 --> 00:04:17,600 Speaker 2: after we do, and we have very little say in 61 00:04:17,720 --> 00:04:20,599 Speaker 2: terms of what is done with that data, how it's used, 62 00:04:20,640 --> 00:04:23,960 Speaker 2: how it might be manipulated. There are, as we know 63 00:04:24,120 --> 00:04:28,880 Speaker 2: now, digital resurrection and other kinds of digital immortality practices. 64 00:04:29,360 --> 00:04:31,960 Speaker 2: All of these things get tied up in what we 65 00:04:32,080 --> 00:04:35,760 Speaker 2: mean when we say a digital afterlife. When I talk 66 00:04:35,800 --> 00:04:38,560 Speaker 2: about Black Memory Work, I'm looking at all of the 67 00:04:38,560 --> 00:04:43,280 Speaker 2: potential harms that can be served up by a digital afterlife, 68 00:04:43,720 --> 00:04:47,280 Speaker 2: and I'm looking to Black memory work, which really describes 69 00:04:47,360 --> 00:04:53,640 Speaker 2: a long history, a robust history, of practices within black communities, 70 00:04:54,120 --> 00:05:00,920 Speaker 2: diasporic communities across the globe, that are about preserving memory 71 00:05:02,520 --> 00:05:07,919 Speaker 2: in ways that make sense to us, and in some ways, 72 00:05:08,160 --> 00:05:11,960 Speaker 2: I think about Black Memory Work as a corrective or 73 00:05:12,800 --> 00:05:18,040 Speaker 2: as an alternative path to a digital afterlife, something 74 00:05:18,040 --> 00:05:21,440 Speaker 2: more intentional, right, something that we craft for ourselves.
75 00:05:22,000 --> 00:05:24,599 Speaker 1: Can you talk about like what inspired you and like 76 00:05:24,680 --> 00:05:27,280 Speaker 1: what interested you in this work and how it inspired 77 00:05:27,320 --> 00:05:28,360 Speaker 1: the writing of your book. 78 00:05:28,720 --> 00:05:31,080 Speaker 2: I had been thinking about this in some way, shape 79 00:05:31,160 --> 00:05:34,599 Speaker 2: or form since my childhood. I was distressed as a 80 00:05:34,680 --> 00:05:38,520 Speaker 2: child to see the Save the Children campaign. Do you 81 00:05:38,520 --> 00:05:45,120 Speaker 2: remember this? On television, commercials on evening TV. There would 82 00:05:45,120 --> 00:05:49,760 Speaker 2: be pictures of ostensibly starving children, and what they would show, 83 00:05:49,839 --> 00:05:54,200 Speaker 2: it was always black children. They always had very distended bellies. 84 00:05:54,600 --> 00:06:00,800 Speaker 2: What we would now probably refer to colloquially as poverty porn. Right, 85 00:06:01,080 --> 00:06:04,560 Speaker 2: we were looking at black babies in distress, and as 86 00:06:04,600 --> 00:06:07,640 Speaker 2: a child, I remember being struck by how it only 87 00:06:07,680 --> 00:06:10,840 Speaker 2: ever was kids who looked like me, and it didn't 88 00:06:10,880 --> 00:06:12,760 Speaker 2: make sense to me. As a child, I didn't have 89 00:06:12,800 --> 00:06:17,520 Speaker 2: a language or reference points really to understand what I 90 00:06:17,600 --> 00:06:19,839 Speaker 2: was seeing, or to make sense of it, or even 91 00:06:19,880 --> 00:06:24,400 Speaker 2: to sort of contend with it.
And then when Hurricane 92 00:06:24,480 --> 00:06:28,960 Speaker 2: Katrina happened, I was immediately taken back to those pictures, 93 00:06:29,760 --> 00:06:32,960 Speaker 2: to those commercials, to those ads, to that entire ad campaign, 94 00:06:33,720 --> 00:06:38,600 Speaker 2: and I thought, we are seeing dead people who look 95 00:06:38,760 --> 00:06:43,640 Speaker 2: like me on television once again, now on the internet 96 00:06:43,680 --> 00:06:48,040 Speaker 2: as well, in the overflowing waters of Lake Pontchartrain. Right, 97 00:06:48,240 --> 00:06:51,400 Speaker 2: we are once again looking at images of black people's 98 00:06:51,400 --> 00:06:55,240 Speaker 2: bodies in distress, deceased, and it's being presented to us 99 00:06:55,320 --> 00:06:59,520 Speaker 2: as something that we should take in without, again, any real context. 100 00:07:00,279 --> 00:07:04,040 Speaker 2: We aren't given tools with which to process what we're seeing. 101 00:07:04,720 --> 00:07:08,479 Speaker 2: And I thought, there's got to be a reason that 102 00:07:08,560 --> 00:07:12,240 Speaker 2: it's always people who look like me, and that reason 103 00:07:12,440 --> 00:07:15,520 Speaker 2: can't just be racism. What else is at play here? 104 00:07:15,880 --> 00:07:17,920 Speaker 2: And so that sort of became the center of the 105 00:07:17,920 --> 00:07:21,320 Speaker 2: inquiry for my book Resurrecting the Black Body, which is about 106 00:07:21,440 --> 00:07:25,920 Speaker 2: race and the digital afterlife and really explores these questions 107 00:07:25,920 --> 00:07:26,400 Speaker 2: in depth. 108 00:07:26,920 --> 00:07:28,960 Speaker 1: And what are some of the most common practices of 109 00:07:28,960 --> 00:07:31,040 Speaker 1: black memory work that we may not even realize that 110 00:07:31,080 --> 00:07:31,840 Speaker 1: we're practicing. 111 00:07:32,320 --> 00:07:36,840 Speaker 2: Oh my gosh, I love this question.
So we are 112 00:07:37,120 --> 00:07:40,000 Speaker 2: very creative. When I say we right now, I'm talking 113 00:07:40,000 --> 00:07:44,000 Speaker 2: about the Black Memory Collective, which is a small community 114 00:07:44,040 --> 00:07:48,040 Speaker 2: based organization that I'm the founder and director of. And 115 00:07:48,320 --> 00:07:51,080 Speaker 2: when we talk about black memory work, the kinds of 116 00:07:51,120 --> 00:07:54,640 Speaker 2: practices that we first started talking about and seeing were 117 00:07:54,680 --> 00:07:59,440 Speaker 2: things like quilts, right. We understand memory to 118 00:07:59,480 --> 00:08:03,040 Speaker 2: be a braid in someone's hair as a route to freedom. 119 00:08:03,360 --> 00:08:07,240 Speaker 2: We understand black memory work to be quilting. We understand 120 00:08:07,280 --> 00:08:10,280 Speaker 2: black memory work in a way that other people might 121 00:08:10,320 --> 00:08:15,040 Speaker 2: not recognize it. Other people might call someone a hoarder, 122 00:08:15,200 --> 00:08:17,679 Speaker 2: for example. And that's not to say that there aren't 123 00:08:17,880 --> 00:08:20,680 Speaker 2: issues around hoarding, but it is to say that when 124 00:08:20,720 --> 00:08:26,600 Speaker 2: you have been dispossessed of your physical belongings repeatedly over generations, 125 00:08:27,560 --> 00:08:31,800 Speaker 2: there's something to that holding on to things. Right, that 126 00:08:32,040 --> 00:08:37,600 Speaker 2: is about a memory process, a memory ritual, a memory practice. Right, 127 00:08:38,559 --> 00:08:43,720 Speaker 2: if you're not out here hoarding or braiding hair or quilting, 128 00:08:43,760 --> 00:08:46,760 Speaker 2: don't think you're not doing black memory work. Certainly for 129 00:08:46,840 --> 00:08:49,880 Speaker 2: our photographs and the things that we might find in 130 00:08:49,920 --> 00:08:53,520 Speaker 2: the boxes under our beds, that's black memory work.
And 131 00:08:53,559 --> 00:08:57,880 Speaker 2: then in an even more expansive way, we are finding 132 00:08:57,880 --> 00:09:00,720 Speaker 2: that the way that folks show up for one another 133 00:09:00,720 --> 00:09:04,840 Speaker 2: in community is also about a memory practice or a 134 00:09:04,880 --> 00:09:08,520 Speaker 2: memory ritual. It's about building memory, and that memory then 135 00:09:08,720 --> 00:09:13,080 Speaker 2: gets held in community. And I think that's a lot 136 00:09:13,080 --> 00:09:15,120 Speaker 2: of it, right, just the ways that we are showing 137 00:09:15,160 --> 00:09:18,439 Speaker 2: up in community and asking one another to hold things 138 00:09:18,720 --> 00:09:21,840 Speaker 2: for ourselves, for our communities, all of that, all of 139 00:09:21,840 --> 00:09:23,360 Speaker 2: that counts as Black memory work. 140 00:09:24,080 --> 00:09:26,000 Speaker 1: You know, as you're talking to me, I'm thinking about 141 00:09:26,040 --> 00:09:30,120 Speaker 1: this large black leather purse that my grandmother had that 142 00:09:30,240 --> 00:09:32,600 Speaker 1: had all of these photos, and every time I would 143 00:09:32,640 --> 00:09:35,360 Speaker 1: go home for like holidays, like I'm going straight to 144 00:09:35,400 --> 00:09:38,000 Speaker 1: this purse to like ask my aunts and uncles like 145 00:09:38,040 --> 00:09:41,000 Speaker 1: tell me about this picture? Who was this? And really 146 00:09:41,000 --> 00:09:44,320 Speaker 1: wanting to like understand like why there was this collection 147 00:09:44,440 --> 00:09:48,280 Speaker 1: of photos. It also makes me think about how important 148 00:09:48,280 --> 00:09:51,240 Speaker 1: it was to my grandmother that anytime she went to 149 00:09:51,280 --> 00:09:54,000 Speaker 1: a funeral, she had to have a program, right.
And 150 00:09:54,000 --> 00:09:56,120 Speaker 1: then would come back from the funeral and like put 151 00:09:56,160 --> 00:09:57,680 Speaker 1: it in the back of her bible. Right, And so 152 00:09:57,760 --> 00:10:00,400 Speaker 1: it feels like as a culture, like as a community, 153 00:10:00,440 --> 00:10:02,800 Speaker 1: it feels like we have been doing this black memory 154 00:10:02,800 --> 00:10:04,240 Speaker 1: work even if we weren't calling it that. 155 00:10:04,679 --> 00:10:07,520 Speaker 2: That's exactly right, that's exactly right. It's in our art. 156 00:10:07,800 --> 00:10:11,760 Speaker 2: It's literally woven into the fabric of our culture. Yeah, 157 00:10:11,280 --> 00:10:14,000 Speaker 2: I love it. I 158 00:10:14,080 --> 00:10:16,400 Speaker 2: just yeah, like you have to go on a big 159 00:10:16,480 --> 00:10:19,160 Speaker 2: black bag full of black memory. 160 00:10:19,520 --> 00:10:21,480 Speaker 1: Yeah, I need to figure out where that purse is 161 00:10:21,520 --> 00:10:23,720 Speaker 1: so that that doesn't get like displaced, and like you know, 162 00:10:23,720 --> 00:10:25,640 Speaker 1: we always kind of have an eye on where that 163 00:10:25,800 --> 00:10:27,240 Speaker 1: is and who has ownership of it. 164 00:10:27,760 --> 00:10:28,559 Speaker 1: So what are some of 165 00:10:28,480 --> 00:10:32,160 Speaker 1: the differences between black memory practices and maybe more widespread 166 00:10:32,280 --> 00:10:34,280 Speaker 1: like western archival practices. 167 00:10:34,760 --> 00:10:37,240 Speaker 2: Thank you so much for asking this question.
So, in 168 00:10:37,320 --> 00:10:40,040 Speaker 2: a brick and mortar archives, what you will find is 169 00:10:40,120 --> 00:10:44,600 Speaker 2: that there are papers, and those papers go in boxes, 170 00:10:44,640 --> 00:10:48,800 Speaker 2: and those boxes go on shelves, and there are processes 171 00:10:49,000 --> 00:10:52,679 Speaker 2: through which things may cross the archival threshold or enter 172 00:10:52,760 --> 00:10:58,280 Speaker 2: the archives. There are high level descriptive practices, high level 173 00:10:58,280 --> 00:11:03,600 Speaker 2: discovery practices, high level access practices that really are about 174 00:11:03,679 --> 00:11:07,800 Speaker 2: ensuring the longevity of those materials and the condition of 175 00:11:07,840 --> 00:11:13,079 Speaker 2: them in that very specific way. And for the most part, 176 00:11:13,160 --> 00:11:16,199 Speaker 2: the brick and mortar archives that exist in the United 177 00:11:16,240 --> 00:11:22,640 Speaker 2: States were created to document the dominant culture. Right, the 178 00:11:22,679 --> 00:11:25,760 Speaker 2: ways that we tend to show up in those archives 179 00:11:25,920 --> 00:11:32,679 Speaker 2: are in newspapers, in databases of runaway slave ads. The 180 00:11:32,720 --> 00:11:38,440 Speaker 2: institutions that hold those archives have not traditionally made it 181 00:11:38,559 --> 00:11:45,040 Speaker 2: their agenda, their goal, or their policy to document black life. 182 00:11:45,440 --> 00:11:48,960 Speaker 2: And if they do, what they're documenting is the Save 183 00:11:49,040 --> 00:11:53,320 Speaker 2: the Children campaign and the bodies in Lake Pontchartrain. They're 184 00:11:53,360 --> 00:12:00,640 Speaker 2: documenting trauma, death, and dispossession. Black memory work is the exact 185 00:12:00,720 --> 00:12:03,199 Speaker 2: opposite of that. It starts with us, It starts from 186 00:12:03,240 --> 00:12:06,920 Speaker 2: the bottom. It holds us.
It is generative, it shifts, 187 00:12:07,440 --> 00:12:11,400 Speaker 2: it doesn't have to stay static. There's no demand that 188 00:12:11,520 --> 00:12:17,560 Speaker 2: it be accessible broadly, widely accessible. In so many ways, 189 00:12:17,640 --> 00:12:22,360 Speaker 2: they are side by side practices that almost have nothing 190 00:12:22,440 --> 00:12:26,360 Speaker 2: to do with one another. Right, because the goals aren't 191 00:12:26,360 --> 00:12:29,360 Speaker 2: the same, the practices and the policies aren't the same, 192 00:12:29,679 --> 00:12:33,520 Speaker 2: they can speak to one another, they could influence one another, certainly. 193 00:12:34,000 --> 00:12:38,439 Speaker 2: Black memory workers who call themselves such often are trained 194 00:12:38,480 --> 00:12:43,240 Speaker 2: as archivists. So they come with the knowledge of how 195 00:12:43,280 --> 00:12:47,040 Speaker 2: to preserve materials. What they're doing with that knowledge isn't 196 00:12:47,080 --> 00:12:49,360 Speaker 2: just preserving materials, putting them in a box on a 197 00:12:49,400 --> 00:12:51,840 Speaker 2: shelf and saying, hey, come and visit my brick and 198 00:12:51,880 --> 00:12:55,360 Speaker 2: mortar archives, to visit your stuff. They're saying, I know 199 00:12:55,440 --> 00:12:58,640 Speaker 2: how to preserve this. Let me show you, let me 200 00:12:58,760 --> 00:13:01,240 Speaker 2: teach you, let me help you. And so it's also 201 00:13:01,520 --> 00:13:05,960 Speaker 2: a knowledge practice in a way, and a generational one, 202 00:13:06,520 --> 00:13:10,520 Speaker 2: in a different way than what institutional brick and mortar archives 203 00:13:10,920 --> 00:13:13,360 Speaker 2: like the National Archives or even the Library of Congress, 204 00:13:13,520 --> 00:13:17,920 Speaker 2: or, I mean, even your local repository or university archives 205 00:13:17,960 --> 00:13:20,440 Speaker 2: may hold.
They are set about to do sort of 206 00:13:20,480 --> 00:13:21,160 Speaker 2: different things. 207 00:13:21,640 --> 00:13:23,720 Speaker 1: And what do you feel like is the goal of 208 00:13:23,920 --> 00:13:26,560 Speaker 1: black memory work then? Is it a different goal, like 209 00:13:26,600 --> 00:13:27,880 Speaker 1: you said, than archiving? 210 00:13:29,160 --> 00:13:33,480 Speaker 2: Yeah, I think what I've observed, what I've heard people say, 211 00:13:34,040 --> 00:13:38,680 Speaker 2: is that we need to preserve our memory. No one 212 00:13:38,720 --> 00:13:41,080 Speaker 2: else is doing that for us, and we're going to 213 00:13:41,120 --> 00:13:44,480 Speaker 2: do it in a way that is intentionally black, and 214 00:13:44,520 --> 00:13:48,360 Speaker 2: so that shows up differently, right, We show up differently 215 00:13:48,480 --> 00:13:52,199 Speaker 2: in intentionally black spaces. The work is done differently in 216 00:13:52,200 --> 00:13:55,840 Speaker 2: intentionally black spaces, and I want to be really clear 217 00:13:55,880 --> 00:14:00,160 Speaker 2: that not everyone who is a black memory worker is trained, right? Yeah, 218 00:14:00,160 --> 00:14:03,679 Speaker 2: and your grandmother is a black memory worker, that bag, 219 00:14:04,080 --> 00:14:06,960 Speaker 2: that's black memory work. So it doesn't matter that she 220 00:14:06,960 --> 00:14:09,560 Speaker 2: didn't have a master's degree in library and information studies 221 00:14:09,640 --> 00:14:12,760 Speaker 2: or science, right. I assume she didn't, but anyway, 222 00:14:14,640 --> 00:14:17,400 Speaker 2: neither of my grandmothers did either, and I would absolutely 223 00:14:17,440 --> 00:14:19,440 Speaker 2: call both of them black memory workers. I think the 224 00:14:19,480 --> 00:14:23,400 Speaker 2: purpose and the goal for us is about preservation of memory.
225 00:14:23,640 --> 00:14:27,480 Speaker 2: It's also about preservation of cultural heritage, and there's a 226 00:14:27,520 --> 00:14:31,359 Speaker 2: certain protective layer, I want to say, for future generations 227 00:14:31,400 --> 00:14:32,920 Speaker 2: that's very intentional there too. 228 00:14:33,560 --> 00:14:46,600 Speaker 1: More from our conversation after the break. And so is it, Tanya, 229 00:14:46,680 --> 00:14:48,800 Speaker 1: a more personal kind of like, Okay, it is for 230 00:14:49,000 --> 00:14:52,680 Speaker 1: my family to preserve the history of our family, not 231 00:14:52,720 --> 00:14:55,880 Speaker 1: necessarily for me to learn about your family, because that's 232 00:14:55,920 --> 00:14:58,360 Speaker 1: kind of what it feels like. It's like, oh, with archiving, 233 00:14:58,520 --> 00:15:00,680 Speaker 1: like you can go visit like a library, a museum 234 00:15:00,720 --> 00:15:03,360 Speaker 1: and like learn more. This feels like it is a 235 00:15:03,400 --> 00:15:07,120 Speaker 1: more personal thing maybe for your community. Well definitely your 236 00:15:07,160 --> 00:15:09,800 Speaker 1: direct family, but maybe also your community that is still 237 00:15:09,840 --> 00:15:11,000 Speaker 1: like close to the family. 238 00:15:11,800 --> 00:15:14,440 Speaker 2: Absolutely, I would say that's true. And I would also 239 00:15:14,520 --> 00:15:18,520 Speaker 2: say that in terms of concentric circles of community and 240 00:15:18,560 --> 00:15:21,400 Speaker 2: how we build and how we live and work with 241 00:15:21,440 --> 00:15:26,080 Speaker 2: one another, there are many sort of points of possible 242 00:15:26,160 --> 00:15:31,400 Speaker 2: or potential interaction with other people's historical memories. Right. I 243 00:15:31,480 --> 00:15:35,200 Speaker 2: know that Ancestry has become a very popular site.
I 244 00:15:35,240 --> 00:15:37,800 Speaker 2: have my own issues with it, but part of what 245 00:15:37,880 --> 00:15:41,600 Speaker 2: it's done is to help people who otherwise couldn't even 246 00:15:41,720 --> 00:15:44,960 Speaker 2: make those family connections be able to make those family connections. 247 00:15:45,400 --> 00:15:47,880 Speaker 2: So I think Black memory work serves a similar function 248 00:15:47,960 --> 00:15:52,240 Speaker 2: as well, inasmuch as, you know, you're telling a story 249 00:15:52,720 --> 00:15:55,160 Speaker 2: and you mention a name, and the next thing you know, 250 00:15:55,440 --> 00:15:58,600 Speaker 2: it's, oh, I have a picture of him. And it's 251 00:15:58,680 --> 00:16:02,200 Speaker 2: through that storytelling, which is also a really key element of 252 00:16:02,240 --> 00:16:06,080 Speaker 2: Black memory work. As we are telling one 253 00:16:06,120 --> 00:16:09,040 Speaker 2: another's stories, as we're telling our own family stories, 254 00:16:09,080 --> 00:16:11,800 Speaker 2: all of those things that we hold in collective, 255 00:16:12,320 --> 00:16:16,200 Speaker 2: I think, make it more communal, more community based and 256 00:16:16,240 --> 00:16:21,960 Speaker 2: community oriented. And also, if there is a desire to 257 00:16:22,040 --> 00:16:27,240 Speaker 2: do long term preservation work, that requires resources, and it's 258 00:16:27,680 --> 00:16:31,960 Speaker 2: generally going to be easier to do that in community. Right. So, 259 00:16:31,960 --> 00:16:34,120 Speaker 2: for example, one of the things that the Black Memory 260 00:16:34,120 --> 00:16:38,640 Speaker 2: Collective in Los Angeles is talking about doing is building 261 00:16:39,720 --> 00:16:44,800 Speaker 2: a repository of sorts, perhaps a digital repository, to hold photographs, 262 00:16:44,880 --> 00:16:49,960 Speaker 2: to hold recipes.
We're still talking with the broader community 263 00:16:50,000 --> 00:16:53,920 Speaker 2: about who would want to participate, who should hold and 264 00:16:54,000 --> 00:16:57,120 Speaker 2: steward it, what an ingest process would look like. And 265 00:16:57,160 --> 00:16:59,360 Speaker 2: then it starts to really feel like we're having the 266 00:16:59,400 --> 00:17:03,960 Speaker 2: conversation about a brick and mortar archives. Right, it shifts. 267 00:17:03,600 --> 00:17:07,120 Speaker 2: It doesn't stop being black memory work, but it takes 268 00:17:07,200 --> 00:17:10,679 Speaker 2: on sort of the patina of a traditional archival space. 269 00:17:12,119 --> 00:17:14,320 Speaker 1: So in your work you also talk about Black memory 270 00:17:14,320 --> 00:17:19,000 Speaker 1: work being restorative and healing. Can you talk about in what ways, like, 271 00:17:19,040 --> 00:17:20,960 Speaker 1: what does that look like for black memory work to 272 00:17:21,000 --> 00:17:21,680 Speaker 1: actually heal?
273 00:17:22,440 --> 00:17:25,920 Speaker 2: There's a member of our collective who is very interested 274 00:17:25,960 --> 00:17:29,520 Speaker 2: in birth stories, and part of the, so when I 275 00:17:29,520 --> 00:17:33,200 Speaker 2: said storytelling, narrative is a really important part of Black 276 00:17:33,240 --> 00:17:36,520 Speaker 2: memory work because it is in part about telling our stories, 277 00:17:36,840 --> 00:17:39,480 Speaker 2: preserving those stories for the future and for our children, 278 00:17:39,520 --> 00:17:47,000 Speaker 2: our grandchildren, our descendants, our communities more broadly. And so Dominique, 279 00:17:47,000 --> 00:17:49,119 Speaker 2: I don't think she would mind me naming her, is 280 00:17:49,200 --> 00:17:53,720 Speaker 2: doing really beautiful work around talking with people who have 281 00:17:53,880 --> 00:18:00,000 Speaker 2: difficult birth stories and recording those stories for that person 282 00:18:00,440 --> 00:18:04,360 Speaker 2: so that they can have something to hold. Right? So 283 00:18:04,400 --> 00:18:07,399 Speaker 2: in some instances, they may not have a child to hold, 284 00:18:07,840 --> 00:18:10,240 Speaker 2: but they now have a story that they can hold 285 00:18:10,280 --> 00:18:13,439 Speaker 2: that has been told to someone who is holding it 286 00:18:13,800 --> 00:18:16,480 Speaker 2: in care and in community with them. And I think 287 00:18:16,480 --> 00:18:20,960 Speaker 2: there's something very healing and restorative about that kind of practice. 288 00:18:21,560 --> 00:18:24,880 Speaker 2: And we think about all of that as Black Memory work. 289 00:18:26,040 --> 00:18:27,720 Speaker 1: You know, Tanya, you started off by giving some 290 00:18:27,840 --> 00:18:30,639 Speaker 1: very powerful examples of like things in your personal experience 291 00:18:30,720 --> 00:18:33,600 Speaker 1: that have shaped your interest in doing this work.
So 292 00:18:33,640 --> 00:18:35,440 Speaker 1: you talked about the Save the Children campaign as 293 00:18:35,440 --> 00:18:38,560 Speaker 1: well as Hurricane Katrina, And I'm wondering, you know, you're 294 00:18:38,600 --> 00:18:41,520 Speaker 1: talking about that as like as told to us, right 295 00:18:41,560 --> 00:18:44,080 Speaker 1: like these messages and stories that we saw, But what 296 00:18:44,119 --> 00:18:46,439 Speaker 1: you're talking about with Black Memory work is us owning 297 00:18:46,480 --> 00:18:49,159 Speaker 1: this process. I wonder what does it look like for 298 00:18:49,320 --> 00:18:52,679 Speaker 1: something like Hurricane Katrina, or even I'm thinking about the pandemic, 299 00:18:52,760 --> 00:18:55,720 Speaker 1: right like, how traumatic that was for so many of us, 300 00:18:55,760 --> 00:18:58,000 Speaker 1: and how many of us lost loved ones and lost 301 00:18:58,040 --> 00:19:00,880 Speaker 1: experiences and all kinds of things. What does it look 302 00:19:01,000 --> 00:19:03,399 Speaker 1: like in Black Memory work to hold space for both 303 00:19:03,760 --> 00:19:07,280 Speaker 1: the joy of black life, but also the more painful 304 00:19:07,800 --> 00:19:08,520 Speaker 1: parts of it. 305 00:19:09,160 --> 00:19:11,560 Speaker 2: That's something that I think about a lot.
I taught 306 00:19:11,600 --> 00:19:16,080 Speaker 2: a course in the fall of twenty twenty four, yes, 307 00:19:16,080 --> 00:19:18,880 Speaker 2: in the fall of twenty twenty four, on black memory work, 308 00:19:19,359 --> 00:19:23,399 Speaker 2: and there's not much written about it in the professional literature, 309 00:19:23,560 --> 00:19:27,480 Speaker 2: certainly not in information studies, and so I'm casting about 310 00:19:27,520 --> 00:19:30,520 Speaker 2: a little bit looking for materials for my students to read, 311 00:19:31,119 --> 00:19:35,520 Speaker 2: and about halfway through the quarter they came to me 312 00:19:35,560 --> 00:19:38,880 Speaker 2: and they said, Professor Sutherland, this is beautiful and we're 313 00:19:38,920 --> 00:19:41,359 Speaker 2: really enjoying this class. But if we had one note 314 00:19:41,400 --> 00:19:45,199 Speaker 2: for you: more joy. Please, more joy. Because it was 315 00:19:45,240 --> 00:19:50,520 Speaker 2: a lot easier to find representations or examples of how 316 00:19:50,560 --> 00:19:54,040 Speaker 2: we are actually holding space for one another in tragedy 317 00:19:54,119 --> 00:19:57,520 Speaker 2: and in trauma than it was to find things written 318 00:19:57,520 --> 00:20:00,119 Speaker 2: about how we are holding one another in joy. So 319 00:20:00,160 --> 00:20:04,240 Speaker 2: I'm really glad that you asked that question, because with 320 00:20:04,440 --> 00:20:10,280 Speaker 2: COVID, so much loss, there are a lot of community based, 321 00:20:10,680 --> 00:20:17,720 Speaker 2: grassroots level projects to document people's experiences with COVID. I 322 00:20:17,760 --> 00:20:20,600 Speaker 2: can't say the same thing was true with Katrina in 323 00:20:20,680 --> 00:20:24,280 Speaker 2: the same way, but it certainly is true now with COVID.
324 00:20:24,280 --> 00:20:27,679 Speaker 2: There are oral history projects, all kinds of things that 325 00:20:27,720 --> 00:20:31,919 Speaker 2: are sort of attuned to that kind of pain. I 326 00:20:31,960 --> 00:20:34,639 Speaker 2: think where we are maybe not doing as good a 327 00:20:34,720 --> 00:20:38,800 Speaker 2: job is in finding those moments of celebration and lifting 328 00:20:38,800 --> 00:20:41,800 Speaker 2: those up. Part of black memory work is teaching the 329 00:20:41,880 --> 00:20:44,639 Speaker 2: kids how to play spades, right, Like, we got to 330 00:20:44,680 --> 00:20:47,159 Speaker 2: know how to play dominoes. If we don't know, then 331 00:20:47,240 --> 00:20:50,680 Speaker 2: we're only bringing the bad stuff with us through the generations. 332 00:20:51,119 --> 00:20:54,280 Speaker 2: So I know y'all want to gatekeep, but teach 333 00:20:54,320 --> 00:20:57,200 Speaker 2: the kids, teach them babies how to play spades and dominoes. 334 00:20:57,320 --> 00:20:59,879 Speaker 1: Okay, So what I hear you saying, Tanya, is that 335 00:20:59,920 --> 00:21:02,200 Speaker 1: I got to break out my hula hoop and teach 336 00:21:02,240 --> 00:21:05,040 Speaker 1: the babies how to do the hula hoop. 337 00:21:05,880 --> 00:21:11,120 Speaker 2: We need double dutch. Yeah, I want to see them waiting. Okay, 338 00:21:11,840 --> 00:21:15,400 Speaker 2: you want to see, that's right? Yeah. You know, many 339 00:21:15,480 --> 00:21:18,600 Speaker 2: years ago, there was a project in Trinidad that was 340 00:21:19,400 --> 00:21:21,720 Speaker 2: now I'm not going to remember exactly what it was called, 341 00:21:21,760 --> 00:21:23,840 Speaker 2: but it was something. It was a cultural memory project, 342 00:21:23,840 --> 00:21:27,120 Speaker 2: and the idea was sort of this living museum. 343 00:21:28,080 --> 00:21:31,840 Speaker 2: What a beautiful notion.
They had a storyteller who was 344 00:21:31,840 --> 00:21:34,920 Speaker 2: there to tell the stories of Afro Trinidadians and the Caribbean, 345 00:21:35,520 --> 00:21:39,960 Speaker 2: there was hopscotch and they would teach anybody who didn't 346 00:21:40,000 --> 00:21:42,480 Speaker 2: know how to play how to play. And I thought, 347 00:21:43,000 --> 00:21:45,240 Speaker 2: right, this, we need more of this, We need more 348 00:21:45,320 --> 00:21:48,960 Speaker 2: living museums, so to speak, more of that energy anyway, 349 00:21:49,600 --> 00:21:51,760 Speaker 2: if not that formulation. 350 00:21:51,520 --> 00:21:54,040 Speaker 1: Yeah, yeah, I love that. I love it. So I 351 00:21:54,080 --> 00:21:56,800 Speaker 1: think lots of people who are enjoying our conversation, will 352 00:21:56,960 --> 00:21:59,000 Speaker 1: you know hear this and like feel like it's beautiful 353 00:21:59,000 --> 00:22:01,560 Speaker 1: and be very inspired and then feel like, oh my gosh, 354 00:22:01,560 --> 00:22:03,240 Speaker 1: where do I start? Right? Like what does it look 355 00:22:03,280 --> 00:22:06,280 Speaker 1: like for me to have my own personal family kind 356 00:22:06,320 --> 00:22:08,639 Speaker 1: of black memory work? So what would you suggest to 357 00:22:08,680 --> 00:22:10,160 Speaker 1: them for how to get started? 358 00:22:11,280 --> 00:22:15,840 Speaker 2: I would say, look to your people, right, what are 359 00:22:15,880 --> 00:22:18,520 Speaker 2: you collecting? Because we're all collecting something, even if we're 360 00:22:18,520 --> 00:22:20,800 Speaker 2: not really aware of it. So stop and take a 361 00:22:20,840 --> 00:22:23,480 Speaker 2: beat and ask, what is it you collect? Is it shoes? Honey, 362 00:22:23,480 --> 00:22:26,480 Speaker 2: I'm not mad at you. Just ask yourself what is it? Right? 363 00:22:26,880 --> 00:22:31,320 Speaker 2: Because the shoes are important. All God's children need traveling shoes.
Right, 364 00:22:31,880 --> 00:22:34,480 Speaker 2: ask yourself what you're collecting, and then look and see, 365 00:22:34,600 --> 00:22:37,640 Speaker 2: talk to your people, what are they collecting. My mother 366 00:22:38,280 --> 00:22:44,240 Speaker 2: is a tremendously hard working person and a really brilliant cook. 367 00:22:45,000 --> 00:22:47,879 Speaker 2: I don't know that she would tell you that she 368 00:22:48,040 --> 00:22:51,200 Speaker 2: is a collector of recipes or that she is the 369 00:22:51,320 --> 00:22:56,119 Speaker 2: keeper of our family culinary traditions. I don't know that 370 00:22:56,160 --> 00:22:58,560 Speaker 2: she would see that in herself, but I see it 371 00:22:58,800 --> 00:23:05,360 Speaker 2: very clearly, something I want to make sure gets preserved, respected, held. Right, 372 00:23:05,560 --> 00:23:08,200 Speaker 2: those are the kinds of things I would practice. My dad, 373 00:23:08,560 --> 00:23:11,479 Speaker 2: for example, is a lover of music and has a 374 00:23:11,520 --> 00:23:15,600 Speaker 2: tremendous record and other music collection. I don't think if 375 00:23:15,600 --> 00:23:18,560 Speaker 2: you asked my dad you know what he collects, that 376 00:23:18,600 --> 00:23:21,479 Speaker 2: he would say music. And he probably wouldn't even consider 377 00:23:21,560 --> 00:23:26,840 Speaker 2: himself a music aficionado. He just likes music. But those 378 00:23:26,840 --> 00:23:31,119 Speaker 2: are the family stories, right. He comes from a line 379 00:23:31,160 --> 00:23:34,639 Speaker 2: of musicians, and he's probably not thinking about the history 380 00:23:34,680 --> 00:23:37,880 Speaker 2: of his own family as being important to his love 381 00:23:37,920 --> 00:23:41,480 Speaker 2: of music, right. I think we start there.
We really 382 00:23:41,520 --> 00:23:46,879 Speaker 2: start by looking inward and looking outward and having conversations 383 00:23:46,960 --> 00:23:48,840 Speaker 2: with the people closest to us, and then we go 384 00:23:48,920 --> 00:23:51,879 Speaker 2: a little bit further out. I talk to my cousins, I 385 00:23:51,920 --> 00:23:56,600 Speaker 2: talk to my aunties, and then at the same time, right, 386 00:23:56,760 --> 00:23:59,040 Speaker 2: once you feel like you have the capacity for it, 387 00:23:59,400 --> 00:24:02,280 Speaker 2: start talking to other people in your community, start talking 388 00:24:02,359 --> 00:24:05,400 Speaker 2: to people at church, or, you know, trying to think 389 00:24:05,440 --> 00:24:09,240 Speaker 2: of where else people gather these days, we've become so dispersed. 390 00:24:09,640 --> 00:24:13,359 Speaker 2: Ask people when you're playing cards, ask people wherever 391 00:24:13,359 --> 00:24:15,840 Speaker 2: it is, at the cookout. The cookout is actually the 392 00:24:15,840 --> 00:24:18,760 Speaker 2: perfect place to have this conversation, to start this conversation. 393 00:24:18,960 --> 00:24:21,520 Speaker 1: Yeah, as we are kind of moving quickly into the 394 00:24:21,520 --> 00:24:23,680 Speaker 1: holiday season, it feels like a good time to think 395 00:24:23,720 --> 00:24:26,399 Speaker 1: about, you know, for example, the things you've shared, right. Like, 396 00:24:26,600 --> 00:24:29,080 Speaker 1: would it be an opportunity maybe for you to record 397 00:24:29,119 --> 00:24:31,800 Speaker 1: your dad talking about a couple of his favorite albums, 398 00:24:31,880 --> 00:24:35,440 Speaker 1: or to record mom as she's preparing the holiday 399 00:24:35,520 --> 00:24:37,600 Speaker 1: dishes as a way of kind of getting started with 400 00:24:37,920 --> 00:24:39,240 Speaker 1: some of these conversations. 401 00:24:40,000 --> 00:24:43,760 Speaker 2: Absolutely, absolutely get your phone out.
Don't just record mom 402 00:24:43,800 --> 00:24:47,520 Speaker 2: doing what she's doing, but learn it with your hands too. Right, 403 00:24:47,720 --> 00:24:51,359 Speaker 2: there's something really important there. An important aspect of black memory 404 00:24:51,400 --> 00:24:56,520 Speaker 2: work is the cultural transmission. And cultural transmission is fancy 405 00:24:56,560 --> 00:24:59,880 Speaker 2: speak for handed down from generation to generation, and by 406 00:25:00,080 --> 00:25:02,880 Speaker 2: that we mean like physically it's a practice that has 407 00:25:02,920 --> 00:25:07,879 Speaker 2: been transmitted from one generation to the next. So I 408 00:25:08,200 --> 00:25:10,639 Speaker 2: learned how to roll out a roti dough from my 409 00:25:10,680 --> 00:25:14,040 Speaker 2: auntie in Trinidad. I now have that memory in my muscle. 410 00:25:14,080 --> 00:25:15,800 Speaker 2: I have muscle memory of how to do that. I 411 00:25:15,800 --> 00:25:17,920 Speaker 2: can teach my son how to roll out a roti. 412 00:25:18,160 --> 00:25:21,480 Speaker 2: So if I had just had a video of Auntie 413 00:25:21,560 --> 00:25:24,480 Speaker 2: doing it, Auntie Irene doing it, maybe maybe I 414 00:25:24,520 --> 00:25:28,320 Speaker 2: could still teach my son. Maybe. But having that muscle 415 00:25:28,359 --> 00:25:30,520 Speaker 2: memory is really important, which is why I'm saying I 416 00:25:30,560 --> 00:25:32,359 Speaker 2: want to see the kids with their hands in there 417 00:25:32,720 --> 00:25:36,199 Speaker 2: waiting to jump in for double dutch. Right, the recordings, 418 00:25:36,760 --> 00:25:40,160 Speaker 2: we can take all of the preservation practices to heart 419 00:25:40,200 --> 00:25:47,640 Speaker 2: and utilize them, and still digital things vanish. Sometimes those things 420 00:25:47,680 --> 00:25:50,720 Speaker 2: get lost. It's a lot harder to lose something if 421 00:25:50,760 --> 00:25:51,879 Speaker 2: you know it in your body.
422 00:25:53,600 --> 00:25:55,960 Speaker 1: So this feels like a perfect intersection of where we 423 00:25:56,040 --> 00:25:59,280 Speaker 1: started in talking about black memory work and then digital afterlife, right, 424 00:25:59,320 --> 00:26:01,880 Speaker 1: because what we're talking about is these videos that kind 425 00:26:01,880 --> 00:26:04,679 Speaker 1: of live beyond us. And I'm sure as we know 426 00:26:04,720 --> 00:26:07,520 Speaker 1: that there are all kinds of like ethical considerations around, 427 00:26:07,600 --> 00:26:10,960 Speaker 1: like digital archives and who has permission to share? 428 00:26:11,119 --> 00:26:13,679 Speaker 1: Can you talk about like some of the ethical considerations, 429 00:26:13,920 --> 00:26:16,320 Speaker 1: especially when we're thinking about digital afterlives. 430 00:26:16,880 --> 00:26:21,040 Speaker 2: Yes, I start from a place of concern and care 431 00:26:21,160 --> 00:26:24,320 Speaker 2: when it comes to any kind of digital recording. I 432 00:26:24,359 --> 00:26:28,439 Speaker 2: don't often allow myself to be recorded, in fact, because 433 00:26:28,440 --> 00:26:32,680 Speaker 2: if there is no digital video of me, then there 434 00:26:32,720 --> 00:26:36,320 Speaker 2: can't be an AI fake of me saying some stuff 435 00:26:36,359 --> 00:26:38,840 Speaker 2: I would never say, doing some stuff I would never do, 436 00:26:39,320 --> 00:26:42,040 Speaker 2: and then having that be one hundred years from now 437 00:26:42,480 --> 00:26:46,760 Speaker 2: what my descendants are looking back on and thinking, Oh, 438 00:26:46,760 --> 00:26:49,480 Speaker 2: that was Tanya. So that's what we don't want. That's 439 00:26:49,480 --> 00:26:52,600 Speaker 2: sort of the core of the ethical concern is that 440 00:26:53,200 --> 00:26:57,520 Speaker 2: our images and likenesses, which we already know are typically 441 00:26:57,560 --> 00:27:02,600 Speaker 2: taken in moments of trauma and despair.
Those are the 442 00:27:02,640 --> 00:27:07,840 Speaker 2: images that are then held, kept. That's the about-us 443 00:27:07,880 --> 00:27:12,199 Speaker 2: and sort of on-our-behalf approach, not the by-us and 444 00:27:12,400 --> 00:27:18,600 Speaker 2: for-us approach. And there are ethical concerns that one can 445 00:27:18,840 --> 00:27:22,760 Speaker 2: raise and think through in terms of access and sharing, 446 00:27:23,200 --> 00:27:25,040 Speaker 2: and I have a lot of thoughts about that. I 447 00:27:25,040 --> 00:27:29,359 Speaker 2: think we should be thinking about what is appropriate access, 448 00:27:29,960 --> 00:27:32,760 Speaker 2: but who determines what is appropriate, right? I think that 449 00:27:32,920 --> 00:27:36,600 Speaker 2: should be done at the community level, whoever the community 450 00:27:36,720 --> 00:27:40,760 Speaker 2: is that's being affected or influenced or included in whatever 451 00:27:40,800 --> 00:27:45,520 Speaker 2: the digital project is. Also, I'm thinking about Frederick Douglass. 452 00:27:45,840 --> 00:27:51,280 Speaker 2: Frederick Douglass was big on the photograph and the potential, 453 00:27:51,359 --> 00:27:54,640 Speaker 2: what it could do for us. He did not think 454 00:27:54,680 --> 00:27:58,440 Speaker 2: that this was a technology, that photography was a technology, 455 00:27:58,920 --> 00:28:04,160 Speaker 2: that should be used to convince white people that black folks 456 00:28:04,240 --> 00:28:07,359 Speaker 2: had value. He didn't think the photograph should be necessary, 457 00:28:07,359 --> 00:28:10,720 Speaker 2: didn't think that any kind of technology could in fact demonstrate the 458 00:28:10,800 --> 00:28:13,560 Speaker 2: value of a black life to somebody who didn't want 459 00:28:13,600 --> 00:28:18,840 Speaker 2: to see it.
So imagine in the year twenty twenty two, 460 00:28:19,119 --> 00:28:22,280 Speaker 2: twenty twenty three, somewhere in there, when I got on 461 00:28:22,359 --> 00:28:26,680 Speaker 2: Beyonce's Internet and I saw a photograph of Frederick Douglass 462 00:28:26,760 --> 00:28:31,159 Speaker 2: that had been reanimated through Ancestry or one of those companies, 463 00:28:31,400 --> 00:28:34,359 Speaker 2: and he's shaking his head back and forth, back and forth, 464 00:28:34,359 --> 00:28:38,120 Speaker 2: and I thought, everything I know about this man from 465 00:28:38,160 --> 00:28:41,600 Speaker 2: the words that came out of his mouth, he would 466 00:28:41,640 --> 00:28:44,000 Speaker 2: hate this and would not stand behind it at all. 467 00:28:44,200 --> 00:28:46,920 Speaker 2: And for me, that's where the ethical issues start. If 468 00:28:46,920 --> 00:28:49,360 Speaker 2: we have no agency and no ability to say what 469 00:28:49,520 --> 00:28:52,360 Speaker 2: is going to be done with those materials, and we 470 00:28:52,520 --> 00:28:56,680 Speaker 2: know that there is a long history of using our 471 00:28:56,760 --> 00:29:01,440 Speaker 2: images in nefarious ways that end up harming us, doing 472 00:29:01,560 --> 00:29:05,640 Speaker 2: further harm, then that's, for me, the first ethical flag. 473 00:29:05,800 --> 00:29:07,640 Speaker 2: That's where we have to start. We have to start 474 00:29:07,640 --> 00:29:10,520 Speaker 2: with people having a certain amount of agency and being 475 00:29:10,560 --> 00:29:12,360 Speaker 2: able to say yes I want to do that or 476 00:29:12,360 --> 00:29:16,040 Speaker 2: no I don't. It's got to be a consent-based model, absolutely. 477 00:29:16,120 --> 00:29:18,280 Speaker 2: That's the first thing, I think. And then I think 478 00:29:18,320 --> 00:29:19,960 Speaker 2: we have to be really careful.
You know, I said 479 00:29:20,000 --> 00:29:23,040 Speaker 2: the thing about gatekeeping earlier in kind of a joking way, 480 00:29:23,120 --> 00:29:26,680 Speaker 2: but there are real needs for gatekeeping, and it's because 481 00:29:27,000 --> 00:29:32,680 Speaker 2: we are collecting materials for beautiful reasons and not nefarious ones, 482 00:29:32,720 --> 00:29:35,520 Speaker 2: and we don't want to open ourselves up to further attack. 483 00:29:36,400 --> 00:29:49,720 Speaker 1: More from our conversation after the break. I guess that's 484 00:29:49,760 --> 00:29:51,680 Speaker 1: my next question, because you've already 485 00:29:51,680 --> 00:29:54,000 Speaker 1: talked about like how dispersed we all are, right. Like, 486 00:29:54,320 --> 00:29:57,360 Speaker 1: you know, the cookout largely doesn't exist all the time, 487 00:29:57,480 --> 00:30:00,640 Speaker 1: maybe as it did kind of historically. And so many 488 00:30:00,680 --> 00:30:03,400 Speaker 1: of the ways that we are creating community and talking 489 00:30:03,400 --> 00:30:05,960 Speaker 1: with one another about these kinds of things is online. 490 00:30:06,480 --> 00:30:10,280 Speaker 1: And so what does it look like to gatekeep when 491 00:30:10,560 --> 00:30:13,200 Speaker 1: so many of these conversations and ways that we are 492 00:30:13,280 --> 00:30:16,160 Speaker 1: kind of practicing black memory work kind of happen in public? 493 00:30:17,200 --> 00:30:21,880 Speaker 2: Yeah, it's a really really important point. And the thing 494 00:30:21,920 --> 00:30:24,960 Speaker 2: that immediately occurred to me
As you 495 00:30:25,000 --> 00:30:27,960 Speaker 2: were speaking, is all of the times that I have 496 00:30:28,040 --> 00:30:31,960 Speaker 2: seen a meme pop up on the Internet and then 497 00:30:32,040 --> 00:30:36,120 Speaker 2: a whole bunch of people, non-melanated people, rushing in 498 00:30:36,160 --> 00:30:38,160 Speaker 2: to be like, explain it to me, explain it to 499 00:30:38,200 --> 00:30:40,280 Speaker 2: me, I don't get it. Like, I don't get it. I 500 00:30:40,320 --> 00:30:43,440 Speaker 2: have seen gatekeeping happening there in a way that I'm like, 501 00:30:43,520 --> 00:30:46,280 Speaker 2: that's right, you know exactly what to do. You understood 502 00:30:46,320 --> 00:30:48,560 Speaker 2: the assignment. Your people are like, I'm not going to 503 00:30:48,640 --> 00:30:51,719 Speaker 2: tell you, that's actually not your business, it's 504 00:30:51,880 --> 00:30:55,640 Speaker 2: us business. And I have been very impressed actually with 505 00:30:55,720 --> 00:31:01,360 Speaker 2: our ability to find ourselves and commune online. Black Twitter, 506 00:31:02,120 --> 00:31:07,880 Speaker 2: while it was a thing, was open. 507 00:31:08,080 --> 00:31:11,440 Speaker 2: It's not like you couldn't follow some accounts and find 508 00:31:11,480 --> 00:31:14,600 Speaker 2: your way into black Twitter. But if you didn't have 509 00:31:14,680 --> 00:31:17,200 Speaker 2: a guide of some kind, if you didn't know where 510 00:31:17,240 --> 00:31:20,440 Speaker 2: to start, black Twitter might as well have been a 511 00:31:20,640 --> 00:31:24,680 Speaker 2: locked room in a castle somewhere. Like, you could not 512 00:31:24,880 --> 00:31:28,560 Speaker 2: find us, queens. We weren't available like that. And even 513 00:31:28,600 --> 00:31:30,720 Speaker 2: though we were right there saying the stuff that we 514 00:31:30,720 --> 00:31:34,560 Speaker 2: were saying in public on a very open public platform.
515 00:31:35,040 --> 00:31:38,280 Speaker 2: So I think that we've always found ways. We're very 516 00:31:38,360 --> 00:31:42,760 Speaker 2: creative folks. And it's one of the things that's actually 517 00:31:42,800 --> 00:31:46,800 Speaker 2: impressed me most about studying the way that black people 518 00:31:46,840 --> 00:31:52,800 Speaker 2: kind of move on the internet, is that somehow the 519 00:31:52,840 --> 00:31:55,280 Speaker 2: way that we are in real life, of course, gets 520 00:31:55,360 --> 00:31:57,320 Speaker 2: echoed and we understand the assignment. 521 00:31:57,520 --> 00:32:00,719 Speaker 1: Yeah, yeah, you know, you bringing up Black Twitter just 522 00:32:00,880 --> 00:32:03,360 Speaker 1: unlocked all of these memories for me, because you're right, 523 00:32:03,480 --> 00:32:06,120 Speaker 1: it was not like a place to go. You did 524 00:32:06,160 --> 00:32:07,720 Speaker 1: have to know kind of like who you needed to 525 00:32:07,760 --> 00:32:10,480 Speaker 1: follow and like the stories and like the things that 526 00:32:10,520 --> 00:32:13,200 Speaker 1: we would revisit year after year.
Like it was very 527 00:32:13,280 --> 00:32:15,200 Speaker 1: much that if you know, you know kind of thing. 528 00:32:15,840 --> 00:32:20,360 Speaker 1: And thinking about, you know, I think how difficult it 529 00:32:20,640 --> 00:32:23,800 Speaker 1: was for people when ownership of Twitter changed, right, and 530 00:32:23,880 --> 00:32:25,840 Speaker 1: so then it very much felt like not a safe 531 00:32:25,880 --> 00:32:27,840 Speaker 1: place, or as safe as it could be, for black 532 00:32:27,840 --> 00:32:30,360 Speaker 1: folks to kind of congregate, and you know, just a rise 533 00:32:30,400 --> 00:32:33,160 Speaker 1: in bots and trolling. And so it very much feels 534 00:32:33,200 --> 00:32:36,160 Speaker 1: like that was a place where there were so many, 535 00:32:36,280 --> 00:32:39,520 Speaker 1: I think, rich cultural conversations about like just what it 536 00:32:39,560 --> 00:32:42,959 Speaker 1: means to be black right now, and now that doesn't 537 00:32:43,000 --> 00:32:46,600 Speaker 1: exist in the same way. And so, you know, like, 538 00:32:46,680 --> 00:32:49,440 Speaker 1: can you help me talk through like what does it 539 00:32:49,520 --> 00:32:51,560 Speaker 1: mean for us to gather in a place, like a 540 00:32:51,600 --> 00:32:54,280 Speaker 1: black Twitter, that is not owned by us, right, that 541 00:32:54,360 --> 00:32:57,080 Speaker 1: we have no control over and like became so meaningful 542 00:32:57,120 --> 00:32:58,960 Speaker 1: for so many people, and then for it to kind 543 00:32:58,960 --> 00:33:01,440 Speaker 1: of be disbanded? What that means in terms of 544 00:33:01,520 --> 00:33:02,680 Speaker 1: like black memory
545 00:33:02,400 --> 00:33:06,640 Speaker 2: Work, what a lesson to have learned, right, that when 546 00:33:06,680 --> 00:33:09,840 Speaker 2: you don't control your own spaces, they can be taken 547 00:33:09,880 --> 00:33:15,040 Speaker 2: from you, no matter how much time, energy, effort, 548 00:33:15,560 --> 00:33:21,480 Speaker 2: labor has been invested into that space. And seeing what 549 00:33:21,840 --> 00:33:25,040 Speaker 2: has become of Twitter, it's not even a place that 550 00:33:25,080 --> 00:33:29,960 Speaker 2: you want to hang out or be. So I'm watching 551 00:33:30,080 --> 00:33:35,160 Speaker 2: the development of Black Sky with great interest, because I 552 00:33:35,200 --> 00:33:39,040 Speaker 2: think that a fair number of black Twitter users, at 553 00:33:39,160 --> 00:33:42,880 Speaker 2: least those in academic circles, migrated over to Blue Sky. 554 00:33:43,160 --> 00:33:47,000 Speaker 2: There's a Black Sky that has been very intentionally developed 555 00:33:47,280 --> 00:33:51,959 Speaker 2: to kind of combat that very thing. What happens if 556 00:33:51,960 --> 00:33:54,680 Speaker 2: Blue Sky goes down? We don't want to lose our space. 557 00:33:55,240 --> 00:33:59,120 Speaker 2: So I'm watching the development of Black Sky with great interest. 558 00:34:00,160 --> 00:34:04,960 Speaker 2: And noticing that we are finding ourselves, you know, we 559 00:34:05,040 --> 00:34:08,200 Speaker 2: are finding community in similar ways on Instagram and TikTok. 560 00:34:08,600 --> 00:34:12,799 Speaker 2: But Instagram, you know, much like Facebook and Twitter, 561 00:34:14,080 --> 00:34:16,600 Speaker 2: it's not what it once was, right. You can't have 562 00:34:16,680 --> 00:34:18,839 Speaker 2: the same kind of community that you might have once 563 00:34:18,840 --> 00:34:21,960 Speaker 2: been able to build. And so we're seeing this actually 564 00:34:22,000 --> 00:34:25,440 Speaker 2: happen over and over and over again.
And the lesson 565 00:34:25,640 --> 00:34:29,480 Speaker 2: is that we have to build it, right. We have 566 00:34:29,560 --> 00:34:31,719 Speaker 2: to build our own thing if we want to have 567 00:34:31,840 --> 00:34:38,040 Speaker 2: our own thing and have it remain. And the remaining 568 00:34:38,120 --> 00:34:42,120 Speaker 2: part has to be really intentional. It just really has 569 00:34:42,160 --> 00:34:44,640 Speaker 2: to be. We can't build a Black Sky, and this 570 00:34:44,760 --> 00:34:46,680 Speaker 2: is no shade to the Black Sky creators, I don't 571 00:34:46,680 --> 00:34:48,640 Speaker 2: know what their intentions are, but we can't build a 572 00:34:48,680 --> 00:34:53,040 Speaker 2: Black Sky and then ask everybody to 573 00:34:53,120 --> 00:34:56,440 Speaker 2: come and be part of that and then be like, oh, yeah, 574 00:34:56,480 --> 00:34:57,759 Speaker 2: actually it's a Fyre Fest. 575 00:34:57,920 --> 00:34:58,080 Speaker 1: Right. 576 00:34:58,120 --> 00:35:00,560 Speaker 2: We're not going to do that in our own communities, 577 00:35:00,560 --> 00:35:04,280 Speaker 2: to each other, because everybody else already has. Yeah. 578 00:35:04,760 --> 00:35:05,360 Speaker 1: Yeah. 579 00:35:05,440 --> 00:35:09,160 Speaker 2: So some amount of fortifying, self-fortifying, I think is 580 00:35:09,200 --> 00:35:13,960 Speaker 2: really important in digital spaces as well. So all of 581 00:35:14,000 --> 00:35:16,799 Speaker 2: the black creators, all of the black entrepreneurs, all of 582 00:35:16,840 --> 00:35:21,600 Speaker 2: the black tech folks out there, get yourselves together. Let's 583 00:35:21,600 --> 00:35:23,920 Speaker 2: be in conversation and make sure that we are building 584 00:35:23,960 --> 00:35:27,040 Speaker 2: things in a way that will allow them to last.
585 00:35:27,840 --> 00:35:31,399 Speaker 2: Now that being said, I also want to say that 586 00:35:31,480 --> 00:35:36,279 Speaker 2: not everything is made to last forever, and that there 587 00:35:36,360 --> 00:35:43,200 Speaker 2: is real value in allowing a process of forgetting. There's 588 00:35:43,280 --> 00:35:46,719 Speaker 2: value in that process of letting go, right. And so 589 00:35:47,880 --> 00:35:49,600 Speaker 2: I think we need to be really careful too, that 590 00:35:49,640 --> 00:35:51,440 Speaker 2: we're not trying to hold onto something just for the 591 00:35:51,480 --> 00:35:52,600 Speaker 2: sake of holding on to it. 592 00:35:53,080 --> 00:35:56,120 Speaker 1: Yeah, that feels important. So one of the 593 00:35:56,120 --> 00:35:58,439 Speaker 1: examples you used was like thinking, well, Okay, I don't 594 00:35:58,480 --> 00:36:01,359 Speaker 1: want myself to kind of be videoed, because twenty years 595 00:36:01,400 --> 00:36:03,400 Speaker 1: from now, I don't want some random AI video of 596 00:36:03,440 --> 00:36:06,120 Speaker 1: me doing a thing that I never did to show up. 597 00:36:06,520 --> 00:36:08,960 Speaker 1: What does it look like for us to have agency 598 00:36:09,000 --> 00:36:11,480 Speaker 1: in this process? Because it feels like this stuff is 599 00:36:11,520 --> 00:36:15,480 Speaker 1: like growing exponentially, like much quicker than like legislation can 600 00:36:15,600 --> 00:36:17,640 Speaker 1: keep up with it, and you know, all of the things. 601 00:36:17,640 --> 00:36:19,640 Speaker 1: Not that legislation is ever, you know, like the be-all 602 00:36:19,760 --> 00:36:22,239 Speaker 1: end-all for us anyway, but it very much 603 00:36:22,320 --> 00:36:25,360 Speaker 1: feels like there's very little regulation and we are not 604 00:36:25,440 --> 00:36:27,800 Speaker 1: keeping pace with like the way AI is growing.
605 00:36:28,239 --> 00:36:30,080 Speaker 1: So what kinds of things can we do to like 606 00:36:30,120 --> 00:36:32,440 Speaker 1: protect ourselves and to be thinking about like how to 607 00:36:32,480 --> 00:36:37,640 Speaker 1: retain our agency and consent as technology continues to, you know, expand? 608 00:36:38,120 --> 00:36:42,280 Speaker 2: Yeah, you're absolutely right that the law can't keep up. Certainly, 609 00:36:42,320 --> 00:36:46,680 Speaker 2: the legislation is nowhere near the pace of technological development, 610 00:36:47,320 --> 00:36:50,640 Speaker 2: and the folks who are working in that space have 611 00:36:50,840 --> 00:36:53,399 Speaker 2: been sounding alarm bells for a long hot minute. I'm 612 00:36:53,400 --> 00:36:58,760 Speaker 2: thinking of people like Alondra Nelson, Ruha Benjamin, Safiya Umoja Noble, myself, 613 00:36:58,840 --> 00:37:03,440 Speaker 2: Andre Brock, sounding bells and sounding bells. And I think that 614 00:37:04,080 --> 00:37:07,240 Speaker 2: what's challenging is that everybody wants to play, right. Everybody 615 00:37:07,280 --> 00:37:09,920 Speaker 2: wants to play with the technology. Everybody wants to know 616 00:37:09,960 --> 00:37:14,200 Speaker 2: how ChatGPT works. And I don't think 617 00:37:14,200 --> 00:37:18,719 Speaker 2: folks have enough information to know that 618 00:37:18,760 --> 00:37:23,279 Speaker 2: there are potential down-the-line effects or concerns. Right, if you 619 00:37:23,320 --> 00:37:26,560 Speaker 2: aren't a person who's watching Black Mirror, you see ChatGPT 620 00:37:26,600 --> 00:37:29,840 Speaker 2: as something that can just help you do X, 621 00:37:29,920 --> 00:37:32,520 Speaker 2: Y or Z, right. It can help you get a 622 00:37:32,600 --> 00:37:35,239 Speaker 2: recipe together. I'm wanting to trust it to do that, 623 00:37:35,680 --> 00:37:38,520 Speaker 2: but it can help you do all kinds of things.
Right, 624 00:37:38,600 --> 00:37:42,040 Speaker 2: it's a product, it's being sold. What's not being told 625 00:37:42,040 --> 00:37:45,280 Speaker 2: to people is all of the back-end data 626 00:37:45,280 --> 00:37:51,760 Speaker 2: collection. It doesn't make a large language model a responsible 627 00:37:51,920 --> 00:37:57,440 Speaker 2: tool for your health, your wellness, your anything, right, and 628 00:37:57,560 --> 00:38:01,960 Speaker 2: looking to these digital tools like that is, in the 629 00:38:02,040 --> 00:38:06,160 Speaker 2: long run, going to be deleterious. These are the alarm 630 00:38:06,200 --> 00:38:09,720 Speaker 2: bells that people keep trying to raise. So I would say, 631 00:38:10,320 --> 00:38:13,799 Speaker 2: I know it's really easy for me to say do 632 00:38:13,920 --> 00:38:16,759 Speaker 2: your homework. But when I say do your homework, I'm 633 00:38:16,800 --> 00:38:19,600 Speaker 2: not saying like, you need to read my book. Please 634 00:38:19,640 --> 00:38:21,719 Speaker 2: do read my book. I'm not saying that you need 635 00:38:21,840 --> 00:38:23,640 Speaker 2: to go out and read a bunch of academic books. 636 00:38:23,680 --> 00:38:25,040 Speaker 2: I'm not saying that you need to go read a 637 00:38:25,080 --> 00:38:28,120 Speaker 2: bunch of academic articles. I'm not even saying that. And 638 00:38:28,160 --> 00:38:29,799 Speaker 2: I would encourage you to try to read the terms 639 00:38:29,800 --> 00:38:31,440 Speaker 2: of service, but they write them in such a way 640 00:38:31,520 --> 00:38:33,880 Speaker 2: they don't want you to read them. Right. When I 641 00:38:33,920 --> 00:38:37,720 Speaker 2: say do your homework, talk to people again. Talk to people. 642 00:38:37,920 --> 00:38:41,080 Speaker 2: Talk to people about their experiences.
Ask the person that 643 00:38:41,120 --> 00:38:45,719 Speaker 2: you know that is closest to these technologies, or closest 644 00:38:45,800 --> 00:38:49,799 Speaker 2: to this as their life's work. Or do you know 645 00:38:49,880 --> 00:38:52,520 Speaker 2: someone who is in tech? Do you know someone who 646 00:38:52,560 --> 00:38:56,319 Speaker 2: knows a little thing or two about coding? Ask them, Hey, 647 00:38:56,360 --> 00:38:59,240 Speaker 2: what do you think about this technology? Do you think 648 00:38:59,400 --> 00:39:03,799 Speaker 2: I should be taking any steps to protect myself if 649 00:39:03,840 --> 00:39:07,919 Speaker 2: I'm using TikTok in particular? Because every technology is going 650 00:39:07,960 --> 00:39:12,080 Speaker 2: to be different and have different concerns, right, that come 651 00:39:12,120 --> 00:39:18,480 Speaker 2: with it. For me, it's not actually possible to craft 652 00:39:18,560 --> 00:39:20,840 Speaker 2: a life in which I'm never going to be recorded. 653 00:39:21,360 --> 00:39:26,759 Speaker 2: That doesn't mean that I'm not making more intentional offline choices, right. 654 00:39:27,200 --> 00:39:31,520 Speaker 2: And so there's balance there too, understanding that we are 655 00:39:31,560 --> 00:39:33,279 Speaker 2: not going to be able to keep up with the 656 00:39:33,320 --> 00:39:36,600 Speaker 2: pace of how the technology is developing. It's not going 657 00:39:36,640 --> 00:39:40,080 Speaker 2: to be legislated rapidly enough in a way that's going 658 00:39:40,160 --> 00:39:44,480 Speaker 2: to protect us. What steps can we take? Understand how 659 00:39:44,560 --> 00:39:51,279 Speaker 2: the technology works. Understand who owns the technology. Understand what 660 00:39:51,480 --> 00:39:57,040 Speaker 2: they are gaining by giving you something that looks free. 661 00:39:57,160 --> 00:40:01,759 Speaker 2: They are gathering your data, they're packaging it, they're reselling it.
662 00:40:03,520 --> 00:40:08,320 Speaker 2: You are the product, right. TikTok isn't the product. YouTube 663 00:40:08,400 --> 00:40:10,799 Speaker 2: isn't the product. It is a little bit, but you 664 00:40:10,840 --> 00:40:15,680 Speaker 2: are the product, right. Your data is extraordinarily valuable. So 665 00:40:15,920 --> 00:40:21,120 Speaker 2: perhaps the first question is how much of myself do 666 00:40:21,160 --> 00:40:23,680 Speaker 2: I want to give up? It's a hard one, but 667 00:40:23,760 --> 00:40:25,879 Speaker 2: it is a compromise, and it is one that you're 668 00:40:25,920 --> 00:40:30,000 Speaker 2: making every time you use these technologies. How much of 669 00:40:30,000 --> 00:40:32,520 Speaker 2: myself do I want to give up? How much information 670 00:40:32,560 --> 00:40:34,760 Speaker 2: do I want to share? How much of my data 671 00:40:34,760 --> 00:40:38,359 Speaker 2: do I want them to have? Right? And yes, if 672 00:40:38,360 --> 00:40:40,960 Speaker 2: you know somebody that works in any of these industries, 673 00:40:41,400 --> 00:40:44,560 Speaker 2: ask them, ask them. Is there something I should be 674 00:40:44,600 --> 00:40:45,439 Speaker 2: concerned about here? 675 00:40:45,880 --> 00:40:48,880 Speaker 1: I'm curious, Tanya, because you said you taught a class 676 00:40:48,920 --> 00:40:51,879 Speaker 1: I think last year, what kinds of conversations are young 677 00:40:51,920 --> 00:40:55,319 Speaker 1: people having about consent in like all of these, right, 678 00:40:55,360 --> 00:40:57,080 Speaker 1: because I think you know, you'd often think about young 679 00:40:57,080 --> 00:40:58,960 Speaker 1: people as being very eager. 
That is, eager to be 680 00:40:59,080 --> 00:41:02,239 Speaker 1: on top of technological advances, but I would imagine there's 681 00:41:02,280 --> 00:41:04,759 Speaker 1: also some pushback and thinking through like, Okay, what do 682 00:41:04,800 --> 00:41:07,359 Speaker 1: I want this to mean in twenty years from now? 683 00:41:07,600 --> 00:41:10,160 Speaker 1: So I'm just curious, like, what kinds of insights they shared, maybe as a part of the class? 684 00:41:10,000 --> 00:41:12,719 Speaker 2: Yeah, that's 685 00:41:12,760 --> 00:41:15,880 Speaker 2: a great question. I teach an undergraduate class called Information and 686 00:41:15,960 --> 00:41:18,680 Speaker 2: Power where we really go into a lot of these questions. 687 00:41:18,719 --> 00:41:21,759 Speaker 2: And then I taught that Black memory work class. And 688 00:41:22,239 --> 00:41:25,439 Speaker 2: it's interesting because they kind of sit at opposite ends 689 00:41:25,560 --> 00:41:31,400 Speaker 2: of an intellectual spectrum, right, again, where the Information and 690 00:41:31,480 --> 00:41:36,319 Speaker 2: Power content is more like the digital afterlife conversation that 691 00:41:36,360 --> 00:41:39,440 Speaker 2: we're having, where we talk about things like consent, we 692 00:41:39,480 --> 00:41:45,760 Speaker 2: talk about data collection, we talk about surveillance, and every time, 693 00:41:45,880 --> 00:41:51,480 Speaker 2: without fail, these students are learning something, and that's not 694 00:41:51,640 --> 00:41:53,840 Speaker 2: just a credit to me as a professor. I'm not 695 00:41:53,920 --> 00:41:56,719 Speaker 2: patting myself on the back. What I'm saying is that 696 00:41:56,840 --> 00:42:00,520 Speaker 2: they didn't know. They didn't know what to be 697 00:42:00,680 --> 00:42:04,239 Speaker 2: worried about or concerned about, and the class gives them 698 00:42:04,239 --> 00:42:08,120 Speaker 2: an opportunity to think about how they want to engage.
699 00:42:08,440 --> 00:42:13,200 Speaker 2: And I always assign in this Information and Power class 700 00:42:13,280 --> 00:42:17,160 Speaker 2: a forty eight hour offline assignment where they have to 701 00:42:17,200 --> 00:42:19,840 Speaker 2: go for forty eight hours and they're not allowed to 702 00:42:19,960 --> 00:42:24,320 Speaker 2: use their student ID. They're not allowed to use their phones, 703 00:42:24,440 --> 00:42:27,960 Speaker 2: no digital technologies. You can use electricity, listen to your vinyl, 704 00:42:27,960 --> 00:42:31,200 Speaker 2: put your records on. Okay, I love it. But what 705 00:42:31,239 --> 00:42:34,080 Speaker 2: you can't do is call an Uber or a Lyft. 706 00:42:35,200 --> 00:42:38,400 Speaker 2: You can't use your student ID to 707 00:42:38,400 --> 00:42:43,000 Speaker 2: get into your dorm or into the dining hall. And 708 00:42:43,560 --> 00:42:46,560 Speaker 2: it doesn't take more than forty eight hours for people 709 00:42:46,800 --> 00:42:53,719 Speaker 2: to realize how much of their lives is absolutely controlled 710 00:42:53,800 --> 00:42:58,960 Speaker 2: and determined by outside forces that are run by technology 711 00:42:59,000 --> 00:43:01,920 Speaker 2: companies, like, I couldn't do this. I couldn't do that. 712 00:43:02,040 --> 00:43:04,440 Speaker 2: I couldn't do this. I couldn't do that. I basically 713 00:43:04,480 --> 00:43:08,360 Speaker 2: had to sit at home for forty eight hours and 714 00:43:08,520 --> 00:43:12,640 Speaker 2: have my roommate bring me food. I just 715 00:43:12,719 --> 00:43:15,640 Speaker 2: didn't realize. I just didn't see it. And I think 716 00:43:15,680 --> 00:43:19,040 Speaker 2: once you see it, it's hard to unsee it.
717 00:43:19,080 --> 00:43:22,040 Speaker 1: Sounds like a powerful little example and a real 718 00:43:22,480 --> 00:43:24,799 Speaker 1: opportunity for many of us to practice, like what would 719 00:43:24,840 --> 00:43:26,719 Speaker 1: it even be like for twenty four hours to 720 00:43:26,840 --> 00:43:28,680 Speaker 1: not engage with any of your devices like that. 721 00:43:28,640 --> 00:43:33,680 Speaker 2: Yes, that's right, and to really observe, well, what 722 00:43:33,760 --> 00:43:36,279 Speaker 2: can I do and what can't I do? Right? Where 723 00:43:36,320 --> 00:43:38,560 Speaker 2: can I go? Can I get on the subway? Can 724 00:43:38,600 --> 00:43:41,000 Speaker 2: I take the bus? Do I have to have my 725 00:43:41,160 --> 00:43:42,840 Speaker 2: phone to pay my bills? 726 00:43:43,320 --> 00:43:43,720 Speaker 1: Yeah? 727 00:43:43,760 --> 00:43:47,200 Speaker 2: It's a lot of things that are touched by technologies 728 00:43:47,520 --> 00:43:52,919 Speaker 2: in ways that we can't opt out of, right? And so 729 00:43:53,120 --> 00:43:55,799 Speaker 2: I think what tends to happen is this is the 730 00:43:55,800 --> 00:43:59,480 Speaker 2: moment for people and everyone who's listening. I would encourage 731 00:43:59,480 --> 00:44:01,839 Speaker 2: you to take this moment for yourself and just say, 732 00:44:02,160 --> 00:44:04,440 Speaker 2: how much, how much, again? How much of myself do 733 00:44:04,520 --> 00:44:06,200 Speaker 2: I want to give up? How much of my time? 734 00:44:06,880 --> 00:44:07,080 Speaker 1: You know? 735 00:44:07,200 --> 00:44:12,279 Speaker 2: Understanding that I can't completely opt out. I can't say no, 736 00:44:12,840 --> 00:44:14,880 Speaker 2: I don't want to have to use my 737 00:44:15,000 --> 00:44:20,440 Speaker 2: phone ever. It's not really practical in our modern society.
738 00:44:20,600 --> 00:44:24,160 Speaker 2: But that doesn't mean that you have to be posting 739 00:44:24,160 --> 00:44:29,000 Speaker 2: every single thought that you have or a bazillion selfies, right, like, 740 00:44:29,560 --> 00:44:33,879 Speaker 2: maybe there's room in there for moderation, for thoughtfulness, for consideration, 741 00:44:34,280 --> 00:44:38,400 Speaker 2: for taking a bit of a step back and reevaluating. 742 00:44:39,120 --> 00:44:41,319 Speaker 1: It has been so wonderful to learn so much more 743 00:44:41,360 --> 00:44:43,360 Speaker 1: about your work, Tanya. I really appreciate you spending 744 00:44:43,360 --> 00:44:45,799 Speaker 1: some time with us today. Please let us know where 745 00:44:45,840 --> 00:44:47,880 Speaker 1: we can stay connected with you and learn more about 746 00:44:47,880 --> 00:44:49,680 Speaker 1: the work that you're doing. Do you have a website 747 00:44:49,719 --> 00:44:52,160 Speaker 1: as well as any social media handles you'd like to share? 748 00:44:52,440 --> 00:44:55,319 Speaker 2: Thank you, doctor Joy, Thank you so much. This has 749 00:44:55,360 --> 00:44:59,760 Speaker 2: been a delight. My website is Tanyasutherland dot com. 750 00:45:00,120 --> 00:45:02,920 Speaker 2: You can find links to my socials there, but you 751 00:45:02,960 --> 00:45:06,239 Speaker 2: can pretty much find me either at Tanya Sutherland or 752 00:45:06,320 --> 00:45:10,040 Speaker 2: at Tanya dot Sutherland depending on the platform. 753 00:45:10,040 --> 00:45:11,600 Speaker 1: And the name of the book and where can we 754 00:45:11,640 --> 00:45:12,040 Speaker 1: find it? 755 00:45:12,760 --> 00:45:15,040 Speaker 2: The name of the book is Resurrecting the Black Body: 756 00:45:15,239 --> 00:45:18,440 Speaker 2: Race and the Digital Afterlife, and you can find it 757 00:45:18,480 --> 00:45:21,160 Speaker 2: anywhere that books are sold.
You can order it directly 758 00:45:21,200 --> 00:45:24,000 Speaker 2: through the University of California Press website. You can get 759 00:45:24,040 --> 00:45:27,200 Speaker 2: it on Amazon, or please try your local bookshop. 760 00:45:27,640 --> 00:45:29,680 Speaker 1: Beautiful. We'll be sure to include all of that in 761 00:45:29,719 --> 00:45:30,879 Speaker 1: our show notes. Thank you so much. 762 00:45:30,920 --> 00:45:33,800 Speaker 2: Thank you so much, what a pleasure. 763 00:45:34,160 --> 00:45:42,279 Speaker 1: Absolutely thank you, Tanya. I'm so glad doctor Sutherland was 764 00:45:42,320 --> 00:45:45,239 Speaker 1: able to join us for today's conversation. To learn more 765 00:45:45,239 --> 00:45:47,440 Speaker 1: about her and her work, be sure to visit the 766 00:45:47,440 --> 00:45:50,320 Speaker 1: show notes at Therapy for Blackgirls dot com slash Session 767 00:45:50,360 --> 00:45:53,040 Speaker 1: four thirty eight, and don't forget to text two of 768 00:45:53,080 --> 00:45:55,000 Speaker 1: your girls right now and tell them to check out 769 00:45:55,000 --> 00:45:57,759 Speaker 1: the episode. Did you know that you could leave us 770 00:45:57,760 --> 00:46:00,799 Speaker 1: a voicemail with your questions or suggestions for the podcast? 771 00:46:01,280 --> 00:46:03,520 Speaker 1: If you have movies or books you'd like us to review, 772 00:46:04,040 --> 00:46:06,640 Speaker 1: drop us a message at Memo dot fm slash Therapy 773 00:46:06,680 --> 00:46:08,880 Speaker 1: for Black Girls and let us know what's on your mind. 774 00:46:09,200 --> 00:46:12,160 Speaker 1: We just might feature it on the podcast. If you're 775 00:46:12,200 --> 00:46:15,120 Speaker 1: looking for a therapist in your area, visit our therapist 776 00:46:15,160 --> 00:46:19,600 Speaker 1: directory at Therapy for Blackgirls dot com slash directory.
Don't 777 00:46:19,600 --> 00:46:22,000 Speaker 1: forget to follow us over on Instagram at Therapy for 778 00:46:22,080 --> 00:46:24,440 Speaker 1: Black Girls and come on over and join us in 779 00:46:24,480 --> 00:46:28,040 Speaker 1: our Patreon for exclusive updates, behind the scenes content, and 780 00:46:28,120 --> 00:46:31,080 Speaker 1: much more. You can join us at community dot therapy 781 00:46:31,120 --> 00:46:34,960 Speaker 1: for blackgirls dot com. This episode was produced by Aleise 782 00:46:35,040 --> 00:46:39,719 Speaker 1: Ellis Indichubu and Tyree Rush. Editing was done by Dennison Bradford. 783 00:46:40,239 --> 00:46:42,400 Speaker 1: Thank y'all so much for joining me again this week. 784 00:46:42,640 --> 00:46:45,319 Speaker 1: I look forward to continuing this conversation with you all 785 00:46:45,440 --> 00:46:47,359 Speaker 1: real soon. Take good care,