1 00:00:05,200 --> 00:00:07,240 Speaker 1: Hey, this is Annie and Samantha, and welcome to 2 00:00:07,280 --> 00:00:09,160 Speaker 1: Stuff Mom Never Told You, a production of iHeartRadio 3 00:00:18,640 --> 00:00:22,240 Speaker 1: and we're thrilled to once again be joined by Bridget Todd, 4 00:00:22,680 --> 00:00:27,840 Speaker 1: the wonderful, the marvelous, the magnificent, and the very busy. 5 00:00:28,000 --> 00:00:30,920 Speaker 2: We have missed you, Bridget. Yes. 6 00:00:30,640 --> 00:00:36,919 Speaker 3: I have missed you all too, my favorite authors. It's 7 00:00:36,960 --> 00:00:43,360 Speaker 3: nice to be in the company of some literary ladies. 8 00:00:43,920 --> 00:00:45,880 Speaker 4: That sounded like a foreign language, like who are you 9 00:00:45,920 --> 00:00:46,560 Speaker 4: talking about? 10 00:00:50,040 --> 00:00:58,960 Speaker 2: You just bring on guests. I'm here, yes, yes, are you? 11 00:00:59,120 --> 00:01:00,560 Speaker 2: Do you enjoy writing? 12 00:01:01,720 --> 00:01:03,400 Speaker 5: I used to enjoy it. 13 00:01:03,960 --> 00:01:06,360 Speaker 3: I was the kind of person who would labor over 14 00:01:07,000 --> 00:01:10,440 Speaker 3: every sentence, every word choice, and so writing a paragraph 15 00:01:10,480 --> 00:01:13,360 Speaker 3: would take me forever. It's one of the reasons why 16 00:01:13,400 --> 00:01:15,960 Speaker 3: I'm a podcaster, because I feel it's so much easier 17 00:01:16,600 --> 00:01:19,400 Speaker 3: to say it verbally. Once I started podcasting, I was 18 00:01:19,520 --> 00:01:23,400 Speaker 3: like, writing? Who? I don't know her. Say it verbally. 19 00:01:23,920 --> 00:01:26,240 Speaker 3: But I used to enjoy it. Like I used to 20 00:01:26,400 --> 00:01:32,080 Speaker 3: like send pitches to freelance editors and stuff, but yeah, 21 00:01:32,280 --> 00:01:33,199 Speaker 3: much less lately. 22 00:01:33,440 --> 00:01:33,880 Speaker 5: How about you? 
23 00:01:33,880 --> 00:01:36,600 Speaker 3: What is your relationship to writing, the both of you, 24 00:01:36,680 --> 00:01:39,480 Speaker 3: now that you're going to be published authors? 25 00:01:41,840 --> 00:01:46,399 Speaker 1: I love it. We've talked about it on the show. 26 00:01:46,440 --> 00:01:48,960 Speaker 1: Samantha and I have both done NaNoWriMo, which I 27 00:01:48,960 --> 00:01:51,760 Speaker 1: know makes a lot of like editors roll their eyes, 28 00:01:51,760 --> 00:01:56,200 Speaker 1: but done that, and I have recently, during the pandemic, 29 00:01:56,240 --> 00:02:01,280 Speaker 1: written a ton of fan fiction, like an actual shocking 30 00:02:01,280 --> 00:02:06,600 Speaker 1: amount perhaps, which I've started publishing to, I'll say, rave reviews. 31 00:02:08,919 --> 00:02:11,200 Speaker 4: It's true, she has fan art. I do. 32 00:02:11,720 --> 00:02:12,480 Speaker 5: People made art. 33 00:02:12,560 --> 00:02:13,240 Speaker 2: I got it. 34 00:02:13,360 --> 00:02:15,239 Speaker 5: People like it, Yes. 35 00:02:15,120 --> 00:02:16,919 Speaker 2: People like it, So I do love it. It is 36 00:02:16,960 --> 00:02:17,600 Speaker 2: something I enjoy. 37 00:02:17,680 --> 00:02:20,440 Speaker 1: But I guess one of the things about NaNoWriMo 38 00:02:20,560 --> 00:02:24,360 Speaker 1: and fan fiction to a lesser extent is I was 39 00:02:24,440 --> 00:02:26,120 Speaker 1: kind of like you, Bridget, with the editing, and I 40 00:02:26,120 --> 00:02:28,120 Speaker 1: would get so in my head and like, this isn't perfect, 41 00:02:28,200 --> 00:02:31,400 Speaker 1: this isn't perfect. But NaNoWriMo is like, I don't care, 42 00:02:31,480 --> 00:02:33,840 Speaker 1: get it done, but it has to happen. And then 43 00:02:33,919 --> 00:02:35,560 Speaker 1: you go back and maybe you get stuck in the 44 00:02:35,639 --> 00:02:39,160 Speaker 1: editing again. 
But it was a real nice 45 00:02:40,080 --> 00:02:42,840 Speaker 1: thing for me to do, getting out of my head and 46 00:02:42,919 --> 00:02:43,679 Speaker 1: just writing it. 47 00:02:44,120 --> 00:02:46,560 Speaker 2: And now we're reading some of that stuff on Sminty. 48 00:02:47,560 --> 00:02:51,040 Speaker 1: And then with fan fiction, it's nice because it's like 49 00:02:51,200 --> 00:02:55,280 Speaker 1: literally people like use this as an argument against it, 50 00:02:55,320 --> 00:02:56,799 Speaker 1: and I'm like, no, this is an argument for it. 51 00:02:56,840 --> 00:02:59,320 Speaker 1: Like it's a hobby and it's fun. Like I'm not 52 00:02:59,360 --> 00:03:01,840 Speaker 1: worried about, like, will people enjoy it. 53 00:03:01,880 --> 00:03:02,600 Speaker 2: I like it. 54 00:03:02,600 --> 00:03:04,360 Speaker 1: It's not gonna go anywhere, and that's fine. But I 55 00:03:04,440 --> 00:03:07,880 Speaker 1: just like doing it and I like writing, and it's 56 00:03:07,919 --> 00:03:10,480 Speaker 1: definitely kind of therapeutic for me. 57 00:03:11,880 --> 00:03:15,040 Speaker 4: Right. I don't do a lot of writing anymore. My 58 00:03:15,800 --> 00:03:17,800 Speaker 4: form of writing has always been fiction. But I'm also 59 00:03:17,840 --> 00:03:21,440 Speaker 4: one of those who, it's the mood. The way my 60 00:03:21,600 --> 00:03:24,320 Speaker 4: mood fluctuates is how it goes. And the more depressive 61 00:03:24,360 --> 00:03:26,640 Speaker 4: I am, the more likely I'm going to write. And 62 00:03:26,800 --> 00:03:28,680 Speaker 4: the more depressive I am when I write, the more likely it's 63 00:03:28,680 --> 00:03:31,960 Speaker 4: gonna be sad. So I try to stay away from 64 00:03:31,960 --> 00:03:34,800 Speaker 4: that because I'm more fictional than anything else. I think, 65 00:03:35,360 --> 00:03:37,600 Speaker 4: for the most part, doing this type of work was 66 00:03:37,640 --> 00:03:40,200 Speaker 4: a whole different beast. 
Doing things like when we write 67 00:03:40,200 --> 00:03:43,080 Speaker 4: scripts and outlines and researching has been such a whole 68 00:03:43,080 --> 00:03:46,200 Speaker 4: different beast that it's kind of the only way I 69 00:03:46,240 --> 00:03:48,600 Speaker 4: do it now. I just come to it and space out. I'm like, yeah, 70 00:03:48,640 --> 00:03:53,000 Speaker 4: I'm done. I'm not touching a pencil, pen, or the 71 00:03:53,080 --> 00:03:57,520 Speaker 4: typewriter or a keyboard. Typewriter, I'm that old. That's how 72 00:03:57,520 --> 00:03:58,000 Speaker 4: long it's been. 73 00:03:58,080 --> 00:03:59,200 Speaker 5: Ah, the quill pen. 74 00:04:02,320 --> 00:04:04,920 Speaker 4: Look, guys, I love typewriters. I love them so much. 75 00:04:04,960 --> 00:04:07,880 Speaker 4: I wish I could have like several, because I like, 76 00:04:08,280 --> 00:04:10,080 Speaker 4: I would collect them, kind of like Tom Hanks does, 77 00:04:10,160 --> 00:04:11,000 Speaker 4: if I had the money. 78 00:04:11,640 --> 00:04:14,160 Speaker 5: I had one in college. I thought I was so hip. 79 00:04:14,320 --> 00:04:16,039 Speaker 5: I didn't use it. It was just 80 00:04:16,080 --> 00:04:18,680 Speaker 5: to like have and make me feel cool. 81 00:04:21,000 --> 00:04:23,960 Speaker 3: You know, Annie, to your point about being able 82 00:04:24,000 --> 00:04:27,320 Speaker 3: to have outlets to write where you're not 83 00:04:27,360 --> 00:04:29,839 Speaker 3: so in your head about it. Like, that's something that 84 00:04:29,880 --> 00:04:33,760 Speaker 3: I think is really important about, like, having a craft 85 00:04:33,839 --> 00:04:34,599 Speaker 3: that you're honing. 86 00:04:34,920 --> 00:04:37,240 Speaker 5: I have a Post-it note on my laptop that 87 00:04:37,240 --> 00:04:38,680 Speaker 5: I'm looking at right now that says. 
88 00:04:38,400 --> 00:04:42,599 Speaker 3: Do it. Because it's like, for me, so often the 89 00:04:42,720 --> 00:04:45,280 Speaker 3: need to do something perfect or to produce something really 90 00:04:45,279 --> 00:04:49,200 Speaker 3: good just keeps me from doing anything. And so I like, 91 00:04:49,800 --> 00:04:52,800 Speaker 3: I think that we need to have outlets where the 92 00:04:52,880 --> 00:04:55,640 Speaker 3: point is doing it. The point is like actually putting 93 00:04:55,839 --> 00:04:58,559 Speaker 3: pen to paper or finger to keyboard and not really 94 00:04:58,560 --> 00:04:59,400 Speaker 3: being in your head and. 95 00:04:59,320 --> 00:05:00,719 Speaker 5: worrying about, like, is it perfect? 96 00:05:01,200 --> 00:05:03,359 Speaker 3: And so I think, like, yeah, like things like 97 00:05:03,520 --> 00:05:06,440 Speaker 3: National Novel Writing Month, having things like that that 98 00:05:06,640 --> 00:05:08,400 Speaker 3: kind of get you in the practice of just like, 99 00:05:08,839 --> 00:05:10,960 Speaker 3: just focus on writing, execute, just. 100 00:05:10,880 --> 00:05:14,160 Speaker 5: write, don't worry about it, edit it later, can be 101 00:05:14,200 --> 00:05:15,280 Speaker 5: really really powerful. 102 00:05:15,640 --> 00:05:15,800 Speaker 6: Right. 103 00:05:15,960 --> 00:05:18,159 Speaker 4: They have a scriptwriting month too for people who do 104 00:05:18,240 --> 00:05:22,320 Speaker 4: need to. Speaking of which, Annie has also other works 105 00:05:22,400 --> 00:05:24,840 Speaker 4: that are not necessarily published, but they have been out, 106 00:05:24,880 --> 00:05:28,400 Speaker 4: because she's done the Twelve Days of Halloween and 107 00:05:28,839 --> 00:05:30,680 Speaker 4: all of that. So she's got work out there, she has. 108 00:05:30,720 --> 00:05:32,880 Speaker 4: I feel like she's not known, but she 109 00:05:32,920 --> 00:05:33,800 Speaker 4: definitely has credit. 
110 00:05:35,200 --> 00:05:36,600 Speaker 2: I was just waiting for you to. 111 00:05:36,520 --> 00:05:40,000 Speaker 4: Say it. I know, you need someone else to, I got you. 112 00:05:40,839 --> 00:05:44,120 Speaker 3: It's so funny how like it's so much easier to 113 00:05:44,320 --> 00:05:46,960 Speaker 3: like hype up a friend than it is to hype 114 00:05:47,040 --> 00:05:47,960 Speaker 3: up your own work. 115 00:05:49,440 --> 00:05:52,320 Speaker 1: Yeah, oh my god. It just feels, and I think 116 00:05:52,320 --> 00:05:54,320 Speaker 1: this is a lot of like conditioning we have, it 117 00:05:54,440 --> 00:05:57,520 Speaker 1: just feels like, as a woman, like, oh, I'm 118 00:05:57,600 --> 00:05:59,560 Speaker 1: drawing too much attention to myself, or oh, like what 119 00:05:59,600 --> 00:06:00,880 Speaker 1: if it's not that good? 120 00:06:01,320 --> 00:06:04,200 Speaker 4: Right, like it's not that good anyway, so you don't want 121 00:06:04,240 --> 00:06:04,920 Speaker 4: to talk about. 122 00:06:04,720 --> 00:06:08,520 Speaker 3: it, and no one wants 123 00:06:08,560 --> 00:06:10,360 Speaker 3: to seem like a bragger, but you should be bragging. 124 00:06:10,360 --> 00:06:12,760 Speaker 3: We should all be bragging. My friend Meredith Fineman wrote 125 00:06:12,800 --> 00:06:14,919 Speaker 3: a book called Brag Better that's all about the importance 126 00:06:14,920 --> 00:06:19,400 Speaker 3: of women bragging. Brag on yourself, everyone, Annie and Sam and 127 00:06:19,520 --> 00:06:21,120 Speaker 3: listeners. 128 00:06:21,200 --> 00:06:24,000 Speaker 1: Yes, yes. Well, speaking of, I know you've been up 129 00:06:24,040 --> 00:06:25,480 Speaker 1: to a lot of stuff, Bridget. Do you want to 130 00:06:25,520 --> 00:06:29,440 Speaker 1: give us any highlights of what you've been up to? 131 00:06:30,120 --> 00:06:30,599 Speaker 5: What's happened? 
132 00:06:30,600 --> 00:06:33,760 Speaker 3: Well, we have relaunched season four of my podcast There 133 00:06:33,760 --> 00:06:34,840 Speaker 3: Are No Girls on the Internet. 134 00:06:34,839 --> 00:06:37,080 Speaker 5: We launched it I think last week. 135 00:06:37,200 --> 00:06:39,400 Speaker 3: That's right. So it's been one of those, 136 00:06:39,400 --> 00:06:42,279 Speaker 3: speaking of like doing it crappy and just like getting 137 00:06:42,279 --> 00:06:42,599 Speaker 3: it out. 138 00:06:42,800 --> 00:06:44,359 Speaker 5: It was, I don't know. 139 00:06:44,480 --> 00:06:47,040 Speaker 3: I mean, I've been doing this podcast for years now, 140 00:06:47,040 --> 00:06:49,359 Speaker 3: but like got a little bit in my head and 141 00:06:49,480 --> 00:06:51,039 Speaker 3: was like paralyzed. 142 00:06:50,440 --> 00:06:52,360 Speaker 5: With an inability to just like do the thing. 143 00:06:53,400 --> 00:06:55,279 Speaker 3: Had some stuff to work through there, but you know, 144 00:06:55,360 --> 00:06:58,080 Speaker 3: it got done and people can listen to it. 145 00:06:58,200 --> 00:07:00,800 Speaker 3: It's ongoing now, you can check it out. So that 146 00:07:00,880 --> 00:07:05,320 Speaker 3: was really exciting. We wrapped another podcast with Next Chapter 147 00:07:05,440 --> 00:07:10,480 Speaker 3: Podcasts called Beef, all about historical beefs and rivalries, which 148 00:07:10,680 --> 00:07:13,560 Speaker 3: is a kind of side interest of mine. Listen to 149 00:07:13,600 --> 00:07:15,920 Speaker 3: all eight episodes. It was super fun to make. My 150 00:07:16,000 --> 00:07:16,560 Speaker 3: favorite was. 151 00:07:16,560 --> 00:07:20,280 Speaker 5: Probably Ann Landers and Dear Abby. Did you know they 152 00:07:20,320 --> 00:07:21,800 Speaker 5: were sisters? I didn't know that. 
153 00:07:22,880 --> 00:07:26,480 Speaker 3: Yeah, they were like sisters who had this intense rivalry 154 00:07:26,680 --> 00:07:28,160 Speaker 3: because they were both advice columnists. 155 00:07:28,400 --> 00:07:33,320 Speaker 5: Fascinating stuff. Definitely check it out. Yeah, exciting times. 156 00:07:34,200 --> 00:07:37,000 Speaker 4: That's interesting. I wonder what like family events were, because 157 00:07:37,000 --> 00:07:38,680 Speaker 4: you know, those were the times that you sit together 158 00:07:38,760 --> 00:07:41,160 Speaker 4: and kind of mull over whatever troubles you might have, 159 00:07:41,240 --> 00:07:43,080 Speaker 4: boyfriend troubles, and the both of them trying to give 160 00:07:43,120 --> 00:07:44,760 Speaker 4: you advice. How annoying would that be? 161 00:07:45,000 --> 00:07:50,240 Speaker 5: I know, can you imagine? My head. 162 00:07:50,240 --> 00:07:52,720 Speaker 4: I'd be like, okay, one was bad enough, but both 163 00:07:52,720 --> 00:07:54,280 Speaker 4: of you have to put in your two cents and 164 00:07:54,360 --> 00:07:57,000 Speaker 4: sound like your therapist. That's even worse. 165 00:07:57,440 --> 00:08:00,640 Speaker 5: God, you know, Christmas dinners. Love that family. 166 00:08:01,280 --> 00:08:04,040 Speaker 2: Geez. Was their advice, like, similar? 167 00:08:04,880 --> 00:08:08,480 Speaker 3: No, that was the thing, they had very distinctive advice styles. 168 00:08:08,760 --> 00:08:12,080 Speaker 3: One of them, her advice was like much 169 00:08:12,120 --> 00:08:15,640 Speaker 3: more kind of like traditional. The other's, her 170 00:08:15,680 --> 00:08:19,200 Speaker 3: advice was much more like progressive. 
They had 171 00:08:19,200 --> 00:08:23,960 Speaker 3: different attitudes, like they accepted things like, you know, like 172 00:08:24,080 --> 00:08:26,560 Speaker 3: gay rights and like having gay couples write 173 00:08:26,600 --> 00:08:29,840 Speaker 3: in for advice. They accepted those things on very different timelines, 174 00:08:29,960 --> 00:08:33,800 Speaker 3: like they had very different styles of advice. Yeah, 175 00:08:33,840 --> 00:08:36,679 Speaker 3: and it's fascinating. People should listen, 176 00:08:36,600 --> 00:08:37,880 Speaker 5: if you were interested. 177 00:08:37,920 --> 00:08:40,560 Speaker 3: That's the episode, I feel, that's the one that 178 00:08:40,600 --> 00:08:43,880 Speaker 3: I feel like is where I learned the most about advice. 179 00:08:44,240 --> 00:08:45,120 Speaker 5: Well, I'll say one more thing. 180 00:08:45,160 --> 00:08:47,280 Speaker 3: This is like, I'm going on way too long, but 181 00:08:47,320 --> 00:08:51,440 Speaker 3: I'm fascinated by it. Advice columns kind of 182 00:08:51,520 --> 00:08:56,600 Speaker 3: predated this like Internet-era idea of people publishing things 183 00:08:56,640 --> 00:08:59,360 Speaker 3: and then the readers being able to write in. Before 184 00:08:59,400 --> 00:09:02,559 Speaker 3: the days of the internet, in print, the only place 185 00:09:02,559 --> 00:09:05,160 Speaker 3: where people would publish something and then the readers would 186 00:09:05,200 --> 00:09:07,520 Speaker 3: then submit letters being like, oh, I don't know about that, 187 00:09:07,800 --> 00:09:09,840 Speaker 3: or submit letters at all and, you know, get advice, 188 00:09:10,280 --> 00:09:13,280 Speaker 3: was advice columns. 
And so the whole concept of a 189 00:09:13,360 --> 00:09:17,559 Speaker 3: readership being in conversation and in dialogue with a print 190 00:09:17,559 --> 00:09:21,040 Speaker 3: publisher, that is now like part of the Internet, really 191 00:09:21,080 --> 00:09:25,240 Speaker 3: got its start in advice columns. So fun fact, that's random. 192 00:09:25,920 --> 00:09:27,160 Speaker 2: So were you Abby? 193 00:09:27,240 --> 00:09:30,000 Speaker 4: Your team is Ann, you said, Ann Landers? 194 00:09:30,200 --> 00:09:34,360 Speaker 5: What a good question. Do I identify with a team? 195 00:09:35,160 --> 00:09:36,280 Speaker 4: If there's a beef, there's a team. 196 00:09:36,360 --> 00:09:38,000 Speaker 5: Yeah, if there's a beef, there's a team. Exactly. 197 00:09:38,440 --> 00:09:42,320 Speaker 3: At times I identify with both. I think that Dear 198 00:09:42,400 --> 00:09:46,240 Speaker 3: Abby definitely struck me as the pettier of the two, 199 00:09:46,920 --> 00:09:49,640 Speaker 3: and so I definitely saw a lot of myself in that. 200 00:09:49,760 --> 00:09:52,079 Speaker 3: So at times I feel I see both. 201 00:09:52,120 --> 00:09:53,160 Speaker 5: I see myself in both of them, 202 00:09:53,320 --> 00:09:59,400 Speaker 3: put it that way. Oh, that's great. Good. Yes, I 203 00:09:59,440 --> 00:10:00,920 Speaker 3: definitely need to check that out. 204 00:10:01,280 --> 00:10:03,920 Speaker 1: Yeah, because for some reason, my Google updates all the 205 00:10:03,920 --> 00:10:07,080 Speaker 1: time, to this day, like gives me these kinds of 206 00:10:07,160 --> 00:10:10,720 Speaker 1: questions people are apparently still asking advice columnists, and 207 00:10:10,800 --> 00:10:13,240 Speaker 1: when I read them, I'm like, this is wild, 208 00:10:13,280 --> 00:10:14,280 Speaker 1: this is a thing that is. 209 00:10:14,200 --> 00:10:16,720 Speaker 5: Oh my god, I read advice columns. 
210 00:10:16,720 --> 00:10:18,240 Speaker 3: It's like one of my favorite things. 211 00:10:18,240 --> 00:10:20,760 Speaker 3: I read advice columns every day. It's funny because I 212 00:10:20,760 --> 00:10:22,959 Speaker 3: was just talking to someone about an advice column I 213 00:10:23,000 --> 00:10:23,880 Speaker 3: read yesterday. 214 00:10:23,920 --> 00:10:24,600 Speaker 5: I think it was in 215 00:10:24,640 --> 00:10:29,080 Speaker 3: Slate, where a woman had set her single coworker up 216 00:10:29,080 --> 00:10:31,120 Speaker 3: with a friend of hers and they went on a 217 00:10:31,200 --> 00:10:34,480 Speaker 3: date at a bar, and the coworker was like, well, 218 00:10:34,520 --> 00:10:35,959 Speaker 3: we can't go out again. And she was 219 00:10:36,040 --> 00:10:38,520 Speaker 3: like, why? And it was because her friend at the 220 00:10:38,520 --> 00:10:40,960 Speaker 3: bar ordered a glass of plain milk. 221 00:10:42,000 --> 00:10:43,439 Speaker 5: And he was like, I can't see her again. 222 00:10:44,360 --> 00:10:46,840 Speaker 3: And so the letter writer was like, how could I 223 00:10:46,880 --> 00:10:48,960 Speaker 3: convince him that he's being ridiculous and, like, let her 224 00:10:49,040 --> 00:10:49,920 Speaker 3: order whatever she wants. 225 00:10:49,920 --> 00:10:50,560 Speaker 5: She's a catch. 226 00:10:51,880 --> 00:10:55,080 Speaker 3: The weird petty stuff that people write into advice columns, 227 00:10:55,080 --> 00:10:57,320 Speaker 3: I'm just endlessly fascinated by it. 228 00:10:57,800 --> 00:10:58,000 Speaker 2: Yeah. 229 00:10:58,040 --> 00:11:01,600 Speaker 4: I feel like the update to that is the whole Reddit thing, 230 00:11:02,760 --> 00:11:04,680 Speaker 4: which I adore, obviously. 231 00:11:05,559 --> 00:11:07,640 Speaker 5: Oh, I love them. I love them. And like there 232 00:11:07,640 --> 00:11:10,960 Speaker 5: are podcasts where people just read them. Yes, I'm addicted. 
233 00:11:11,000 --> 00:11:13,440 Speaker 4: That's the entire podcast, is they just read it and 234 00:11:13,480 --> 00:11:16,360 Speaker 4: then they talk about it. I'm like, wow, that's simple. 235 00:11:16,440 --> 00:11:19,120 Speaker 4: Can we do that? Yeah? 236 00:11:19,320 --> 00:11:21,160 Speaker 5: No, like, six pages of research. 237 00:11:20,880 --> 00:11:25,600 Speaker 4: No, no need, twelve pages of things and reading 238 00:11:25,679 --> 00:11:28,880 Speaker 4: horrible things, thank you. Yeah. 239 00:11:28,880 --> 00:11:33,800 Speaker 1: But it also just underscores like how people are doing 240 00:11:33,840 --> 00:11:35,880 Speaker 1: all kinds of things that I'm shocked by all the time. 241 00:11:36,880 --> 00:11:39,880 Speaker 3: Yeah, it really, it's so true. And like reading advice 242 00:11:40,000 --> 00:11:44,640 Speaker 3: columns and all of the, like, Am I the Asshole stuff, like, humans 243 00:11:44,679 --> 00:11:49,160 Speaker 3: are fascinating. We are, we have fascinating motivations. Our 244 00:11:49,240 --> 00:11:52,959 Speaker 3: behavior is really fascinating. Like, humans and 245 00:11:53,000 --> 00:11:55,120 Speaker 3: our behavior and our range of behaviors will. 246 00:11:55,000 --> 00:11:56,000 Speaker 5: Never cease to surprise me. 247 00:11:56,040 --> 00:11:57,840 Speaker 3: I'll put it that way, both in negative ways and 248 00:11:57,920 --> 00:11:59,560 Speaker 3: positive ways. 249 00:11:59,760 --> 00:12:16,880 Speaker 6: Yeah, yes, yes, yes. Well, I guess kind of like 250 00:12:17,000 --> 00:12:23,640 Speaker 6: segueing into technology and how it's changed how things are happening 251 00:12:23,679 --> 00:12:25,240 Speaker 6: and how we operate. 252 00:12:26,000 --> 00:12:28,520 Speaker 1: The topic you've brought today is so important, and I 253 00:12:28,600 --> 00:12:30,439 Speaker 1: know it's a question that a lot of people have. 254 00:12:30,679 --> 00:12:34,200 Speaker 1: So what are we talking about today? 
255 00:12:34,640 --> 00:12:36,960 Speaker 3: So this is an important topic. You know, it's 256 00:12:37,040 --> 00:12:40,000 Speaker 3: been almost a year since the Supreme Court overturned Roe 257 00:12:40,120 --> 00:12:43,880 Speaker 3: versus Wade. And after that happened, practically immediately, one of 258 00:12:43,880 --> 00:12:46,360 Speaker 3: the biggest issues that came up was this connection between 259 00:12:46,400 --> 00:12:49,400 Speaker 3: technology and abortion access. You know, like, should I delete 260 00:12:49,440 --> 00:12:52,240 Speaker 3: my period tracker app? Like, can I google abortion? 261 00:12:52,280 --> 00:12:54,560 Speaker 3: Am I going to go to jail? And so namely 262 00:12:54,600 --> 00:12:57,400 Speaker 3: this idea that all of the data from our smartphones 263 00:12:57,520 --> 00:13:01,040 Speaker 3: could and would be used to prosecute abortions. So, you know, 264 00:13:01,200 --> 00:13:03,680 Speaker 3: back in the day, in the days before Roe in 265 00:13:03,760 --> 00:13:07,439 Speaker 3: nineteen seventy three, you know, it was a rough landscape 266 00:13:07,600 --> 00:13:13,439 Speaker 3: for abortion access. Obviously, we've made some really good progress, 267 00:13:13,480 --> 00:13:15,760 Speaker 3: you know, because of the expansion of things like pill 268 00:13:15,760 --> 00:13:18,360 Speaker 3: based abortion and, you know, the ability to get pills 269 00:13:18,400 --> 00:13:21,800 Speaker 3: by mail. But in the days before Roe in the seventies, 270 00:13:22,120 --> 00:13:24,800 Speaker 3: we all didn't carry like GPS devices in our pockets 271 00:13:24,880 --> 00:13:28,560 Speaker 3: or have this vast virtual network of mass surveillance that 272 00:13:28,559 --> 00:13:31,520 Speaker 3: could help criminalize people who need abortions and the communities 273 00:13:31,520 --> 00:13:32,040 Speaker 3: that help them. 274 00:13:32,280 --> 00:13:33,320 Speaker 5: And I am. 
275 00:13:33,240 --> 00:13:36,480 Speaker 3: Sorry to say that Google, the thing that most 276 00:13:36,520 --> 00:13:40,760 Speaker 3: of us use for our email, for our maps, 277 00:13:40,840 --> 00:13:42,480 Speaker 3: for our cell phones, for some of us, if you use 278 00:13:42,520 --> 00:13:45,840 Speaker 3: an Android or a Google phone, is a huge part. 279 00:13:45,679 --> 00:13:47,600 Speaker 5: Of that network. And so today I want to talk 280 00:13:47,600 --> 00:13:48,679 Speaker 5: about some. 281 00:13:48,559 --> 00:13:53,040 Speaker 3: Of the ways that Google has been collecting data related 282 00:13:53,040 --> 00:13:56,280 Speaker 3: to abortions, what Google has publicly said, what they're actually doing, 283 00:13:56,600 --> 00:13:59,920 Speaker 3: and some recent movement and news in that arena. 284 00:14:00,120 --> 00:14:01,800 Speaker 4: You know, I haven't really been thinking about it, because 285 00:14:01,800 --> 00:14:03,920 Speaker 4: we've talked about, again as you were talking about, the 286 00:14:03,920 --> 00:14:06,560 Speaker 4: period tracker and then Facebook, because we already know some 287 00:14:06,600 --> 00:14:09,120 Speaker 4: of the headlines that it's made, and what they've done 288 00:14:09,160 --> 00:14:12,080 Speaker 4: and how they have helped the anti-abortion movement so 289 00:14:12,160 --> 00:14:16,160 Speaker 4: much in the whole right-wing conversation. But I 290 00:14:16,240 --> 00:14:19,520 Speaker 4: really haven't thought about Google, even though I should be, like, 291 00:14:19,600 --> 00:14:22,600 Speaker 4: obviously we know Google mines a lot of data, and 292 00:14:22,640 --> 00:14:25,640 Speaker 4: there's still conversation about how much information they keep and 293 00:14:25,680 --> 00:14:28,800 Speaker 4: they spread. But I haven't really thought of it on 294 00:14:29,400 --> 00:14:32,680 Speaker 4: like par with what's going on with abortion. 
So can 295 00:14:32,720 --> 00:14:36,000 Speaker 4: you kind of explain what we're talking about here? 296 00:14:36,120 --> 00:14:38,920 Speaker 3: Totally. So I think what you just said, Sam, I've noticed 297 00:14:38,920 --> 00:14:41,040 Speaker 3: the same thing. I think that when it comes to 298 00:14:41,320 --> 00:14:45,680 Speaker 3: a lot of online harms, people think of Facebook, people 299 00:14:45,680 --> 00:14:48,680 Speaker 3: think of Twitter. Of course, Google is an interesting case. 300 00:14:48,720 --> 00:14:51,320 Speaker 3: I do think that anecdotally, Google has been able to 301 00:14:51,360 --> 00:14:54,240 Speaker 3: skirt a lot of public scrutiny about the way that 302 00:14:54,320 --> 00:14:56,640 Speaker 3: they handle some of this stuff in a way that 303 00:14:56,760 --> 00:14:59,280 Speaker 3: other platforms have not been able to skirt that kind 304 00:14:59,280 --> 00:14:59,960 Speaker 3: of scrutiny. 305 00:15:00,200 --> 00:15:02,280 Speaker 5: I think a lot about this. I think it's because. 306 00:15:02,440 --> 00:15:09,200 Speaker 3: Google offers a useful product for the world, right? Like, 307 00:15:08,360 --> 00:15:14,320 Speaker 3: Google does make platforms and tools that are helpful to us. Facebook, 308 00:15:14,360 --> 00:15:16,960 Speaker 3: I feel like, is different in that, like, all they 309 00:15:17,040 --> 00:15:20,600 Speaker 3: really offer is like Facebook, WhatsApp, and Instagram. 310 00:15:21,120 --> 00:15:23,040 Speaker 5: You know, globally, some of. 311 00:15:22,960 --> 00:15:25,400 Speaker 3: those platforms are like really useful, but here in the 312 00:15:25,520 --> 00:15:28,440 Speaker 3: United States, I feel like if Facebook disappeared, the tool 313 00:15:28,440 --> 00:15:30,880 Speaker 3: that they offer is not really as much of a 314 00:15:31,360 --> 00:15:33,920 Speaker 3: global good as Google. 
And so I think that for 315 00:15:34,360 --> 00:15:37,480 Speaker 3: that reason, maybe that's why Google doesn't. 316 00:15:37,200 --> 00:15:39,320 Speaker 5: Come to mind when you think about these kinds of harms. 317 00:15:39,520 --> 00:15:42,760 Speaker 3: But the reality is that Google is really a 318 00:15:42,800 --> 00:15:44,760 Speaker 3: major player when it comes to the way that our 319 00:15:44,880 --> 00:15:48,240 Speaker 3: data can be misused, specifically to criminalize abortions. And so 320 00:15:48,680 --> 00:15:51,720 Speaker 3: shortly after Roe was overturned by the Supreme Court almost 321 00:15:51,720 --> 00:15:54,840 Speaker 3: a year ago, Google said that it would proactively delete 322 00:15:55,120 --> 00:15:59,400 Speaker 3: its location data when people visited quote, particularly personal places, 323 00:15:59,440 --> 00:16:03,320 Speaker 3: including abortion clinics, hospitals, and shelters. Their statement said, 324 00:16:03,640 --> 00:16:06,520 Speaker 3: today we're announcing that if our systems identify that someone 325 00:16:06,520 --> 00:16:09,000 Speaker 3: has visited one of these places, we will delete these 326 00:16:09,160 --> 00:16:12,400 Speaker 3: entries from location history soon after they visit. This change 327 00:16:12,440 --> 00:16:15,280 Speaker 3: will take effect in the coming weeks. And so it's 328 00:16:15,360 --> 00:16:18,280 Speaker 3: really that last part of their statement that is important. 329 00:16:18,360 --> 00:16:22,520 Speaker 3: Like what does the word soon mean? Like my definition 330 00:16:22,560 --> 00:16:25,400 Speaker 3: of soon and Google's definition of soon might be two 331 00:16:25,440 --> 00:16:28,480 Speaker 3: different things. What does this change will take effect 332 00:16:28,520 --> 00:16:30,720 Speaker 3: in the coming weeks mean? It's been almost a year. Like, 333 00:16:32,080 --> 00:16:35,360 Speaker 3: technically, is that a matter of weeks? 
So, like, 334 00:16:35,440 --> 00:16:37,440 Speaker 3: what Google is defining as a couple of 335 00:16:37,480 --> 00:16:39,880 Speaker 3: weeks wouldn't be how I define it. And so this 336 00:16:40,000 --> 00:16:42,760 Speaker 3: was something that Google publicly agreed to do. 337 00:16:43,120 --> 00:16:43,600 Speaker 5: They didn't. 338 00:16:43,640 --> 00:16:47,000 Speaker 3: They will tell you that they just like decided to 339 00:16:47,040 --> 00:16:49,200 Speaker 3: do this as a public good on their own. 340 00:16:49,600 --> 00:16:50,600 Speaker 5: That's not exactly true. 341 00:16:50,640 --> 00:16:53,520 Speaker 3: There definitely were like digital rights groups behind the scenes 342 00:16:53,600 --> 00:16:55,880 Speaker 3: pressuring them to make this choice. But in any event, 343 00:16:55,960 --> 00:16:58,920 Speaker 3: they did, and they enjoyed a lot of positive, glowing 344 00:16:59,000 --> 00:17:03,040 Speaker 3: press for doing this. However, it's been almost a 345 00:17:03,120 --> 00:17:06,200 Speaker 3: year and Google has not really followed up in any 346 00:17:06,320 --> 00:17:09,600 Speaker 3: kind of a meaningful or consistent way on their promise 347 00:17:09,680 --> 00:17:13,280 Speaker 3: to delete this location history if they think that someone 348 00:17:13,320 --> 00:17:15,760 Speaker 3: is going to an abortion clinic. And that is really 349 00:17:15,840 --> 00:17:18,480 Speaker 3: concerning because we do live in a landscape where abortion 350 00:17:18,560 --> 00:17:21,720 Speaker 3: is so easily criminalized and your digital footprint can be 351 00:17:21,840 --> 00:17:24,960 Speaker 3: used to create evidence that you have tried to illegally 352 00:17:25,000 --> 00:17:26,720 Speaker 3: access information about abortion. 
353 00:17:28,080 --> 00:17:31,080 Speaker 1: Right. And the thing is, like, a lot of times 354 00:17:31,119 --> 00:17:35,919 Speaker 1: you're using, like, maps or GPS to get to any place, 355 00:17:36,200 --> 00:17:38,520 Speaker 1: like to get to an abortion clinic. And I will 356 00:17:38,560 --> 00:17:42,639 Speaker 1: say about Google, like, there are a lot of things 357 00:17:43,000 --> 00:17:45,920 Speaker 1: that you don't realize it's tracking, because I'm 358 00:17:45,960 --> 00:17:47,760 Speaker 1: somebody who's like, I go in and I turn off, 359 00:17:47,760 --> 00:17:49,480 Speaker 1: like, you're not listening to my voice, you're not doing this, 360 00:17:49,520 --> 00:17:51,520 Speaker 1: you're not doing this. It's still doing it. Like, I 361 00:17:51,600 --> 00:17:54,680 Speaker 1: can't figure out how, but it is. And once I 362 00:17:54,800 --> 00:17:58,000 Speaker 1: found a map of, like, everywhere I'd been, even though 363 00:17:58,040 --> 00:18:00,920 Speaker 1: I don't really use GPS that much, and I think 364 00:18:00,920 --> 00:18:03,800 Speaker 1: it's because of, like, emails I get. It's like, oh, 365 00:18:03,840 --> 00:18:05,439 Speaker 1: you're going on this trip, or oh, you're doing this 366 00:18:05,480 --> 00:18:07,280 Speaker 1: thing. Like, I'm trying to figure it out but I can't. 367 00:18:07,280 --> 00:18:09,480 Speaker 1: But it was creepy, like you can go and find 368 00:18:09,520 --> 00:18:12,920 Speaker 1: a map of everywhere you've been, and it is frightening 369 00:18:12,920 --> 00:18:17,520 Speaker 1: when it can be used in this very scary landscape 370 00:18:18,119 --> 00:18:21,119 Speaker 1: in a case against abortion or something, in a 371 00:18:21,160 --> 00:18:25,280 Speaker 1: case about abortion.
So, like, sometimes when you 372 00:18:25,320 --> 00:18:27,760 Speaker 1: talk about stuff like this, it can feel very, as 373 00:18:27,760 --> 00:18:29,840 Speaker 1: we've said several times on this show when you've been 374 00:18:29,880 --> 00:18:31,680 Speaker 1: on, like, oh, that's an Internet thing. 375 00:18:31,760 --> 00:18:33,040 Speaker 2: I don't see it, so it's not 376 00:18:33,080 --> 00:18:37,680 Speaker 1: real. But it very much is, and there is data 377 00:18:38,280 --> 00:18:40,960 Speaker 1: around this whole thing, right? Totally. 378 00:18:41,000 --> 00:18:43,040 Speaker 5: So this is not an Internet issue. 379 00:18:43,040 --> 00:18:46,680 Speaker 3: This is not an abstract issue or a hypothetical, far 380 00:18:46,720 --> 00:18:49,760 Speaker 3: away, down the line, maybe one day issue. This is 381 00:18:49,760 --> 00:18:52,800 Speaker 3: an issue that's happening to people in their real lives 382 00:18:52,920 --> 00:18:55,399 Speaker 3: right now. So, just to set the scene, according to 383 00:18:55,440 --> 00:18:59,159 Speaker 3: Google's own transparency report, they have already been subpoenaed several 384 00:18:59,200 --> 00:19:01,320 Speaker 3: times for users' data. In the second half of twenty 385 00:19:01,320 --> 00:19:04,840 Speaker 3: twenty one, they received eighteen thousand and thirty seven subpoenas 386 00:19:05,040 --> 00:19:07,639 Speaker 3: and twenty three thousand, nine hundred and twenty four search 387 00:19:07,680 --> 00:19:11,120 Speaker 3: warrants for user information. So it's pretty reasonable to conclude 388 00:19:11,160 --> 00:19:14,800 Speaker 3: that prosecutors in states where abortion is illegal might already 389 00:19:14,840 --> 00:19:18,520 Speaker 3: be making requests for personal data from Google to prosecute 390 00:19:18,560 --> 00:19:22,240 Speaker 3: those looking for abortions.
Also, in terms of, like, where 391 00:19:22,240 --> 00:19:25,399 Speaker 3: we've already seen this happening, we already know that people 392 00:19:25,480 --> 00:19:28,160 Speaker 3: have already been charged with obtaining abortions or helping 393 00:19:28,160 --> 00:19:31,280 Speaker 3: somebody obtain an abortion, with their digital footprint 394 00:19:30,840 --> 00:19:32,000 Speaker 5: being used as evidence. 395 00:19:32,080 --> 00:19:36,439 Speaker 3: You might remember that when Roe was overturned, Mark Zuckerberg, 396 00:19:36,600 --> 00:19:38,439 Speaker 3: when he was asked about it, he said that he 397 00:19:38,560 --> 00:19:42,520 Speaker 3: was hoping that encrypting Facebook's messages would help protect people 398 00:19:42,560 --> 00:19:47,280 Speaker 3: from quote, overbroad requests for information, but that did not 399 00:19:47,400 --> 00:19:50,520 Speaker 3: stop the company from handing over user information. In June, 400 00:19:50,960 --> 00:19:54,280 Speaker 3: before Roe was overturned, Facebook turned over the private messages 401 00:19:54,320 --> 00:19:57,639 Speaker 3: of a mother and daughter facing criminal charges for allegedly 402 00:19:57,640 --> 00:20:00,240 Speaker 3: carrying out an illegal abortion. And so, well, 403 00:20:00,840 --> 00:20:03,560 Speaker 5: this is happening now. It is not abstract. 404 00:20:04,080 --> 00:20:08,680 Speaker 3: People are currently embroiled in legal situations where 405 00:20:08,720 --> 00:20:14,080 Speaker 3: they're facing jail time because tech companies, like, gave 406 00:20:14,119 --> 00:20:17,760 Speaker 3: over sensitive information that they had from folks who were 407 00:20:17,760 --> 00:20:21,720 Speaker 3: allegedly looking up information about abortion or talking about abortion.
408 00:20:21,800 --> 00:20:25,880 Speaker 3: And so the reason why we have some insight into 409 00:20:25,880 --> 00:20:28,560 Speaker 3: this is because of a new report from the digital 410 00:20:28,640 --> 00:20:32,159 Speaker 3: rights organization Accountable Tech. They did a couple of different 411 00:20:32,200 --> 00:20:36,480 Speaker 3: experiments to get a sense of how Google is tracking 412 00:20:36,680 --> 00:20:40,880 Speaker 3: and whether or not they're actually deleting this location information, 413 00:20:41,400 --> 00:20:44,120 Speaker 3: and they're just not. So they said that they were 414 00:20:44,160 --> 00:20:47,040 Speaker 3: going to, and they are not. Like, there's not any 415 00:20:47,080 --> 00:20:50,200 Speaker 3: other way to interpret kind of what's going on here. 416 00:20:50,800 --> 00:20:51,000 Speaker 5: You know. 417 00:20:51,400 --> 00:20:54,159 Speaker 4: And as you're talking about it, I just realized I 418 00:20:54,400 --> 00:20:58,240 Speaker 4: get monthly information about where you went this month, 419 00:20:58,280 --> 00:21:00,960 Speaker 4: like, as if this is, like, hey, monthly review, 420 00:21:01,040 --> 00:21:03,720 Speaker 4: look at you, you traveled these places, as if to celebrate. 421 00:21:03,720 --> 00:21:06,960 Speaker 4: And now that I'm thinking about it, I'm like, wait, what 422 00:21:07,080 --> 00:21:09,399 Speaker 4: if I don't want to remember where I went the 423 00:21:09,520 --> 00:21:12,080 Speaker 4: last month? What is happening? And I don't remember ever 424 00:21:12,119 --> 00:21:14,680 Speaker 4: saying yes, I want this information, yes, I want it 425 00:21:14,720 --> 00:21:16,440 Speaker 4: to be recorded, and yes, I want it given 426 00:21:16,480 --> 00:21:18,400 Speaker 4: back to me.
So if they have that so easily 427 00:21:18,760 --> 00:21:21,639 Speaker 4: given without me even prompting or asking for it, I 428 00:21:21,680 --> 00:21:24,800 Speaker 4: can't imagine what they actually have. And honestly, I can't 429 00:21:24,840 --> 00:21:27,560 Speaker 4: imagine how they decide what to delete if they're doing 430 00:21:27,760 --> 00:21:31,520 Speaker 4: specific locations, like how that would even come up, as 431 00:21:31,520 --> 00:21:33,840 Speaker 4: well as the fact that, yeah, if they have pinpointed 432 00:21:33,920 --> 00:21:37,960 Speaker 4: specific locations, that would be easily trackable and easily used 433 00:21:37,960 --> 00:21:39,359 Speaker 4: against someone, right? 434 00:21:39,560 --> 00:21:42,320 Speaker 3: Totally. And Sam, your point is such a good one, 435 00:21:42,320 --> 00:21:45,879 Speaker 3: and I think it's a good, like, it provides a good, 436 00:21:47,160 --> 00:21:51,359 Speaker 3: zoomed out understanding of the issue, right? Because I get 437 00:21:51,359 --> 00:21:53,880 Speaker 3: those too. Like, I've talked about this on my own 438 00:21:53,920 --> 00:21:57,200 Speaker 3: show about the Spotify Wrapped, which I always enjoy getting. 439 00:21:57,200 --> 00:21:59,960 Speaker 3: But all of these ways that we have been conditioned 440 00:22:00,240 --> 00:22:02,960 Speaker 3: to believe that surveillance is a good thing. Like, oh, 441 00:22:03,119 --> 00:22:05,879 Speaker 3: I want to get a summary of all the different 442 00:22:05,880 --> 00:22:09,000 Speaker 3: places I've traveled in the last month. And it's, 443 00:22:09,160 --> 00:22:11,120 Speaker 3: you know, all these pictures of where I was set 444 00:22:11,160 --> 00:22:15,120 Speaker 3: to music that might feel exciting to get.
And I'm 445 00:22:15,160 --> 00:22:18,399 Speaker 3: certainly susceptible to being excited by information about how I've 446 00:22:18,440 --> 00:22:20,720 Speaker 3: lived my life. But that is digital surveillance, and 447 00:22:20,720 --> 00:22:23,119 Speaker 3: the way that we've been conditioned to think it is 448 00:22:23,880 --> 00:22:27,960 Speaker 3: not just commonplace to have our devices and our platforms 449 00:22:28,320 --> 00:22:31,560 Speaker 3: tracked and surveilled by big tech companies and used for 450 00:22:31,760 --> 00:22:34,600 Speaker 3: profit by those companies, but that we should be happy, 451 00:22:34,640 --> 00:22:37,879 Speaker 3: that it's actually, like, cool and fun to see this 452 00:22:38,000 --> 00:22:41,160 Speaker 3: information about ourselves and get these little, you know, summaries 453 00:22:41,160 --> 00:22:43,160 Speaker 3: of what we listened to, where we went, what things 454 00:22:43,160 --> 00:22:46,560 Speaker 3: we bought, what we ate, all of that. Like, there 455 00:22:46,680 --> 00:22:51,360 Speaker 3: really needs to be a fundamental reassessment of the relationship 456 00:22:51,359 --> 00:22:54,920 Speaker 3: between users and tech companies when it comes to our data, 457 00:22:54,960 --> 00:22:56,520 Speaker 3: because we deserve privacy. 458 00:22:56,560 --> 00:22:58,000 Speaker 5: Privacy should be the standard. 459 00:22:58,200 --> 00:22:59,720 Speaker 3: But even if you dress it up with a little 460 00:22:59,800 --> 00:23:03,240 Speaker 3: montage or a little newsletter or whatever, that is still 461 00:23:03,280 --> 00:23:06,320 Speaker 3: surveillance, and we should ultimately be questioning that, like, 462 00:23:06,440 --> 00:23:11,080 Speaker 3: that fundamental relationship between user and platform. That's my soapbox 463 00:23:11,119 --> 00:23:14,439 Speaker 3: about surveillance.
Like, I get upset about it because 464 00:23:14,520 --> 00:23:16,800 Speaker 3: it gets me too. Like, even though I know 465 00:23:16,840 --> 00:23:18,680 Speaker 3: all of this stuff and I feel very strongly about it, 466 00:23:18,840 --> 00:23:21,840 Speaker 3: when I get a pretty, you know, breakdown of what 467 00:23:21,880 --> 00:23:23,280 Speaker 3: I listened to at the end of the year, I 468 00:23:23,400 --> 00:23:25,800 Speaker 3: like it. And so it's so insidious how we've been 469 00:23:25,840 --> 00:23:29,080 Speaker 3: trained to like things that ultimately may not be good 470 00:23:29,080 --> 00:23:29,879 Speaker 3: for us. 471 00:23:30,200 --> 00:23:33,320 Speaker 1: Yeah, yeah. I was thinking about this the 472 00:23:33,359 --> 00:23:35,320 Speaker 1: other day, because when I was in third grade, I 473 00:23:35,320 --> 00:23:39,200 Speaker 1: wrote this... we were supposed to tell a story, and 474 00:23:39,200 --> 00:23:41,080 Speaker 1: I wrote this story called the Right to be Forgotten. 475 00:23:41,119 --> 00:23:45,479 Speaker 1: I was like, what was going on with me? Great title, 476 00:23:46,320 --> 00:23:49,840 Speaker 1: I know. But now it became a case 477 00:23:50,000 --> 00:23:53,439 Speaker 1: against Google where people were like, I don't want to 478 00:23:53,440 --> 00:23:56,359 Speaker 1: be remembered in certain ways where you can search me 479 00:23:56,440 --> 00:24:00,200 Speaker 1: on the Internet, and it was a case with the EU. 480 00:24:00,480 --> 00:24:03,920 Speaker 1: And so I revisited it, like, a couple of years 481 00:24:03,960 --> 00:24:05,719 Speaker 1: ago as an actor, and I did, like, the Right 482 00:24:05,760 --> 00:24:06,480 Speaker 1: to be Forgotten. 483 00:24:07,600 --> 00:24:10,439 Speaker 2: And people afterwards were like, what are you talking about?
484 00:24:10,640 --> 00:24:13,040 Speaker 1: Kind of that idea of, like, maybe I just 485 00:24:13,080 --> 00:24:15,280 Speaker 1: don't want people to know. There are certain parts I 486 00:24:15,320 --> 00:24:16,119 Speaker 1: just want to be private. 487 00:24:16,200 --> 00:24:19,840 Speaker 2: Oh my god, it is a time traveler, everybody. 488 00:24:20,080 --> 00:24:23,080 Speaker 3: It is wild that you're saying this. Literally yesterday I 489 00:24:23,119 --> 00:24:27,080 Speaker 3: did an interview with a UCLA Internet researcher named Olivia 490 00:24:27,119 --> 00:24:31,320 Speaker 3: Snow, and we were explicitly talking about 491 00:24:31,320 --> 00:24:33,920 Speaker 3: the right to be forgotten when it comes to platforms. 492 00:24:34,040 --> 00:24:36,440 Speaker 3: I cannot believe that this is a concept that you 493 00:24:36,560 --> 00:24:40,840 Speaker 3: have been, you know, grappling with since you were a child. 494 00:24:42,880 --> 00:24:45,280 Speaker 4: She does this a lot. There are things that pop up 495 00:24:45,359 --> 00:24:47,040 Speaker 4: where she's like, that was my idea. I was like, 496 00:24:47,040 --> 00:24:51,280 Speaker 4: what, ten years old? 497 00:24:52,240 --> 00:24:54,080 Speaker 1: When I was nine, I was doing... that was, like, 498 00:24:54,080 --> 00:24:56,840 Speaker 1: the year I wrote out, like, all the, like, top 499 00:24:56,880 --> 00:24:58,840 Speaker 1: things I wanted to do before I died. Like... 500 00:24:59,119 --> 00:25:06,240 Speaker 2: I don't know what was going on. It's okay. 501 00:25:06,640 --> 00:25:12,440 Speaker 1: Well, going back to that study you mentioned, the Accountable 502 00:25:12,480 --> 00:25:18,320 Speaker 1: Tech thing, they did run a couple of specific experiments, right? Yeah, 503 00:25:18,359 --> 00:25:19,320 Speaker 1: it's really interesting.
504 00:25:19,400 --> 00:25:21,600 Speaker 3: I'm going to kind of gloss over it, but folks 505 00:25:21,600 --> 00:25:22,760 Speaker 3: should definitely check out the study. 506 00:25:22,760 --> 00:25:23,639 Speaker 5: You can find that online. 507 00:25:23,680 --> 00:25:27,800 Speaker 3: But essentially they did, I think, three different experiments where 508 00:25:27,840 --> 00:25:33,080 Speaker 3: they had a staffer buy an unused, totally new Android 509 00:25:33,119 --> 00:25:36,600 Speaker 3: smartphone, and then start a new test Google account, and 510 00:25:36,640 --> 00:25:39,840 Speaker 3: they accepted all the default privacy settings. And they had 511 00:25:39,840 --> 00:25:44,200 Speaker 3: the staffer travel from one state to another to different 512 00:25:44,560 --> 00:25:47,720 Speaker 3: abortion providers. So in one, they traveled from Columbus, 513 00:25:47,720 --> 00:25:51,879 Speaker 3: Ohio to Pittsburgh, Pennsylvania, and wound up at, like, Planned 514 00:25:51,880 --> 00:25:56,240 Speaker 3: Parenthood of Pittsburgh, to see, you know, later how 515 00:25:56,280 --> 00:25:59,360 Speaker 3: that data would be handled. In one test, a staffer 516 00:25:59,720 --> 00:26:01,760 Speaker 3: logged into the test account that they made 517 00:26:02,080 --> 00:26:04,320 Speaker 3: on her browser. So, like, she took her... 518 00:26:04,320 --> 00:26:05,840 Speaker 5: She took the phone to the clinic.
519 00:26:05,960 --> 00:26:09,080 Speaker 3: And then when she logged on on the browser, I 520 00:26:09,119 --> 00:26:12,959 Speaker 3: think thirty days later, under the Web and App Activity 521 00:26:12,960 --> 00:26:16,119 Speaker 3: tab, after she completed this test, she found a Google 522 00:26:16,160 --> 00:26:19,199 Speaker 3: Maps search query from the abortion provider that she visited 523 00:26:19,240 --> 00:26:21,719 Speaker 3: stored on her account. So, like, they're just not deleting 524 00:26:21,720 --> 00:26:23,440 Speaker 3: the data. Like, there's no other way 525 00:26:23,520 --> 00:26:26,960 Speaker 3: to interpret that information. And they did these tests in 526 00:26:27,000 --> 00:26:30,560 Speaker 3: three different ways, and each time some information 527 00:26:30,920 --> 00:26:33,480 Speaker 3: continued to be stored on these Google accounts. And so, 528 00:26:33,480 --> 00:26:36,879 Speaker 3: according to Accountable Tech, by retaining both location search query 529 00:26:36,920 --> 00:26:40,240 Speaker 3: and location history data, Google jeopardizes the health, safety, and 530 00:26:40,280 --> 00:26:43,200 Speaker 3: legal status of the users who visit reproductive care facilities 531 00:26:43,359 --> 00:26:44,959 Speaker 3: in states where abortion is criminalized. 532 00:26:45,119 --> 00:26:46,440 Speaker 5: If prosecutors in a state with 533 00:26:46,400 --> 00:26:49,359 Speaker 3: a restrictive abortion law receive a tip about someone seeking 534 00:26:49,359 --> 00:26:52,320 Speaker 3: an abortion, a subpoena would likely force Google to hand 535 00:26:52,320 --> 00:26:57,560 Speaker 3: over this sensitive data. And again, just to underscore that: 536 00:26:57,720 --> 00:27:00,960 Speaker 3: Google said that they were going to delete this data 537 00:27:01,560 --> 00:27:04,479 Speaker 3: soon after a person visited.
That is, after a person visited one of these 538 00:27:04,520 --> 00:27:07,080 Speaker 3: sensitive sites, including abortion providers. 539 00:27:07,400 --> 00:27:12,439 Speaker 3: And so in this test that Accountable Tech ran, thirty days 540 00:27:12,920 --> 00:27:15,880 Speaker 3: is... how was Google defining soon in this situation? 541 00:27:15,960 --> 00:27:16,120 Speaker 5: Right? 542 00:27:16,119 --> 00:27:18,680 Speaker 3: Like, I would argue that a month is enough time 543 00:27:18,760 --> 00:27:22,560 Speaker 3: to have deleted this, but perhaps Google disagrees. And so 544 00:27:23,240 --> 00:27:25,520 Speaker 3: it does really feel like this is a situation where 545 00:27:25,520 --> 00:27:29,640 Speaker 3: they're, I mean, I'm no lawyer, but misleading the public, 546 00:27:29,760 --> 00:27:32,959 Speaker 3: saying publicly they're doing one thing and then doing another 547 00:27:33,000 --> 00:27:34,960 Speaker 3: thing in actuality. 548 00:27:34,720 --> 00:27:37,960 Speaker 5: That could get people in serious legal trouble, you know. 549 00:27:38,280 --> 00:27:42,760 Speaker 1: Right. And that's the thing. Like, I know 550 00:27:42,800 --> 00:27:44,679 Speaker 1: we're going to talk about this later, but if a 551 00:27:44,800 --> 00:27:48,080 Speaker 1: company is telling me a thing, you know... I think 552 00:27:48,280 --> 00:27:50,200 Speaker 1: a lot of people use the word, like, they call 553 00:27:50,240 --> 00:27:51,960 Speaker 1: me naive, which I don't think is fair. But, like, 554 00:27:51,960 --> 00:27:54,840 Speaker 1: if I'm like, oh, I trust that because that's what 555 00:27:54,880 --> 00:27:58,000 Speaker 1: they said, and then it could be used in a 556 00:27:58,000 --> 00:28:04,000 Speaker 1: criminal case despite what they said, like, that feels both 557 00:28:04,880 --> 00:28:08,520 Speaker 1: wrong and illegal, but also it's very scary. Like, that is 558 00:28:08,720 --> 00:28:10,840 Speaker 1: a very frightening scenario.
559 00:28:11,080 --> 00:28:15,560 Speaker 3: Exactly. And I think it really underscores how scary and 560 00:28:15,640 --> 00:28:19,840 Speaker 3: dire and complicated things can be for someone who is pregnant, right? 561 00:28:19,920 --> 00:28:22,760 Speaker 3: Like, it can already be a scary, intense time. 562 00:28:22,840 --> 00:28:27,600 Speaker 3: But adding on to this vibe of, you can't trust 563 00:28:27,720 --> 00:28:30,280 Speaker 3: what these companies who say they are going to do 564 00:28:31,000 --> 00:28:33,720 Speaker 3: X to keep you safe... a little bit more, you 565 00:28:33,720 --> 00:28:35,800 Speaker 3: can't trust what they're saying. It just adds to this 566 00:28:36,640 --> 00:28:40,840 Speaker 3: complex web where it's difficult to know who you can trust. 567 00:28:40,880 --> 00:28:44,000 Speaker 5: And so I should note that today, according to 568 00:28:44,000 --> 00:28:47,880 Speaker 3: the Washington Post, most criminal charges for abortion stem 569 00:28:47,960 --> 00:28:51,280 Speaker 3: from a human telling the authorities, not just from your 570 00:28:51,280 --> 00:28:54,000 Speaker 3: digital footprint being scraped by Google or something like that. 571 00:28:54,280 --> 00:28:56,520 Speaker 3: But we know that your digital footprint can be used 572 00:28:56,560 --> 00:28:58,600 Speaker 3: as evidence. And so, you know, we live in this 573 00:28:58,680 --> 00:29:03,360 Speaker 3: climate where vindictive ex-lovers, nosy neighbors, or even 574 00:29:03,680 --> 00:29:06,960 Speaker 3: just self-appointed vigilantes who may have no connection to 575 00:29:07,000 --> 00:29:09,720 Speaker 3: someone who is pregnant at all, all of these people 576 00:29:09,840 --> 00:29:12,880 Speaker 3: can be threats.
In Texas, a man filed a wrongful 577 00:29:12,880 --> 00:29:16,160 Speaker 3: death lawsuit against three women for allegedly helping his then 578 00:29:16,240 --> 00:29:19,200 Speaker 3: wife obtain pills that allegedly were used to induce 579 00:29:19,200 --> 00:29:22,120 Speaker 3: an abortion last year. If you're someone who is looking 580 00:29:22,120 --> 00:29:24,240 Speaker 3: for an abortion in a state where it's been criminalized, 581 00:29:24,600 --> 00:29:28,360 Speaker 3: not only do you need to think about who you 582 00:29:28,440 --> 00:29:31,560 Speaker 3: can trust, who has access to this information about you, 583 00:29:31,600 --> 00:29:34,480 Speaker 3: who's in your community, again, like, even just, like, your 584 00:29:34,520 --> 00:29:36,840 Speaker 3: nosy neighbor next door who doesn't like you for no reason. 585 00:29:37,160 --> 00:29:39,640 Speaker 3: But also, on top of that, you need to think 586 00:29:39,640 --> 00:29:42,800 Speaker 3: about your digital security. You need to think about how 587 00:29:42,840 --> 00:29:46,080 Speaker 3: you are accessing information and data online, and you need 588 00:29:46,120 --> 00:29:48,520 Speaker 3: to think about whether or not Google is being upfront 589 00:29:48,640 --> 00:29:52,160 Speaker 3: about what they're publicly saying about how they use that data. 590 00:29:52,240 --> 00:29:55,920 Speaker 3: It is an incredibly high burden that nobody should have 591 00:29:55,960 --> 00:29:58,160 Speaker 3: to deal with. It's already hard enough to be in 592 00:29:58,200 --> 00:30:01,560 Speaker 3: this situation. On top of it, you should not, as 593 00:30:01,560 --> 00:30:04,520 Speaker 3: a regular person, have to be parsing lies from a 594 00:30:04,560 --> 00:30:07,560 Speaker 3: platform like Google to make choices for your life and 595 00:30:07,600 --> 00:30:08,200 Speaker 3: for your health.
596 00:30:21,840 --> 00:30:25,000 Speaker 1: I know that you came on and talked about it, 597 00:30:25,040 --> 00:30:28,000 Speaker 1: I think, pretty soon after Roe versus Wade got overturned. 598 00:30:28,000 --> 00:30:28,920 Speaker 2: But kind of the, like, 599 00:30:30,720 --> 00:30:34,760 Speaker 1: unfortunate tips that people need to know about, like, you know, 600 00:30:35,040 --> 00:30:38,880 Speaker 1: different, as I was saying, Gmail being used, but, like, 601 00:30:39,120 --> 00:30:41,720 Speaker 1: having a separate Gmail account and using a private browser 602 00:30:41,760 --> 00:30:47,600 Speaker 1: and all these things. Like, it's like you were saying, 603 00:30:47,680 --> 00:30:49,560 Speaker 1: it's a lot to ask of someone to know that 604 00:30:49,600 --> 00:30:50,600 Speaker 1: they need to do this. 605 00:30:53,560 --> 00:30:56,480 Speaker 3: Yeah, we shouldn't be asking this of people. 606 00:30:56,600 --> 00:30:59,320 Speaker 3: Like, you shouldn't have to be a digital security expert 607 00:30:59,800 --> 00:31:02,840 Speaker 3: just to make choices for your health. You shouldn't have 608 00:31:02,920 --> 00:31:07,200 Speaker 3: to know how to parse corporations' public PR speak from 609 00:31:07,240 --> 00:31:09,320 Speaker 3: what they're actually doing. Like, that is a lot for 610 00:31:09,360 --> 00:31:12,360 Speaker 3: someone to have to navigate. And so right after Roe 611 00:31:12,440 --> 00:31:14,440 Speaker 3: was overturned, we did an episode of There Are No 612 00:31:14,520 --> 00:31:16,880 Speaker 3: Girls on the Internet with a computer scientist and digital 613 00:31:16,880 --> 00:31:19,720 Speaker 3: security expert, doctor Jen Golbeck. If you know that name, 614 00:31:19,760 --> 00:31:23,680 Speaker 3: maybe you've seen her popular TikTok series educating people on 615 00:31:23,720 --> 00:31:27,040 Speaker 3: how to be more secure when navigating abortion information online.
616 00:31:27,240 --> 00:31:28,800 Speaker 3: I should tell you, like, I am not a lawyer, 617 00:31:28,800 --> 00:31:31,440 Speaker 3: I am not a digital security expert, so I want 618 00:31:31,440 --> 00:31:33,880 Speaker 3: to be clear about that. But I wanted to, just 619 00:31:34,280 --> 00:31:36,400 Speaker 3: if it's possible, play a little clip of what she 620 00:31:36,520 --> 00:31:38,880 Speaker 3: told me, because she is the expert. I don't want 621 00:31:38,880 --> 00:31:41,880 Speaker 3: to, like, summarize what she said. So, you know, you've 622 00:31:41,880 --> 00:31:44,240 Speaker 3: mentioned a couple of, like, really great tips for folks, 623 00:31:44,240 --> 00:31:47,120 Speaker 3: if you're, you know, looking for abortion 624 00:31:47,240 --> 00:31:48,520 Speaker 3: pills and you want to do it, you know, in 625 00:31:48,520 --> 00:31:50,120 Speaker 3: a way that you're going to be less likely to 626 00:31:50,160 --> 00:31:53,560 Speaker 3: be tracked, you know, using a Tor browser, using incognito 627 00:31:53,600 --> 00:31:55,680 Speaker 3: mode when you search, using 628 00:31:55,440 --> 00:31:56,240 Speaker 5: public Wi-Fi. 629 00:31:56,360 --> 00:31:58,240 Speaker 3: Are there other tips that you want to shout out 630 00:31:58,240 --> 00:31:59,880 Speaker 3: for folks, if they might need 631 00:31:59,880 --> 00:32:00,520 Speaker 3: this information? 632 00:32:01,360 --> 00:32:03,800 Speaker 5: So that's all important stuff, I would say, for sure. 633 00:32:03,840 --> 00:32:06,840 Speaker 7: The most important one is that you are not paying 634 00:32:07,360 --> 00:32:09,480 Speaker 7: with a credit card or a debit card connected to 635 00:32:09,520 --> 00:32:12,160 Speaker 7: your name. So figure out how much your medication is 636 00:32:12,200 --> 00:32:15,840 Speaker 7: going to cost.
Use cash to buy, like, a Visa Vanilla 637 00:32:15,880 --> 00:32:18,400 Speaker 7: gift card, which you can get anywhere, for that amount, 638 00:32:18,840 --> 00:32:22,680 Speaker 7: and then pay with the Visa Vanilla gift card. So 639 00:32:22,920 --> 00:32:25,880 Speaker 7: much of how we're tracked is through credit card number, 640 00:32:26,520 --> 00:32:28,720 Speaker 7: so definitely do that. And the other way that we're 641 00:32:28,760 --> 00:32:31,680 Speaker 7: really easily tracked is through email address. So set up 642 00:32:31,720 --> 00:32:33,960 Speaker 7: a fresh email address that you are only using to 643 00:32:33,960 --> 00:32:37,400 Speaker 7: buy this abortion medication. Proton Mail is the one site 644 00:32:37,400 --> 00:32:39,600 Speaker 7: that I recommended for this. It's free, it's encrypted, it's 645 00:32:39,640 --> 00:32:42,840 Speaker 7: really good and secure. You can just set up an 646 00:32:42,880 --> 00:32:45,040 Speaker 7: email address, use it to buy your medicine, don't use it 647 00:32:45,040 --> 00:32:48,000 Speaker 7: for anything else. If you do that, gift card, fresh 648 00:32:48,000 --> 00:32:51,360 Speaker 7: email address on something like Proton Mail... you know, I 649 00:32:51,400 --> 00:32:53,360 Speaker 7: love Gmail, I use it, right, but they track the 650 00:32:53,400 --> 00:32:53,960 Speaker 7: hell out of 651 00:32:53,920 --> 00:32:54,920 Speaker 5: you on Gmail. 652 00:32:55,240 --> 00:32:59,560 Speaker 7: So Proton Mail email address, Vanilla gift card. You get 653 00:32:59,800 --> 00:33:02,520 Speaker 7: eighty percent of the protection from tracking just from 654 00:33:02,560 --> 00:33:05,920 Speaker 7: those two measures. So, you know, that's easy and accessible 655 00:33:05,960 --> 00:33:06,560 Speaker 7: to anybody. 656 00:33:07,360 --> 00:33:08,080 Speaker 5: Definitely do that.
657 00:33:09,160 --> 00:33:13,640 Speaker 3: So, yeah, but she doesn't speak to, like, map 658 00:33:13,800 --> 00:33:16,160 Speaker 3: location data. And I think that's an interesting point, that 659 00:33:16,440 --> 00:33:19,800 Speaker 3: when we were having conversations about how to stay secure, 660 00:33:19,840 --> 00:33:23,200 Speaker 3: it was really, like, searching things, paying for things. But 661 00:33:23,600 --> 00:33:25,200 Speaker 3: I guess it stands to reason that in twenty twenty 662 00:33:25,280 --> 00:33:29,080 Speaker 3: three you might be using Google Maps to access just, 663 00:33:29,120 --> 00:33:30,520 Speaker 3: like, where am I going if I need to 664 00:33:30,560 --> 00:33:32,320 Speaker 3: go get an abortion, like, where am I headed? 665 00:33:32,640 --> 00:33:35,719 Speaker 3: And it kind of reminds me of, like, the throwback idea: 666 00:33:35,800 --> 00:33:39,640 Speaker 3: how, before the ubiquity of things like GPS, you had to 667 00:33:39,560 --> 00:33:40,880 Speaker 5: just know where you were going. 668 00:33:41,320 --> 00:33:44,360 Speaker 3: And certainly there are places that I've been multiple 669 00:33:44,440 --> 00:33:47,440 Speaker 3: times in my life; I could get there through muscle memory. 670 00:33:47,720 --> 00:33:49,880 Speaker 3: Now, with GPS, I'm like, oh, how do I get there? 671 00:33:50,120 --> 00:33:53,440 Speaker 3: And yeah, it's just a good reminder of how much 672 00:33:53,960 --> 00:33:58,080 Speaker 3: these platforms like Google have become commonplace in our life, 673 00:33:58,080 --> 00:34:00,600 Speaker 3: and how they've gotten between us and the things that 674 00:34:00,640 --> 00:34:02,040 Speaker 3: we need to do, and 675 00:34:02,400 --> 00:34:04,360 Speaker 5: we don't even necessarily really think 676 00:34:04,280 --> 00:34:07,000 Speaker 1: about it. Right, it's like a part of your 677 00:34:07,040 --> 00:34:10,000 Speaker 1: every day.
It's like, how... I'm not going to get 678 00:34:10,000 --> 00:34:12,560 Speaker 1: out a map. I often remember this time I had 679 00:34:12,560 --> 00:34:16,480 Speaker 1: to, like, print out a MapQuest page to go 680 00:34:16,680 --> 00:34:18,640 Speaker 1: meet a friend, and I could never find her, and 681 00:34:18,680 --> 00:34:20,560 Speaker 1: my dad and I got in a huge fight about it. 682 00:34:20,680 --> 00:34:23,520 Speaker 2: I was like, this is what the MapQuest says! 683 00:34:24,600 --> 00:34:26,319 Speaker 1: But now it's just, like, part of our everyday 684 00:34:26,360 --> 00:34:28,879 Speaker 1: life, and we don't think about it until it goes away, 685 00:34:28,880 --> 00:34:30,640 Speaker 1: and then you realize, like, how much you rely on 686 00:34:30,680 --> 00:34:34,920 Speaker 1: that kind of stuff. And that's true with this, the 687 00:34:34,960 --> 00:34:37,640 Speaker 1: clip that you just played, of, like, you know, email 688 00:34:38,200 --> 00:34:43,160 Speaker 1: and using your credit card and your debit card, and 689 00:34:43,200 --> 00:34:45,680 Speaker 1: it kind of... it bums me out, because it felt 690 00:34:45,719 --> 00:34:48,719 Speaker 1: so much like how I went about downloading music illegally 691 00:34:49,480 --> 00:34:52,640 Speaker 1: for a while. And this is, like, a health procedure. 692 00:34:53,000 --> 00:34:55,960 Speaker 1: Like, yeah, a legal health procedure, and we're treating 693 00:34:55,760 --> 00:34:58,200 Speaker 2: it like I'm on LimeWire in the middle... 694 00:34:58,000 --> 00:35:00,600 Speaker 4: Wow, throw it back. 695 00:35:02,640 --> 00:35:02,839 Speaker 8: Yeah. 696 00:35:02,880 --> 00:35:05,759 Speaker 4: I'm thinking about how people get cease and desists from 697 00:35:05,840 --> 00:35:09,439 Speaker 4: the FBI for downloading movies today, and even maybe a knock 698 00:35:09,480 --> 00:35:12,359 Speaker 4: on the door. They go to that level just for 699 00:35:12,400 --> 00:35:15,799 Speaker 4: a movie.
Don't get me wrong, you know, whatever, capitalism 700 00:35:15,920 --> 00:35:18,719 Speaker 4: it is where it is. But all the stuff that 701 00:35:18,880 --> 00:35:20,960 Speaker 4: I'm thinking about as we're sitting here talking about it, 702 00:35:21,040 --> 00:35:23,000 Speaker 4: I'm like, yeah, I use GPS just to go home, 703 00:35:23,040 --> 00:35:25,359 Speaker 4: and half the time it's because I want to get 704 00:35:25,440 --> 00:35:28,239 Speaker 4: traffic information. I need an ETA and I need to 705 00:35:28,239 --> 00:35:30,240 Speaker 4: know which was the best route and all these things, 706 00:35:30,440 --> 00:35:32,920 Speaker 4: and Google Maps is my way to go obviously for 707 00:35:33,000 --> 00:35:35,080 Speaker 4: all of that. It's not just for the directions, and 708 00:35:35,120 --> 00:35:37,280 Speaker 4: sometimes it's to tell me that something's closed or open 709 00:35:37,520 --> 00:35:40,440 Speaker 4: half the time. And I can't imagine not using it 710 00:35:40,800 --> 00:35:43,759 Speaker 4: at this point because I'm so reliant on that for 711 00:35:43,880 --> 00:35:46,200 Speaker 4: my information, because I don't listen to live broadcasts like 712 00:35:46,600 --> 00:35:48,279 Speaker 4: radio like they used to do. I guess they still 713 00:35:48,320 --> 00:35:50,800 Speaker 4: do traffic and weather and such, but I don't. I 714 00:35:51,520 --> 00:35:54,799 Speaker 4: literally say hey, Google, what's this? Something talks back to 715 00:35:54,840 --> 00:35:58,840 Speaker 4: me, okay, but you know just how quickly that becomes 716 00:35:58,880 --> 00:36:02,800 Speaker 4: the solution.
And then hearing her talking about these processes, 717 00:36:02,800 --> 00:36:06,040 Speaker 4: it reminds me of trying to do like a spy movie, yeah, 718 00:36:06,040 --> 00:36:08,480 Speaker 4: where we have to go get a burner phone with 719 00:36:08,560 --> 00:36:12,000 Speaker 4: a burner credit card and make sure that we take 720 00:36:12,040 --> 00:36:14,720 Speaker 4: out the chip just in case that that gets followed. 721 00:36:14,760 --> 00:36:17,120 Speaker 4: I'm like, the lengths that people are gonna have to 722 00:36:17,160 --> 00:36:19,000 Speaker 4: go to, and most of the people who are having 723 00:36:19,040 --> 00:36:21,960 Speaker 4: to access this type of care, as you said, 724 00:36:22,160 --> 00:36:24,640 Speaker 4: do not know how to do this, probably can't afford 725 00:36:24,800 --> 00:36:26,400 Speaker 4: to do some of these things, may not even have 726 00:36:26,480 --> 00:36:29,600 Speaker 4: access, because we know that Vanilla card, I don't buy 727 00:36:29,600 --> 00:36:31,960 Speaker 4: that because there's a three dollar fee at the very least, 728 00:36:32,120 --> 00:36:33,160 Speaker 4: that pisses me off. 729 00:36:33,719 --> 00:36:36,879 Speaker 3: Yeah, no, you're angry about the wrong thing, right? No, 730 00:36:37,400 --> 00:36:39,520 Speaker 3: it's totally a legitimate thing to be angry about. 731 00:36:39,560 --> 00:36:43,239 Speaker 3: And I think, like you know, I don't want to 732 00:36:43,360 --> 00:36:46,000 Speaker 3: make it sound like 733 00:36:46,080 --> 00:36:48,279 Speaker 3: I am so much more digitally secure than you, like, 734 00:36:48,320 --> 00:36:49,320 Speaker 3: I don't use these things. 735 00:36:49,400 --> 00:36:52,719 Speaker 5: I have a smartphone. I could not find, I cannot. 736 00:36:52,440 --> 00:36:55,600 Speaker 3: Get anywhere without Google Maps. I use these tools regularly.
737 00:36:56,080 --> 00:37:00,360 Speaker 3: I'm not suggesting that people who are not actively doing 738 00:37:00,400 --> 00:37:06,399 Speaker 3: something that is potentially criminalized or illegal should stop using 739 00:37:06,440 --> 00:37:10,799 Speaker 3: Google Maps, should stop using Gmail, should do these things. Right, 740 00:37:10,880 --> 00:37:13,600 Speaker 3: that's not necessarily realistic in twenty twenty three. I don't 741 00:37:13,600 --> 00:37:16,400 Speaker 3: live my life that way, but I do think it 742 00:37:16,480 --> 00:37:20,759 Speaker 3: is worth stepping back and just questioning what we give 743 00:37:20,920 --> 00:37:24,160 Speaker 3: up when we get these conveniences in our life and 744 00:37:24,480 --> 00:37:27,799 Speaker 3: just having an awareness of that, because yeah, I want 745 00:37:27,800 --> 00:37:29,680 Speaker 3: to be able to find the quickest route home, I 746 00:37:29,719 --> 00:37:31,640 Speaker 3: want to be able to avoid traffic. I want to 747 00:37:31,680 --> 00:37:33,759 Speaker 3: know if there's a police officer with a scanner there, 748 00:37:33,880 --> 00:37:36,120 Speaker 3: or a toll or whatever, or a road closure. That 749 00:37:36,280 --> 00:37:38,480 Speaker 3: is convenient. That is a convenient way 750 00:37:38,520 --> 00:37:41,400 Speaker 3: to live modern life in twenty twenty three. But it's 751 00:37:41,480 --> 00:37:44,200 Speaker 3: not just something that I'm being given without a cost. 752 00:37:44,239 --> 00:37:46,640 Speaker 3: And so I think that we should be really aware 753 00:37:46,920 --> 00:37:49,520 Speaker 3: and have a really good sense of what those costs are, 754 00:37:49,560 --> 00:37:51,440 Speaker 3: because it can be easy to think that there is 755 00:37:51,520 --> 00:37:53,840 Speaker 3: no cost, but there's no such thing as a free lunch. 756 00:37:54,040 --> 00:37:55,200 Speaker 5: There is a cost.
757 00:37:55,320 --> 00:37:59,120 Speaker 3: You know, my dad is someone who does not trust GPS, 758 00:37:59,160 --> 00:38:01,480 Speaker 3: so when he gets in the car 759 00:38:01,520 --> 00:38:03,960 Speaker 3: to go on a trip, he's got those old. 760 00:38:03,719 --> 00:38:06,520 Speaker 5: School big road atlases under his. 761 00:38:06,600 --> 00:38:09,480 Speaker 3: Seat still in twenty twenty three, God love him, and like, 762 00:38:10,000 --> 00:38:13,680 Speaker 3: as inconvenient as that is, 763 00:38:13,960 --> 00:38:16,520 Speaker 3: he's gaining something. So it's really 764 00:38:16,760 --> 00:38:20,000 Speaker 3: about all of us making a cost benefit 765 00:38:20,040 --> 00:38:22,640 Speaker 3: analysis of what we are giving up versus what we 766 00:38:22,680 --> 00:38:25,000 Speaker 3: are getting and not thinking that like, oh, we're just 767 00:38:25,040 --> 00:38:28,360 Speaker 3: getting this cool new way to have a modern convenience 768 00:38:28,360 --> 00:38:30,520 Speaker 3: in our life that's just coming at no cost for us. 769 00:38:30,640 --> 00:38:32,759 Speaker 5: There is a cost and we should be aware of it. 770 00:38:33,440 --> 00:38:35,920 Speaker 4: And that's the thing is, like the conversation is, should 771 00:38:35,920 --> 00:38:39,160 Speaker 4: there be a cost? Why can't we have access to 772 00:38:39,719 --> 00:38:43,319 Speaker 4: these amazing things?
Yes, the privacy thing maybe. But even 773 00:38:43,360 --> 00:38:44,960 Speaker 4: on top of that, the fact that there is a 774 00:38:45,120 --> 00:38:47,839 Speaker 4: chance of being persecuted and prosecuted for a thing that 775 00:38:47,880 --> 00:38:50,560 Speaker 4: shouldn't even be a law, you know, to begin with, 776 00:38:50,600 --> 00:38:53,160 Speaker 4: and having conversations like people who are going after people 777 00:38:53,360 --> 00:38:55,440 Speaker 4: and who they're going after, and we know that these 778 00:38:55,640 --> 00:38:59,280 Speaker 4: laws typically go after the marginalized people as well as 779 00:38:59,400 --> 00:39:01,879 Speaker 4: those in the lower socioeconomic status. And 780 00:39:01,920 --> 00:39:05,359 Speaker 4: that really is the burden of it all, 781 00:39:05,480 --> 00:39:08,759 Speaker 4: is that what this is doing is waging a war 782 00:39:08,920 --> 00:39:12,279 Speaker 4: and going after and persecuting people who in themselves are 783 00:39:12,320 --> 00:39:14,919 Speaker 4: already down and out, I guess is the best 784 00:39:14,920 --> 00:39:18,440 Speaker 4: way to put it, or already put at the bottom 785 00:39:18,520 --> 00:39:20,799 Speaker 4: end of the pole, like they're not having the opportunities 786 00:39:20,880 --> 00:39:23,040 Speaker 4: or the ability to even defend themselves. Some of the 787 00:39:23,080 --> 00:39:25,600 Speaker 4: cases that have come forward as abortion cases are miscarriages 788 00:39:25,760 --> 00:39:28,839 Speaker 4: that have been taken out of context to say they 789 00:39:28,880 --> 00:39:30,279 Speaker 4: tried to get an abortion, and a lot of them 790 00:39:30,320 --> 00:39:33,919 Speaker 4: are refugees or have immigrant status. I've seen 791 00:39:34,000 --> 00:39:36,080 Speaker 4: so many stories of that, and they have no way 792 00:39:36,120 --> 00:39:40,680 Speaker 4: of defending themselves because they don't get legal help.
Typically 793 00:39:41,080 --> 00:39:43,200 Speaker 4: not the good legal help, we'll say it that way, or 794 00:39:43,239 --> 00:39:45,839 Speaker 4: even the hearing. And when we talk about people who 795 00:39:45,880 --> 00:39:50,759 Speaker 4: are being held for immigration violations, like we've seen so 796 00:39:50,840 --> 00:39:53,759 Speaker 4: many conversations, well not enough because it doesn't get publicized, 797 00:39:54,040 --> 00:39:57,279 Speaker 4: where convictions and things of that nature are coming out 798 00:39:57,440 --> 00:40:01,120 Speaker 4: from that, or going after people who cannot get good representation. 799 00:40:01,480 --> 00:40:04,640 Speaker 4: And again, these laws, even though it does affect everyone 800 00:40:04,640 --> 00:40:06,640 Speaker 4: and we should all be talking about it, we 801 00:40:06,760 --> 00:40:09,479 Speaker 4: know that who are truly affected are those who can't 802 00:40:09,480 --> 00:40:12,839 Speaker 4: represent themselves and advocate for themselves. And that's the whole conversation, 803 00:40:13,000 --> 00:40:15,480 Speaker 4: is what is happening, who is doing this? 804 00:40:15,560 --> 00:40:17,600 Speaker 4: And why can't we be safe from this? Why can't 805 00:40:17,640 --> 00:40:20,720 Speaker 4: we? Like, this shouldn't be one of the cost benefits. 806 00:40:20,760 --> 00:40:22,520 Speaker 4: It shouldn't be one or the other. It should be 807 00:40:22,640 --> 00:40:24,640 Speaker 4: that we have the rights to do this and feel 808 00:40:24,680 --> 00:40:25,680 Speaker 4: safe at the same time. 809 00:40:26,040 --> 00:40:28,799 Speaker 3: I mean, like you put it so well, privacy is 810 00:40:28,800 --> 00:40:31,920 Speaker 3: a right. Digital privacy is a right, especially when it 811 00:40:31,920 --> 00:40:34,200 Speaker 3: comes to things that are sensitive like our health information 812 00:40:34,320 --> 00:40:37,880 Speaker 3: or health data, visiting places like shelters or clinics.
We 813 00:40:37,920 --> 00:40:42,920 Speaker 3: should have an expectation of privacy around those issues. If 814 00:40:42,960 --> 00:40:46,120 Speaker 3: you are someone who is already burdened by being a 815 00:40:46,160 --> 00:40:49,000 Speaker 3: marginalized person in society, if you are already burdened by, 816 00:40:49,360 --> 00:40:53,040 Speaker 3: you know, maybe not having stable housing, or 817 00:40:53,080 --> 00:40:55,759 Speaker 3: having difficulties in your life on top of that, you 818 00:40:55,760 --> 00:40:58,719 Speaker 3: shouldn't have to know what a Tor browser is just 819 00:40:58,760 --> 00:41:01,480 Speaker 3: to make safe decisions for yourself because of the whims of. 820 00:41:01,480 --> 00:41:02,800 Speaker 5: A company like Google. 821 00:41:02,880 --> 00:41:06,680 Speaker 3: But right now, that's the landscape that Google is creating, 822 00:41:06,719 --> 00:41:10,319 Speaker 3: and they have the power and the ability to change that. 823 00:41:10,400 --> 00:41:12,719 Speaker 3: They have said they are going to not do that, 824 00:41:13,080 --> 00:41:15,479 Speaker 3: and yet they just aren't. And so I completely agree 825 00:41:15,520 --> 00:41:18,000 Speaker 3: with you. I think that Google needs to decide if 826 00:41:18,040 --> 00:41:21,440 Speaker 3: they are going to be in the business of creating. 827 00:41:20,960 --> 00:41:24,120 Speaker 5: A world where privacy is a right. 828 00:41:24,000 --> 00:41:26,799 Speaker 3: And people have the expectation of privacy or not, and 829 00:41:26,840 --> 00:41:29,600 Speaker 3: if they're not going to, don't say otherwise.
I think 830 00:41:29,600 --> 00:41:31,160 Speaker 3: that's kind of one of the reasons why I get 831 00:41:31,160 --> 00:41:34,240 Speaker 3: so angry about this, is like they're able to enjoy 832 00:41:34,360 --> 00:41:39,400 Speaker 3: this public perception of proactively trying to keep people a 833 00:41:39,400 --> 00:41:42,400 Speaker 3: little bit safer while not doing that. So why did 834 00:41:42,440 --> 00:41:43,920 Speaker 3: you say it if you weren't going to do it? 835 00:41:44,000 --> 00:41:44,160 Speaker 3: You know. 836 00:41:45,200 --> 00:41:47,399 Speaker 1: Yeah, and they've been making all this money. I can 837 00:41:47,400 --> 00:41:49,439 Speaker 1: only assume they just really make a lot of money 838 00:41:49,440 --> 00:41:50,080 Speaker 1: off our data. 839 00:41:50,160 --> 00:41:53,480 Speaker 3: So they're like, oh, like, I will talk all day. 840 00:41:53,719 --> 00:41:58,200 Speaker 3: Let's just say your assumption is correct. Your assumption is very, 841 00:41:58,360 --> 00:42:02,279 Speaker 3: very correct. And yeah, I think that, like, we need 842 00:42:02,320 --> 00:42:06,680 Speaker 3: to fundamentally change our understanding when it comes to the 843 00:42:06,719 --> 00:42:10,040 Speaker 3: relationship between users, our data and platforms. Like the fact 844 00:42:10,040 --> 00:42:13,759 Speaker 3: that Google is so extractive, they take so much from us, 845 00:42:14,040 --> 00:42:16,360 Speaker 3: they can lie and misrepresent what they take from us 846 00:42:16,400 --> 00:42:18,920 Speaker 3: and how they take it, and they can make billions 847 00:42:18,960 --> 00:42:21,560 Speaker 3: of dollars off of it. Something is wrong with that equation. 848 00:42:21,600 --> 00:42:24,319 Speaker 3: We need to fundamentally rethink the relationship that users have 849 00:42:24,400 --> 00:42:25,040 Speaker 3: with platforms.
850 00:42:25,239 --> 00:42:44,520 Speaker 8: Yes, and there have been some attempts recently to kind 851 00:42:44,520 --> 00:42:45,400 Speaker 8: of change the situation. 852 00:42:45,840 --> 00:42:46,319 Speaker 5: That's right. 853 00:42:46,440 --> 00:42:49,080 Speaker 3: So just last week, nearly a dozen Senate Democrats 854 00:42:49,120 --> 00:42:51,920 Speaker 3: wrote to Google with questions about how it deletes users' 855 00:42:51,920 --> 00:42:54,880 Speaker 3: location history when they have visited these sensitive locations like 856 00:42:54,920 --> 00:42:58,359 Speaker 3: abortion clinics, expressing concerns that the company may not be 857 00:42:58,440 --> 00:43:02,920 Speaker 3: consistently deleting the data as promised. These included Senators Amy Klobuchar, 858 00:43:03,000 --> 00:43:06,120 Speaker 3: Elizabeth Warren, and Mazie Hirono asking for answers from Google 859 00:43:06,160 --> 00:43:08,560 Speaker 3: about the types of locations they consider to be sensitive 860 00:43:08,760 --> 00:43:11,759 Speaker 3: and how long it takes for the company to automatically 861 00:43:11,800 --> 00:43:14,680 Speaker 3: delete visit history. Again, Google said it was going 862 00:43:14,719 --> 00:43:18,720 Speaker 3: to be quote soon, but Accountable Tech found that it hadn't 863 00:43:18,719 --> 00:43:21,920 Speaker 3: happened in thirty days after that visit. So, you know, 864 00:43:22,239 --> 00:43:24,560 Speaker 3: I think these senators are right to at least get 865 00:43:24,600 --> 00:43:27,279 Speaker 3: some clarity about, well, what do you consider soon? And 866 00:43:27,520 --> 00:43:29,719 Speaker 3: just a few days ago, Google was sued by an 867 00:43:29,719 --> 00:43:34,400 Speaker 3: anonymous complainant claiming that Google unlawfully collects health data, including 868 00:43:34,400 --> 00:43:37,439 Speaker 3: abortion searches, on third party websites that use Google tech.
869 00:43:38,000 --> 00:43:41,960 Speaker 3: Jane Doe, who is the complainant, and her legal representation are 870 00:43:42,000 --> 00:43:44,480 Speaker 3: looking to get the case certified as a class action 871 00:43:44,640 --> 00:43:48,160 Speaker 3: suit, and claims that her private information was intercepted by 872 00:43:48,160 --> 00:43:51,320 Speaker 3: Google when she used the scheduling pages on Planned Parenthood's 873 00:43:51,320 --> 00:43:54,120 Speaker 3: website in twenty eighteen to search for an abortion provider. 874 00:43:54,200 --> 00:43:56,880 Speaker 3: So I think it's interesting that she is trying to 875 00:43:56,920 --> 00:44:00,879 Speaker 3: pursue this as class action litigation because I do 876 00:44:00,960 --> 00:44:05,720 Speaker 3: think there is a class of people, like a large 877 00:44:05,719 --> 00:44:08,560 Speaker 3: group of people who are facing harm because of Google, 878 00:44:08,560 --> 00:44:11,160 Speaker 3: and it's not individuals, it is all of us. We 879 00:44:11,200 --> 00:44:13,879 Speaker 3: are a collective harmed group. And so I really am 880 00:44:13,920 --> 00:44:15,839 Speaker 3: interested in the fact that this isn't just. 881 00:44:15,800 --> 00:44:16,840 Speaker 5: One person suing Google. 882 00:44:16,920 --> 00:44:18,799 Speaker 3: She is trying to do it as a class action 883 00:44:18,920 --> 00:44:22,319 Speaker 3: lawsuit because it is representative, in my opinion, of a 884 00:44:22,320 --> 00:44:23,200 Speaker 3: collective harm. 885 00:44:23,440 --> 00:44:25,480 Speaker 4: That's interesting because we know that there's some class action 886 00:44:25,520 --> 00:44:28,279 Speaker 4: suits happening with Facebook, and it's a pretty big deal 887 00:44:28,360 --> 00:44:32,120 Speaker 4: because there's a couple with billions of dollars in hand.
888 00:44:32,719 --> 00:44:36,319 Speaker 4: We know that the EU has gone after a couple 889 00:44:36,360 --> 00:44:39,360 Speaker 4: of companies as well, I believe it was Twitter, for billions 890 00:44:39,400 --> 00:44:41,440 Speaker 4: of dollars in fines possibly. But that's what's going to 891 00:44:41,600 --> 00:44:44,680 Speaker 4: change anything. We know that it's not actually about humanity, 892 00:44:44,880 --> 00:44:46,719 Speaker 4: it's about the money, and if it's going to cost 893 00:44:46,719 --> 00:44:49,440 Speaker 4: them money, then they're more likely to do something as 894 00:44:49,480 --> 00:44:51,800 Speaker 4: long as they don't have to pay out whatever it 895 00:44:51,840 --> 00:44:54,560 Speaker 4: may be, and just ten dollars, a few pennies, to 896 00:44:54,800 --> 00:44:57,680 Speaker 4: millions of people. That's a lot of money, so that's 897 00:44:57,719 --> 00:44:58,839 Speaker 4: actually a smart way to go. 898 00:44:59,239 --> 00:45:01,000 Speaker 5: I think Yeah, to just do a quick plug. 899 00:45:01,600 --> 00:45:04,760 Speaker 3: If you are a US resident who used Facebook between 900 00:45:04,800 --> 00:45:07,640 Speaker 3: May twenty fourth, two thousand and seven and December twenty second, 901 00:45:07,719 --> 00:45:10,719 Speaker 3: twenty twenty two, you can file a monetary claim as 902 00:45:10,719 --> 00:45:13,080 Speaker 3: long as you do so before August twenty fifth, twenty 903 00:45:13,120 --> 00:45:17,279 Speaker 3: twenty three. And so it is, you know, for a 904 00:45:17,320 --> 00:45:21,319 Speaker 3: company like Facebook, it probably won't be more than 905 00:45:21,320 --> 00:45:22,759 Speaker 3: like a slap on the wrist for them. It probably 906 00:45:22,800 --> 00:45:26,439 Speaker 3: won't be something that they really like feel.
But if 907 00:45:26,480 --> 00:45:29,600 Speaker 3: you are in the group that I just mentioned, I 908 00:45:29,680 --> 00:45:33,960 Speaker 3: absolutely am in that group, and I'm filing for my claim. 909 00:45:34,440 --> 00:45:36,279 Speaker 3: I think that anybody listening who was in that group 910 00:45:36,320 --> 00:45:38,439 Speaker 3: should file for their claim because that is the only 911 00:45:38,480 --> 00:45:41,040 Speaker 3: way that these companies have. That's the only kind of, 912 00:45:41,080 --> 00:45:42,560 Speaker 3: I mean, it makes me sad to say, but like, 913 00:45:42,840 --> 00:45:44,840 Speaker 3: we don't have a lot of recourse for companies like 914 00:45:45,160 --> 00:45:47,840 Speaker 3: Meta and Google. There's not much that the average 915 00:45:47,920 --> 00:45:51,040 Speaker 3: user can do that will actually make them feel this. 916 00:45:51,160 --> 00:45:53,120 Speaker 3: And one way that we have is hitting them in 917 00:45:53,120 --> 00:45:55,759 Speaker 3: their pocketbook. And so, yeah, get your settlement, even if 918 00:45:55,800 --> 00:45:57,920 Speaker 3: it's ten dollars. Get your ten dollars and buy yourself 919 00:45:57,920 --> 00:46:00,000 Speaker 3: a coffee, whatever. 920 00:46:00,120 --> 00:46:02,680 Speaker 4: Knowing that it's building up, we're just building up 921 00:46:02,719 --> 00:46:05,560 Speaker 4: the cost. And that's the thing is, like we see 922 00:46:05,640 --> 00:46:07,799 Speaker 4: in so many things, we've talked about civil suits when 923 00:46:07,840 --> 00:46:11,000 Speaker 4: it comes to like rape cases and sexual assault cases 924 00:46:11,160 --> 00:46:13,600 Speaker 4: and why they're important, even though so many will be like, oh, 925 00:46:13,600 --> 00:46:15,239 Speaker 4: you're just trying to go after the money.
No, this 926 00:46:15,680 --> 00:46:17,400 Speaker 4: is a big deal because we know this is a 927 00:46:17,400 --> 00:46:20,920 Speaker 4: form of punishment that is more likely to happen rather 928 00:46:21,040 --> 00:46:25,319 Speaker 4: than the guilty not guilty judicial level of punishment. We 929 00:46:25,400 --> 00:46:28,640 Speaker 4: know this. So having things like this is really important 930 00:46:28,840 --> 00:46:31,840 Speaker 4: because it does make a stand, whether it's big or small. 931 00:46:32,719 --> 00:46:36,080 Speaker 4: Same thing with the Dominion lawsuit that happened with Fox. 932 00:46:36,160 --> 00:46:37,799 Speaker 4: It was a big deal, like people were mad that 933 00:46:37,840 --> 00:46:41,000 Speaker 4: they settled, but that cost really did hit them in 934 00:46:41,040 --> 00:46:43,719 Speaker 4: the end. And these types of things, this conversation, 935 00:46:43,840 --> 00:46:45,920 Speaker 4: it's going to be a precedent of what can 936 00:46:46,000 --> 00:46:49,120 Speaker 4: be filed later on because obviously things are building up. 937 00:46:49,160 --> 00:46:52,600 Speaker 4: As long as Google holds out and doesn't actually delete, 938 00:46:52,840 --> 00:46:54,879 Speaker 4: they're going to get more cases. If this goes forward 939 00:46:54,960 --> 00:46:57,520 Speaker 4: and wins, they're going to get more piled on. And 940 00:46:57,800 --> 00:47:00,480 Speaker 4: that's a good thing. A sad thing that it has 941 00:47:00,520 --> 00:47:02,239 Speaker 4: to get to that point, but it is a good thing. 942 00:47:02,320 --> 00:47:05,239 Speaker 4: So it's something that we should definitely watch. 943 00:47:05,320 --> 00:47:08,400 Speaker 3: And you know, one question that I have is how 944 00:47:08,480 --> 00:47:11,439 Speaker 3: is it legal for Google to say one thing about 945 00:47:11,480 --> 00:47:13,280 Speaker 3: their privacy policies and do another.
946 00:47:13,719 --> 00:47:14,960 Speaker 5: Again, I'm no lawyer. 947 00:47:16,120 --> 00:47:18,200 Speaker 3: I don't know how it is legal, though. Like I 948 00:47:18,280 --> 00:47:21,520 Speaker 3: fully do not understand how Google can mislead the public 949 00:47:21,520 --> 00:47:24,719 Speaker 3: about data privacy practices, how that is legal under the 950 00:47:24,760 --> 00:47:26,880 Speaker 3: Federal Trade Commission. I'm no expert. 951 00:47:27,400 --> 00:47:28,239 Speaker 5: Don't understand it. 952 00:47:28,280 --> 00:47:31,719 Speaker 3: Maybe it's not legal, but whether or not it's illegal, 953 00:47:31,960 --> 00:47:34,680 Speaker 3: it is certainly unethical. It is unethical to say one 954 00:47:34,680 --> 00:47:36,799 Speaker 3: thing and do another, and it is certainly unsafe. And 955 00:47:36,800 --> 00:47:39,320 Speaker 3: the bottom line is that people deserve privacy and it 956 00:47:39,360 --> 00:47:41,839 Speaker 3: shouldn't be up to the whims of the people who 957 00:47:41,920 --> 00:47:44,600 Speaker 3: run Google to decide whether or not we get it 958 00:47:44,719 --> 00:47:45,160 Speaker 3: or how. 959 00:47:45,080 --> 00:47:45,560 Speaker 5: We get it. 960 00:47:45,760 --> 00:47:50,680 Speaker 3: We really need to rethink this schema where 961 00:47:51,280 --> 00:47:53,759 Speaker 3: people like tech leaders at Google are the ones 962 00:47:53,760 --> 00:47:57,040 Speaker 3: who decide whether or not we get privacy. People deserve privacy, 963 00:47:57,440 --> 00:48:00,440 Speaker 3: especially when it comes to their health information, full stop, period, 964 00:48:00,520 --> 00:48:01,200 Speaker 3: end of sentence.
965 00:48:01,760 --> 00:48:04,799 Speaker 4: Right. And I wonder how much the government has done, 966 00:48:04,800 --> 00:48:07,560 Speaker 4: because they've really taken away a lot of any kind 967 00:48:07,560 --> 00:48:11,560 Speaker 4: of power against these companies, because we've seen the 968 00:48:11,640 --> 00:48:15,759 Speaker 4: amount of updated agreements in order to use anything. Like 969 00:48:15,840 --> 00:48:18,360 Speaker 4: I think I got it for TikTok. Well, TikTok, 970 00:48:18,400 --> 00:48:20,640 Speaker 4: we know they're under fire, we had that episode, but they 971 00:48:20,680 --> 00:48:23,560 Speaker 4: had a new agreement. I know Samsung continues to do 972 00:48:23,560 --> 00:48:26,960 Speaker 4: it every few months, and the amount of privacy is 973 00:48:27,000 --> 00:48:29,880 Speaker 4: going away. Like it's interesting how quickly they're changing it, 974 00:48:29,880 --> 00:48:31,760 Speaker 4: and there's nothing you can do because you're already sucked 975 00:48:31,760 --> 00:48:34,200 Speaker 4: into that system, and that's what Google is doing. 976 00:48:34,280 --> 00:48:37,120 Speaker 4: And because of the way that the government has allowed 977 00:48:37,160 --> 00:48:40,279 Speaker 4: for these companies to be territorial and to continue to 978 00:48:40,320 --> 00:48:44,040 Speaker 4: gather that data. I'm not a conspiracist, but I think 979 00:48:44,120 --> 00:48:46,879 Speaker 4: there's a bit of conspiracy behind this about how much 980 00:48:46,920 --> 00:48:49,960 Speaker 4: control they want to have over individual citizens, and 981 00:48:50,000 --> 00:48:51,160 Speaker 4: that's concerning. 982 00:48:51,280 --> 00:48:54,279 Speaker 3: Yeah, and I'm with you, you don't 983 00:48:54,320 --> 00:48:57,279 Speaker 3: sound like a conspiracy theorist at all.
And I think 984 00:48:57,280 --> 00:48:59,960 Speaker 3: the first thing that we can do is really understand 985 00:49:00,000 --> 00:49:03,360 Speaker 3: what's being asked of us. And it's not easy, 986 00:49:03,400 --> 00:49:05,040 Speaker 3: because they don't make it easy. 987 00:49:05,120 --> 00:49:06,080 Speaker 5: Like when you. 988 00:49:06,000 --> 00:49:09,400 Speaker 3: Get that twenty five page thing that you have to 989 00:49:09,440 --> 00:49:12,280 Speaker 3: scroll down just to like log onto your phone. 990 00:49:12,800 --> 00:49:14,399 Speaker 5: Come on, be real. Like, who is reading that? 991 00:49:14,480 --> 00:49:16,600 Speaker 3: Like, I don't fault anybody for being like, oh, I'll 992 00:49:16,600 --> 00:49:20,000 Speaker 3: just hit agree. But it shouldn't be that way. You 993 00:49:20,000 --> 00:49:23,680 Speaker 3: shouldn't have to be a trained lawyer or trained in 994 00:49:23,840 --> 00:49:27,840 Speaker 3: understanding tech speak and have to parse a thirty page 995 00:49:28,160 --> 00:49:31,320 Speaker 3: privacy agreement that you have to click through just to 996 00:49:31,360 --> 00:49:35,080 Speaker 3: get to your email, just to exist safely online with 997 00:49:35,200 --> 00:49:38,919 Speaker 3: privacy and dignity, like. 998 00:49:37,960 --> 00:49:38,839 Speaker 5: People deserve that. 999 00:49:39,120 --> 00:49:41,680 Speaker 3: We've accepted and tolerated and created 1000 00:49:41,680 --> 00:49:45,560 Speaker 3: a system that is so burdensome to the average citizen, 1001 00:49:45,800 --> 00:49:46,520 Speaker 3: and it should not be 1002 00:49:46,600 --> 00:49:48,640 Speaker 2: that way. Right, right. And.
1003 00:49:50,320 --> 00:49:53,560 Speaker 1: There is so much going on right now in terms 1004 00:49:53,640 --> 00:49:59,600 Speaker 1: of technology and tumultuous times and technology and understanding those 1005 00:49:59,680 --> 00:50:04,520 Speaker 1: kinds of things, and you, Bridget, are amazing at explaining 1006 00:50:04,560 --> 00:50:06,760 Speaker 1: those kinds of things. So we always love having you here. 1007 00:50:07,760 --> 00:50:12,640 Speaker 1: But you're also breaking down these issues, not 1008 00:50:12,719 --> 00:50:15,880 Speaker 1: just with us, on your own show. 1009 00:50:17,760 --> 00:50:22,279 Speaker 3: Thank you for that little introduction. So yeah, I don't 1010 00:50:22,280 --> 00:50:24,560 Speaker 3: know if other folks feel it, but I think in 1011 00:50:24,600 --> 00:50:27,920 Speaker 3: this particular moment in time, you know, we always talk 1012 00:50:27,960 --> 00:50:30,160 Speaker 3: about this sort of like the future of tech and 1013 00:50:30,200 --> 00:50:33,720 Speaker 3: what's next. It feels like we are in that moment today, 1014 00:50:33,840 --> 00:50:36,239 Speaker 3: right? When it comes to platforms, with the state of 1015 00:50:36,239 --> 00:50:38,799 Speaker 3: Twitter and like what new platforms are going to, you know, 1016 00:50:38,840 --> 00:50:40,200 Speaker 3: pop up, and where we're all going to spend 1017 00:50:40,200 --> 00:50:42,560 Speaker 3: our time digitally, when it comes to conversations about the 1018 00:50:42,640 --> 00:50:45,680 Speaker 3: rise of AI, and when it comes to the increasing 1019 00:50:45,880 --> 00:50:49,160 Speaker 3: threat of, you know, expanding tech surveillance, like we talked 1020 00:50:49,200 --> 00:50:51,920 Speaker 3: about today, it feels like a very weird time for 1021 00:50:52,000 --> 00:50:55,400 Speaker 3: technology and the Internet and where we're going in the future.
1022 00:50:55,480 --> 00:50:58,160 Speaker 3: And it feels like all that future conversation is actually 1023 00:50:58,360 --> 00:51:01,520 Speaker 3: here today, and so we have a new 1024 00:51:01,560 --> 00:51:03,880 Speaker 3: season on There Are No Girls on the Internet exploring 1025 00:51:04,120 --> 00:51:06,880 Speaker 3: how increasingly it feels like this future of technology is 1026 00:51:06,920 --> 00:51:11,120 Speaker 3: happening now, precisely because it is so important that the 1027 00:51:11,200 --> 00:51:14,280 Speaker 3: voices of people who are traditionally left out of these conversations, women, 1028 00:51:14,360 --> 00:51:17,319 Speaker 3: people of color, queer folks, disabled folks, working folks, are 1029 00:51:17,320 --> 00:51:18,880 Speaker 3: not left out of those conversations. 1030 00:51:18,920 --> 00:51:19,879 Speaker 5: And so if you want to. 1031 00:51:20,560 --> 00:51:23,239 Speaker 3: Parse what it all means, where we've been and where 1032 00:51:23,239 --> 00:51:25,120 Speaker 3: we're going, and what we need to know to make 1033 00:51:25,120 --> 00:51:28,120 Speaker 3: sure that our voices are centered in these conversations, please 1034 00:51:28,200 --> 00:51:29,640 Speaker 3: check out the new season of my pod, 1035 00:51:29,800 --> 00:51:31,359 Speaker 5: There Are No Girls on the Internet, where we are 1036 00:51:31,360 --> 00:51:32,200 Speaker 5: doing just that. 1037 00:51:32,960 --> 00:51:36,760 Speaker 1: Yeah, definitely, listeners go check it out. It does feel 1038 00:51:36,760 --> 00:51:39,160 Speaker 1: like we're in the, is it Moore's curve? It feels 1039 00:51:39,200 --> 00:51:41,839 Speaker 1: like we're in the like real upward. 1040 00:51:41,520 --> 00:51:45,160 Speaker 5: For yeah, kind of siry? Good? Like what is it? 1041 00:51:45,239 --> 00:51:45,319 Speaker 4: Like? 1042 00:51:45,640 --> 00:51:49,520 Speaker 5: Good? Gee? Is it geometry? What is my curves and stuff? Good?
1043 00:51:49,640 --> 00:51:54,320 Speaker 5: Math reference? And wait a minute, wait, math? 1044 00:51:55,120 --> 00:51:56,799 Speaker 1: You know, that might not even be the correct term, 1045 00:51:56,800 --> 00:51:58,200 Speaker 1: but I think it is. And if it's not, then 1046 00:51:58,239 --> 00:51:58,960 Speaker 1: that's really funny. 1047 00:51:58,960 --> 00:52:03,320 Speaker 4: So whatever. Yeah, curve. 1048 00:52:05,800 --> 00:52:08,799 Speaker 1: Yeah, so we always, we love having you, we've missed you, 1049 00:52:09,719 --> 00:52:11,359 Speaker 1: and we always take up so much of your time, 1050 00:52:11,400 --> 00:52:12,359 Speaker 1: so I appreciate it. 1051 00:52:12,840 --> 00:52:14,879 Speaker 2: We could just talk to you forever about all kinds 1052 00:52:14,880 --> 00:52:15,160 Speaker 2: of things. 1053 00:52:15,239 --> 00:52:17,640 Speaker 3: Oh my god, the pleasure, the pleasure is so mine. 1054 00:52:17,719 --> 00:52:20,120 Speaker 3: This is like, yeah, I could talk to you guys, 1055 00:52:20,320 --> 00:52:22,239 Speaker 3: you all, all day, because it's just so nice to 1056 00:52:22,239 --> 00:52:24,200 Speaker 3: connect on these issues, and it's nice to be 1057 00:52:24,239 --> 00:52:25,719 Speaker 3: able to talk about them with you two. 1058 00:52:26,000 --> 00:52:29,799 Speaker 2: Yes, yes, it really is. Well, where can the good 1059 00:52:29,800 --> 00:52:31,120 Speaker 2: listeners find you? 1060 00:52:31,160 --> 00:52:32,920 Speaker 3: Well, as I said, you can check out my podcast, There 1061 00:52:32,920 --> 00:52:36,480 Speaker 3: Are No Girls on the Internet. Wherever you podcast, we are there. 1062 00:52:36,680 --> 00:52:38,720 Speaker 5: You can follow me. I'm still on Twitter.
1063 00:52:38,880 --> 00:52:42,239 Speaker 3: Kind of. At Bridget Marie. You can follow me on TikTok 1064 00:52:42,320 --> 00:52:45,280 Speaker 3: at Bridget Makes Pods, you can follow me on Instagram 1065 00:52:45,280 --> 00:52:47,000 Speaker 3: at Bridget Marie in DC, and I would love to 1066 00:52:47,040 --> 00:52:48,400 Speaker 3: have you on any of those platforms. 1067 00:52:48,600 --> 00:52:51,480 Speaker 2: Yes, and also Beef. Oh yeah, Beef. 1068 00:52:51,480 --> 00:52:52,760 Speaker 5: Don't, don't sleep on Beef. 1069 00:52:53,000 --> 00:52:55,040 Speaker 3: It's a little bit different than my usual content, because 1070 00:52:55,040 --> 00:52:57,640 Speaker 3: it's just like, so much of my content is like 1071 00:52:57,719 --> 00:53:00,080 Speaker 3: heavier and based around the Internet, and like, just like, 1072 00:53:00,080 --> 00:53:01,640 Speaker 3: here's what you need to know. If you're trying to 1073 00:53:01,640 --> 00:53:03,440 Speaker 3: have a good time and just nerd out on some 1074 00:53:03,719 --> 00:53:08,040 Speaker 3: historical rivalries, definitely check out Beef, it'll be super fun. 1075 00:53:08,280 --> 00:53:13,520 Speaker 1: See what's going on in the advice call of World exactly. Yes, yes, well, 1076 00:53:13,560 --> 00:53:15,160 Speaker 1: thank you, thank you, thank you so much again for 1077 00:53:15,200 --> 00:53:18,319 Speaker 1: being here. Listeners, if you would like to contact us, 1078 00:53:18,360 --> 00:53:20,840 Speaker 1: you can. Our email is Stephani your mom stuff at 1079 00:53:20,840 --> 00:53:22,879 Speaker 1: iHeartMedia dot com. You can find us on Twitter 1080 00:53:22,880 --> 00:53:25,360 Speaker 1: at mom Stuff Podcast, or on Instagram and TikTok at Stuff I 1081 00:53:25,400 --> 00:53:26,680 Speaker 1: Never Told You. We're also on YouTube. 1082 00:53:27,600 --> 00:53:28,879 Speaker 2: We do have a book coming out.
1083 00:53:28,960 --> 00:53:30,799 Speaker 1: You can pre-order it at Stuff You Should Read 1084 00:53:30,800 --> 00:53:35,000 Speaker 1: Books dot com. Thank you as always to our super producer Christina, 1085 00:53:35,000 --> 00:53:37,480 Speaker 1: our executive producer Maya, and our contributor Joey. 1086 00:53:37,640 --> 00:53:39,920 Speaker 2: Thank you, and thanks to you for listening. Stuff I 1087 00:53:39,920 --> 00:53:41,239 Speaker 2: Never Told You is a production of iHeartRadio. 1088 00:53:41,280 --> 00:53:42,839 Speaker 1: For more podcasts from iHeartRadio, you can check 1089 00:53:42,840 --> 00:53:44,800 Speaker 1: out the iHeartRadio app, Apple Podcasts, or wherever you listen to 1090 00:53:44,840 --> 00:53:45,759 Speaker 1: your favorite shows.