Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. And today we are thrilled once again to be joined by the delightful, the winsome, the dessert expert, maybe? We were talking off mic about our favorite and least favorite desserts.

Yeah, very strongly about meringue.

Oh my goodness. Some of us feel very, very, very strongly about their dislike of meringue. Not pointing to myself or anything.

I love it. I love these strong dessert opinions. I still maintain we should have, like, a mini podcast where we just talk about desserts. I figure the three of us could each bring desserts and then somehow try to share each other's favorite desserts with each other. That would be the way to go. Somehow we, like, post it? Something, I don't know.

Okay. I would make that happen. I think we can make that happen. The funniest thing is I'm not a huge dessert person, so I'd be the one true co-host of a food podcast.

How are you not into desserts?

They're fine! They're just, like, not the thing that I go for a lot.

Okay, a savory thing, right. Also, you'll try almost anything, even if you're allergic to it. So therefore I think she may be, like, the expert, because she's willing to do all the things just to try, and she will never let anything go to waste. So I feel like she's the better one. Yeah, even with mint.

I don't like... do you not like mint? I hate... I don't like mint in desserts.

I'm learning a lot about your taste.

No, I love it. I'm just allergic to it. I'm intolerant to it.

Yeah, what do you do for toothpaste?

I suffer. It's miserable.

There are tooth... I could solve this problem. This is an ongoing issue with me. I could solve this, and I continue to not do it, because she refuses to buy something new.

I'm going to send you some cinnamon-flavored toothpaste. That's what I... I don't like mint, but I love cinnamon.

See, this is gonna be easy. Easy fix.

Such an easy fix. It's such a solvable problem.
Well, you are back in Washington, D.C., Bridget, and happy belated birthday! Your birthday is Pi Day, which, I feel like as a geeky, nerdy person, of course it is: three fourteen. Did you have any pie? Did you do anything fun?

I didn't really do anything fun. I had just gotten back from my trip to Mexico City, and so I was in that weird time where you're like, I've just been out of the country for a long time... this is not my beautiful house, this is not my beautiful life. Like, I was really having a weird re-entry back to my regular life. So my birthday was spent just, like, on the couch watching television, which actually was good. Was fine.

Yeah, yeah, lower key. Yes.

Well, Samantha and I are very, very grateful you're here to talk about this today, because we were talking off mic before you came on: we don't really know what's going on here, and it's confusing. But it's in the news, it's everywhere, everyone's talking about it.

It's true. So I'll do my best. I should say right up front, there are parts of this conversation that I'm not the expert on, but I will do my best to break it down. And that is the congressional hearings and conversations around banning TikTok, which I'm sure folks have been hearing about, even if you're not a TikTok user. You've probably at least cursorily seen, like, Congresspeople asking goofy questions to TikTok's CEO. So, yeah, I wanted to break it down to the best of my ability.

Thank goodness, yes, because it is a lot.

It is a lot. It is confusing. And I was trying to think about this. It feels to me very new, in that it's not that we haven't had conversations like this before, but to have a serious, like, congressional thing about let's just get rid of this thing in this country feels unique to me. I don't know if that's entirely true, but that's how it feels to me.
Yeah, I'm struggling to think of another social media app where we have had conversations that rise to, like, the President, you know, and Congress, about just an outright ban. I don't know that I'm really strong on whether there is something, so if someone is, like, driving right now and they're screaming an example at the top of their lungs, please let me know. But I can't think of a time where that has happened. And I feel like this is different because it's happened so quickly. I feel like TikTok really rose in popularity in the last few years, and I think going from it not really being a thing, to it being a thing that people were like, oh, it's just an app for kids, blah blah blah, to it being ubiquitous, to now the conversation being about it being banned, all happening within the span of a few years, that feels different to me as well.

Right. I guess I have a lot of questions just because of the... So we know that Zuckerberg had to go and testify in front of Congress as well, because of disinformation and misinformation being allowed to be posted onto Facebook, but there was no real conversation about it going away or him doing anything wrong, other than: do you take responsibility? How are you going to change this? Which ended up being nothing. They're like, yeah, he's fine, it's his First Amendment, whatever. And then, when Trump was president, him coming in and threatening to ban TikTok, saying that TikTok was a spy for China, which was all about the xenophobia that happened with COVID and everything else that has happened in the last five, seven years. So, like, this has been a bigger thing, I guess, which was really interesting. And what I don't quite understand, I don't understand at all, is who's in the right and who's in the wrong.

That's a great question. That is not a question that I can answer definitively. I'll try to break it down, and then I'll give some of my, like, lingering thoughts towards the end.
But yeah, I should say that this is a conversation that many people that I trust and respect have different takes on. I did an interview on my podcast, There Are No Girls on the Internet, with the TikTok disinformation researcher Abby Richards, who I'll refer to later in the episode today. She is a fervent TikTok user; her academic and professional work happens on TikTok, where she was one of the first people to call TikTok out for the way that it can spread hate speech, conspiracy theories, and mis- and disinformation. But she believes that a TikTok ban would harm marginalized people the most. And so there are people like that who are like, absolutely not, do not ban it. And then there are other people, who I also respect and trust in the space, who are like, oh, well, banning TikTok might actually be a good thing, because, you know, TikTok is the most relevant social media platform out there right now. Twitter is really in this, like, weird, nebulous space. If Twitter is in this weird space and TikTok goes down, we might actually have a chance to sort of restructure our social media ecosystem without the current giant that is TikTok. I don't know if I agree with that, but I guess all of this is to say that I don't have the definitive answer. When I first started researching this topic, I was like, I don't really know where I stand. Having done that research, I think I am coming down in favor of the idea that banning TikTok is not the best solution. But again, that's just mine. That's just one person's take. I've seen many other interesting takes where I'm like, oh, that makes sense to me, I respect that. Does that make sense? I feel like I'm rambling.

No, no, no. I mean, I feel like the whole thing with TikTok is kind of trying to reason out what is the most plausible decision or what's the most responsible, because there is no good or bad in this.
It's all kind of like, oh, this could be bad, this also could be bad, but also this could be good. So it seems like that's the kind of conversation it is in itself. Again, I'm a TikTok fan, and I do love what I get to see. But again, yeah, how does it even start?

Yeah, well, so let's get into it. So last week, folks might have seen that there were congressional hearings where members of Congress grilled TikTok CEO Shou Zi Chew. The big thing to know here is that TikTok's parent company is called ByteDance, and they are a Chinese company, and the people who want TikTok banned are essentially worried that its meteoric rise represents a national security threat, because its parent company, ByteDance, is based in China, a country with whom the United States has had, like, a chilly diplomatic relationship. And this comes at a time when more and more state governments and institutions are cracking down on TikTok on state-issued devices. There's a growing number of universities and state agencies where, if they give you a device, or if you have a state-run device, you're not allowed to have TikTok on that device. And lawmakers are proposing bans of TikTok. Two weeks ago, the Biden administration demanded that TikTok be sold, or it would face a ban in the United States. And this is not just, like, hot air or posturing. Congress has also rolled out a bipartisan bill allowing a nationwide TikTok ban, called the RESTRICT Act, which would allow the Secretary of Commerce to ban apps that pose a risk to the US's national security. A sale of TikTok would require the Chinese government to go along with it and agree, and, perhaps unsurprisingly, they're saying, like, no, we will not authorize or approve a sale of TikTok.
In response, TikTok has committed to spend one point five billion dollars on a plan that they're calling Project Texas, which would essentially enact a stronger firewall between TikTok and employees of its China-based parent company, ByteDance. It would all be set up through a US tech firm called Oracle, as kind of this, like, watchdog organization that's meant to scrutinize TikTok's source code and act as kind of a third-party, unbiased, United States-based monitor for, like, potential security risks. And so that's kind of the zoomed-out conversation about what exactly is going on and the context behind how we got to these hearings.

Yes. And, you know, I've used the word confusing a lot. It is confusing. But it's been one of those things where we've heard a lot of lawmakers ask these questions that make clear they don't really understand any of this. The congressmen be like, what's the internet? Where's all the Wi-Fi? And people making memes... there have been so many memes of the Congresspeople asking the most ridiculous questions, with Chew's face just going...

Yes. I mean, it reminds me of when I go visit my parents. You know how, when you visit, I mean, if your parents are anything like my parents, you have a nice meal, whatever, whatever, and then the tech support part of the evening starts, where it's like, can you do this? Can you delete that? What's our Wi-Fi password? You know, like, two remote controls... so many.

Oh my god. Here's a PSA for anyone listening: do your parents a favor, change their Wi-Fi password to something easier for them to remember, so it's not just a random collection of numbers and letters that they have to read out to you while you're there.

You know, that's exactly it to me. The few bits of the clips that I saw literally felt like watching my parents asking, what is this tickety-talk?

Totally. You're so right.
So there were some good questions that came up. Like I said, one lawmaker asked about whether or not marginalized creators are suppressed on TikTok by the algorithm. Another one asked if the app was suppressing accurate content about abortion. So there were some, like, good questions where I'm like, oh, that's a good question. But there were so many more lawmakers asking questions that revealed that they have no idea what they're talking about. Probably the one that got the most play was Rep. Richard Hudson of North Carolina asking about whether or not TikTok... I think the substance of the question was: if you're using Wi-Fi on your phone and are using TikTok, can TikTok access other devices that are on that same network via your Wi-Fi connection? I think that was the spirit of the question. But the way that he asked it, "Does TikTok access the home Wi-Fi network?", made it seem as though he was not clear on the relationship between Wi-Fi and TikTok, you know what I mean? Like, it wasn't a question that inspired a lot of confidence, is what I'm saying. And, you know, it's one of those things where you really get a sense of the fact that so many elected officials are meant to be legislating technology that they just perhaps don't even really understand. And I think we're seeing that with so many different types of tech. Conversations around AI, I think, are another one, where stuff is moving quickly and rapidly, and we really need elected officials and people with power in institutions to be advocating for the best interests of the public. If you aren't able to do that, that is actually a pretty big national security risk in my book.

Right, right. Yeah. And I get that we have this conversation about national security threats and invasions of privacy, as we are constantly having to update our privacy notices for everything electronic, whatever it may be.
My phone just recently did it. Audacity, which we use to record on, their, like, terms and conditions are changing. And we have already had this conversation about the fact that our phones are listening to us, because you can say a store, and the next thing you know, it pops up as an ad for you. And we know that in China, at this point, I believe Facebook has been banned and they use their own technology, kind of along the same line of what's happening with the conversation around TikTok. But I have to ask: how is this so different from any other apps that we would use for social media?

That's a great question, and I would have to say it's not really that different, other than the fact that TikTok's parent company, ByteDance, is a Chinese company. Right? There are not many differences. Like, most, if not all, of the harms that TikTok is responsible for that came up in that hearing are true about every other major social media platform as well. And so that was something that I didn't love about what came up in that hearing: if we're going to be making TikTok the poster child for harms that all social media platforms are responsible for, we should have a clear reason why we're doing that. There should be some kind of smoking gun, some kind of evidence, some kind of something that's like, well, here's why. And so the big question of the hearings is whether or not TikTok is actually a national security risk. This is a little bit above my pay grade. I am not a digital national security expert, so, you know, just take that for what it's worth. So during the hearing, TikTok's CEO was grilled about his relationship with the Chinese Communist Party and whether Project Texas was going to be enough of a solution: "I am concerned that what you're proposing with Project Texas just doesn't have the technical capacity of providing us the assurance that we need."
This is from California Republican Jay Obernolte, a congressman and software engineer. I should also say that, like, TikTok does not have the, like, squeakiest, cleanest record when it comes to privacy and how they handle your data. Of course, like, most social media platforms do not. And this definitely came up in the hearing. Neal Dunn, a Republican from Florida, asked pretty bluntly whether or not ByteDance has, quote, "spied on American citizens," and we actually know that the answer to this is probably yes. There were reports last December that TikTok accessed journalists' information in an attempt to identify which employees had been leaking information to those journalists, and TikTok actually admitted to this, according to an internal email. When asked about this directly, TikTok's CEO responded that "spying" is not the right way to describe it. Which, I don't know, I mean, it does kind of sound like spying to me. Like, if there's some sort of nuance in, like, you know... maybe from his perspective, if the thing that you're trying to sniff out is which of your employees is leaking information to a journalist, maybe that's the distinction that he's making. But, you know, it doesn't sound great. And so when I say that, I don't want to make it seem like I am suggesting that TikTok is a perfect platform where things like this never happen, because that's not the case at all. We know that. Like, this is an example of, you know, TikTok doing some things that aren't great. But this is, I hate to say it, pretty in step with how social media platforms behave at large. Like, I wish that wasn't the case, but that is the case.
I mean, it's kind of... we've seen it play out live with Twitter as it's breaking down, where they are going after people that they don't like or disagree with, and not necessarily doxxing them, but definitely cutting them off and making sure that they know, "they" being whoever did this, that they are being watched, quote, or that any of those things are being seen by Twitter themselves, the company. So it's not like this is anything new. Once again, it's very concerning, but that's kind of that acknowledgment: us having a phone that is connected to Wi-Fi, that is connected to any type of internet, or any of the data, they're getting our information. I would assume that we all understood this.

Totally. So that is really my biggest point that I always come back to in this conversation. We don't have any kind of meaningful data privacy legislation in this country whatsoever. All of our information is for sale to whoever wants it, and I mean that literally. I've done a whole episode about doxxing and how people get doxxed on There Are No Girls on the Internet, and essentially, if you've ever done anything like voted, or turned on the electricity in your apartment, or paid a parking ticket, your information is for sale on the internet. Oftentimes it is put there by our state agencies. That information is just available widely for whoever wants to spend a small amount of money to buy it. I wish that wasn't the case, but that is the reality. And so, the fact that we are talking about banning TikTok, when, if the whole national security threat is China having access to American data, all of that data is for sale. So if you banned TikTok, that would still be the case. China would still have access to American data. It wouldn't be from the TikTok app, but it's just widely available. And so I feel like banning TikTok is this flashy scapegoat of, like, "See, we did something," when in reality you've done nothing.
The analogy I use is, it's like putting bars on your windows when you don't have a front door. Right? It's like, we desperately, desperately do need meaningful legislation that protects user data and user privacy, but banning TikTok will not get us there. All of our information will still be widely available to whoever wants it. That's the problem.

Right. I mean, let's be honest, too. I think Chew was not wrong when he said that the hearing felt like it was a xenophobic attack, oh my god, on a whole group of people rather than just a social media platform.

Absolutely. So, um, there is a great piece on this by CNN's Brian Fung, who I actually know IRL, shout out to Brian Fung, where he talked about how some of the rhetoric coming out of those hearings just felt very xenophobic. And so, TikTok CEO Chew, he is Singaporean, right? And so accusing him of working with the Chinese government and trying to associate him with the Chinese Communist Party just doesn't feel like helpful rhetoric. And to me, it did feel rooted in xenophobia. It felt rooted in this idea of making this connection to China feel other, and thus, like, nefarious or suspect or something. And so, if there was some kind of, like, smoking gun, right, that the Chinese government is using TikTok to spy on Americans en masse, and there was some sort of evidence or a smoking gun to illustrate that, we would have that conversation. But right now it is just so based in, like, hypotheticals: well, they could, you know, that is a potential risk. And it just feels like adding this assumption of nefarious behavior simply because we're talking about a Chinese company. Does that make sense?

Yeah, no. Like, this is kind of the big question in this rise of Asian hate, in this level of discrimination when it comes to anything that's coming out of China.
It feels like they're playing into: well, we're already kind of scared, and we have a base of people who are going to blame the Chinese for everything and anything, so let's go ahead and start this. When, in actuality, this type of privacy and data collecting has been happening. I'm trying to think back: didn't Congress actually bypass a privacy data law in order to get more money out of, like, companies a while ago?

They basically straight up don't care. Like, that's the thing. And this is just me kind of, like, tinfoil-hatting a little bit, so take that for what it's worth. But, like, I think that when there is a foreign boogeyman, you see elected officials acting differently. But when given the chance to legislate and govern and act on these very same issues, when there isn't some foreign boogeyman to blame, they do nothing. They've had the opportunity to act, and they have done nothing. So for me, it's like: is this really about security and harm and risk? Because you have done jack up until now. So if it truly is about protecting the public from harm, you need to then explain to me why, at every opportunity that you've had to act, you've done nothing. Like, that needs to be explained to me, because I don't get it.

Or actively voted against it, which we saw. Again, like, I remember this coming about, and people talking about how we need more laws, and so lawmakers did, and then it just went nowhere. And you're like, wait, so do you actually care about my privacy or not?

I mean, again, so much of this sounds like... it sounds like a conspiracy theory.
But I know that part of this conversation is the immense money and energy that Facebook has put into lobbying elected officials into seeing TikTok as harmful, to take the heat off of them. And in some cases, you know, I would wonder if elected officials have some sort of financial connection to Meta, right? Like, during that hearing, I heard specific talking points that I know came from Facebook's massive lobbying firm, called Targeted Victory, where they assign blame for, you know, these challenges that they say start on TikTok, where kids end up doing them and then dying or harming themselves. But some of those challenges actually originated on Facebook. And so Facebook actually had a huge PR campaign to make the public and elected officials associate TikTok with harmful challenges, you know, misinformation, all kinds of bad stuff that they themselves are also pushing.

Right. And I want to come back to that point in a second. But to go back to sort of the challenge aspects of social media and young people: I think one of the other big pieces of this conversation, which also relates to those sort of funny and distressing soundbites, is that older people don't use TikTok, whereas they might use Facebook, or they might use something else, and therefore they're very dismissive of it, as, like, oh, this is young folk hooligans using this one.

Totally. So I still hear people talk about TikTok as a kids' dancing app, and I always bristle at that, because that is just not true. Like, TikTok is a discourse app; it is not just for kids dancing. But it is absolutely true that it has a huge young fan base. According to The Guardian, a majority of teens in the US say that they use TikTok, with sixty-seven percent of people aged thirteen to seventeen saying that they use the app, many almost constantly, according to Pew. And so it definitely is an app where young people are congregating.
But that doesn't mean that it's not an app where serious discourse is taking place, because it absolutely is. And so I think that you're exactly right that that is something that takes center stage in conversations around things like content moderation: the fact that it does have a very young user base. And so, during the hearings, we saw lawmakers pointing out that, like, children can have access to content around guns. Like, there was an image of a gun that lawmakers pointed to on TikTok that they found very distressing, which is almost humorous to me. It's like: oh, this image of a gun on social media, on TikTok? Bad. Guns in real life, like in schools, hurting actual kids? Nah, like, all good. So that was kind of interesting. And so, something that Chew said was that harmful content making its way to minors on social media apps is an industry-wide challenge, which is true, right? So I think every major social media platform is struggling to make sure that young users are not having access to content that's going to be harmful for them. But the thing that I am, like, really passionate about is how apps like TikTok promote things like disordered eating or self-harm content and medical misinformation. So right now TikTok is facing, like, lawsuits over young people who have gotten hurt or died because of content that they allegedly came into contact with on TikTok. And so, you know, I would say this: like, I do this for a living. I meet with social media platforms and the leadership at these platforms to advocate for them, you know, making platforms safer. And I will say that, like, harmful, downright dangerous content working its way onto platforms is a problem for all social media platforms. I wouldn't necessarily say that TikTok is performing worse than any other social media platform out there. It might even be performing, like, above average when you compare it to things like Twitter or Facebook.
But this is a very real issue, and all social media platforms across the board need to be doing better. And so when that came up in the hearing, I was like, oh, well, that is actually something that we need to be talking about. But we don't necessarily need to be talking about it in a way that posits that TikTok is the only bad actor in the space, or is the only platform in the space where this harm is happening, because the reality is it's happening on Facebook, it's happening on Twitter, it's happening on Reddit, it's happening across social media.

Yeah, it's one of those things that feels so strange, that we've just accepted it. But it's difficult to pin down, I guess, but you kind of know. You're like, well, if I'm using my phone, then they're going to know that I looked up this thing or this thing or this thing. And I also read that even if you don't have whatever app, it can still somehow get access to your data. So it just feels kind of helpless, I guess.

Yeah. Well, I get that, and I feel the same way. And I think it makes me sad that, just as you said, Annie, we've, like, just accepted that this is how it is. Right? That, of course, social media platforms are going to be making all my data available, whether or not I even use them or have them on my phone. Of course they are going to be listening to my conversations and, you know, serving me up ads based on that. Of course, if I am interested in obtaining an abortion, they will share that information with law enforcement. Like, I think that we should be able to expect better. I think that we deserve better. We deserve to have better digital tools and digital platforms. We deserve to have our online experiences not just be marketplaces for harm and exploitation and scams. We deserve for them to be places where you can have meaningful discourse and get accurate information in a way that is safe and private.
And the fact that we're just like, oh well, nope, the baseline experience that we can expect is quite literally the opposite... I don't accept that. That's unacceptable to me.

Oh my god, you just completely put it back into my mind. I had forgotten about it already. I can't believe there are so many bad things in the world, but I forgot about the case in which Facebook allowed information to be gathered on a young woman and her mom about getting access to an abortion, which was actually obtained by law enforcement to go after these people. It's like, if you want to talk about, to me, what a security threat would be, it's that. And that was Facebook, not too long ago.

Exactly. So that's such a good point: there are specific and current harms that other social media platforms, like Facebook, are responsible for today. Right? We know about that. We know about things like Cambridge Analytica, right? Like, these are known things, where it's like, yeah, we know that Facebook has admitted responsibility for literal genocide. These are things that we know. They're not hypotheticals, they're not potential risks or potential harms; they're things that have happened. And so, for me, it's a little bit rich to be having this, like, breathless conversation about the potential harms, maybe, down the line, that TikTok could be responsible for, potentially, and then not having the conversation about the laundry list of documented, admitted harms that Facebook has been responsible for already, in reality. Like, I don't understand how we got to this point where the hypothetical risk takes precedence over the actual documented and oftentimes admitted harm of platforms like Facebook.

Right. And, as you were saying earlier, it sounds like a whole movie. It feels like a spy movie, as you were saying, because you were talking about how Facebook has actually been kind of behind the scenes doing a whole smear campaign on TikTok.
Can you talk about that?

Yeah. So this whole conversation, the whole hearing, feels like a real win for Facebook. We already know that Facebook has paid lots of money to try to make people, and more importantly lawmakers, dislike TikTok, and we definitely saw the impacts of this on display at the hearings. Right? So, those deadly challenges that I was talking about: several of those challenges actually originated on Facebook, not TikTok. And the reason that lawmakers might be associating them with TikTok is because of this coordinated smear campaign orchestrated by Facebook. The New York Times found that Facebook paid Targeted Victory, which is one of the biggest Republican consulting firms in the country, to orchestrate a nationwide campaign seeking to turn the public against TikTok. This campaign included placing op-eds and letters to the editor in major regional news outlets, promoting dubious stories about alleged TikTok trends that actually originated on Facebook, and pushing to draw political reporters and local politicians into helping take down its biggest competitor. This is from an email that The New York Times found: Targeted Victory needs to, quote, "get the message out that while Meta is the current punching bag, TikTok is the real threat, especially as a foreign-owned app that is number one in sharing data that young teens are using," a director for Targeted Victory wrote in a February email. So I really don't like that Facebook is, at the least, pulling some strings behind the scenes of this conversation about TikTok, to kind of take the heat off of the massive wrongdoing and public harms that they have been responsible for. Like, the whole hearing, I'm sure that Mark Zuckerberg is like, "Oh, this is taking the heat off of me. I can stay over here doing, like, evil while TikTok gets all the heat."

Yeah. That was my Mark Zuckerberg impression.

It reminds me of when I was in college. I had an internship in China, and I was working for this big company that will remain nameless.
They made me get a different laptop, 579 00:32:56,040 --> 00:32:57,520 Speaker 1: they made me get a different phone, because they 580 00:32:57,560 --> 00:32:59,440 Speaker 1: were like, once you go through in China, they just 581 00:32:59,560 --> 00:33:02,280 Speaker 1: take all your data. And I remember thinking, like, I'm an intern, 582 00:33:02,400 --> 00:33:08,840 Speaker 1: you're not sending me any useful information or emails. Like, sure, 583 00:33:09,000 --> 00:33:11,080 Speaker 1: I'll do it, but like, I don't know what you 584 00:33:11,280 --> 00:33:13,600 Speaker 1: think they're gonna get from me. And I love how 585 00:33:13,640 --> 00:33:16,240 Speaker 1: they kind of point out, like, they're spying on the US, 586 00:33:16,320 --> 00:33:18,520 Speaker 1: not that the US isn't doing anything similar, but it's 587 00:33:18,520 --> 00:33:21,840 Speaker 1: sort of like, I think it's all about money, at 588 00:33:21,880 --> 00:33:25,040 Speaker 1: least a part of it, is just like, we want 589 00:33:25,560 --> 00:33:27,760 Speaker 1: Meta to make money because it's in the US, sell 590 00:33:27,840 --> 00:33:31,520 Speaker 1: the data to them all the time. But TikTok, no. 591 00:33:31,920 --> 00:33:33,920 Speaker 1: Oh my god. I saw this USA Today headline 592 00:33:33,960 --> 00:33:36,920 Speaker 1: that was like, TikTok wants my data? Don't they know 593 00:33:37,000 --> 00:33:40,960 Speaker 1: that's reserved for Google and Meta? Very much like, 594 00:33:41,400 --> 00:33:44,720 Speaker 1: we want the West to be where social media platforms 595 00:33:44,760 --> 00:33:47,840 Speaker 1: and data are our marketplace. Like, I have to 596 00:33:47,880 --> 00:33:51,480 Speaker 1: wonder how much of the conversation is exactly that, Annie. Yeah, 597 00:34:05,160 --> 00:34:09,120 Speaker 1: and so I guess this brings us to another big question: 598 00:34:09,600 --> 00:34:13,120 Speaker 1: what would happen if TikTok was banned? Like, if we 599 00:34:13,360 --> 00:34:16,720 Speaker 1: play out this scenario, what would happen? So, great question. 600 00:34:17,120 --> 00:34:20,240 Speaker 1: I had an interview with TikTok disinformation researcher Abbie Richards 601 00:34:20,360 --> 00:34:22,920 Speaker 1: this week, who wrote a great piece for Newsweek which 602 00:34:22,960 --> 00:34:25,279 Speaker 1: folks should read. In it, she writes, I understand the 603 00:34:25,320 --> 00:34:28,520 Speaker 1: privacy concerns stemming from reporting that TikTok has been weaponized 604 00:34:28,520 --> 00:34:31,640 Speaker 1: by the Chinese Communist Party to gather data from Americans, but 605 00:34:31,760 --> 00:34:34,520 Speaker 1: banning TikTok is like applying a dirty, used band-aid 606 00:34:34,600 --> 00:34:37,200 Speaker 1: to the gaping wound that is our broken digital privacy 607 00:34:37,239 --> 00:34:39,640 Speaker 1: status quo. It would do little to protect the data 608 00:34:39,680 --> 00:34:42,080 Speaker 1: of Americans, but it would cause a whole host of 609 00:34:42,160 --> 00:34:44,600 Speaker 1: new problems. To address this problem at its core, we 610 00:34:44,760 --> 00:34:47,680 Speaker 1: must regulate the use of data. Why should Google, Meta 611 00:34:47,719 --> 00:34:49,560 Speaker 1: and Twitter get a free pass because they are not 612 00:34:49,719 --> 00:34:52,680 Speaker 1: Chinese owned?
If we ban TikTok, the channels of communication 613 00:34:52,800 --> 00:34:55,040 Speaker 1: that have been steadily established over the last half of 614 00:34:55,080 --> 00:34:57,600 Speaker 1: this decade will cease to exist, leaving some of the 615 00:34:57,640 --> 00:35:00,120 Speaker 1: most marginalized in our country suddenly in the dark. The 616 00:35:00,239 --> 00:35:02,960 Speaker 1: US is at a crossroads. We could dismantle a massive 617 00:35:03,000 --> 00:35:07,200 Speaker 1: piece of communications infrastructure used by young people, LGBTQ plus 618 00:35:07,280 --> 00:35:11,560 Speaker 1: people and people of color, exacerbating existing inequalities in information access. 619 00:35:11,840 --> 00:35:15,440 Speaker 1: Or alternatively, Congress could implement legislation that serves to protect 620 00:35:15,480 --> 00:35:18,960 Speaker 1: the digital privacy and safety of all Americans on all platforms. 621 00:35:19,360 --> 00:35:21,799 Speaker 1: And so the thing that Abbie is really getting at 622 00:35:21,880 --> 00:35:24,160 Speaker 1: here, that I hadn't really thought about until 623 00:35:24,400 --> 00:35:27,759 Speaker 1: reading her piece and talking to her, is that we're 624 00:35:27,800 --> 00:35:31,320 Speaker 1: at a time where marginalized folks like trans communities, queer communities, 625 00:35:31,360 --> 00:35:34,800 Speaker 1: communities of color, women are facing a lot of attacks. 626 00:35:34,920 --> 00:35:38,920 Speaker 1: And so if you dismantle a platform where a lot 627 00:35:38,960 --> 00:35:42,000 Speaker 1: of these communities have built up a voice and built 628 00:35:42,120 --> 00:35:45,399 Speaker 1: up a platform for themselves, you would be setting those 629 00:35:45,440 --> 00:35:48,800 Speaker 1: communities and their, you know, their work for equality 630 00:35:48,840 --> 00:35:51,480 Speaker 1: and justice back quite a bit. And I hadn't really 631 00:35:51,520 --> 00:35:53,200 Speaker 1: thought about that, because I don't think I had fully 632 00:35:53,880 --> 00:35:57,800 Speaker 1: thought through how big some of these spaces on TikTok 633 00:35:57,840 --> 00:36:01,040 Speaker 1: are in terms of creating discourse. Like, I'll be straight 634 00:36:01,120 --> 00:36:04,200 Speaker 1: up with you, when all that stuff was happening in Iran, 635 00:36:04,440 --> 00:36:06,160 Speaker 1: I don't think I would have known what was going 636 00:36:06,239 --> 00:36:08,720 Speaker 1: on with women and girls in Iran if not for TikTok. 637 00:36:08,840 --> 00:36:10,600 Speaker 1: That was the first place that I saw it. That 638 00:36:10,760 --> 00:36:13,719 Speaker 1: was where I saw, like, conversations about how folks in 639 00:36:13,800 --> 00:36:16,160 Speaker 1: the United States could help amplify. That was how I 640 00:36:16,200 --> 00:36:19,160 Speaker 1: got connected to have guests from Iran on my podcast. 641 00:36:19,239 --> 00:36:21,279 Speaker 1: Like, if not for TikTok, I would not have been 642 00:36:21,320 --> 00:36:23,320 Speaker 1: able to do any of that. And so, you know, 643 00:36:23,400 --> 00:36:26,400 Speaker 1: the ways in which marginalized communities have been able to 644 00:36:26,560 --> 00:36:30,200 Speaker 1: use this platform to create discourse and power for themselves 645 00:36:30,560 --> 00:36:33,239 Speaker 1: really is pretty vast.
And so if we ban it 646 00:36:33,360 --> 00:36:37,280 Speaker 1: outright, according to Abbie, that could have really drastic 647 00:36:37,320 --> 00:36:40,920 Speaker 1: consequences for marginalized communities online. Oh yeah, you say that. 648 00:36:41,040 --> 00:36:44,520 Speaker 1: We literally just had a guest who was Iranian, Elka, 649 00:36:44,719 --> 00:36:47,120 Speaker 1: who has a big following on TikTok. But yeah, in 650 00:36:47,239 --> 00:36:49,359 Speaker 1: the same way, I would not have known about any 651 00:36:49,400 --> 00:36:52,840 Speaker 1: of the things going on, or known more updated information, 652 00:36:52,880 --> 00:36:54,920 Speaker 1: because I definitely don't see it on the news. It's 653 00:36:55,080 --> 00:36:57,800 Speaker 1: very rare that I get to see personal takes on 654 00:36:57,880 --> 00:37:01,680 Speaker 1: how it's affecting individuals and families that are imprisoned. So, 655 00:37:01,920 --> 00:37:03,840 Speaker 1: but yeah, it is all because of TikTok, and I 656 00:37:03,880 --> 00:37:07,200 Speaker 1: would have no clue except because of that. And honestly, 657 00:37:07,280 --> 00:37:10,600 Speaker 1: it's helped me connect even deeper into my Korean heritage, 658 00:37:10,600 --> 00:37:13,719 Speaker 1: which I feel so lost about. Like, I'm 659 00:37:13,800 --> 00:37:16,400 Speaker 1: not big on TikTok. I don't post things on TikTok. 660 00:37:16,480 --> 00:37:18,680 Speaker 1: I follow a lot of different people, but some of 661 00:37:18,719 --> 00:37:20,839 Speaker 1: the connections that I made without them knowing, my parasocial 662 00:37:20,880 --> 00:37:25,440 Speaker 1: connections, really have brought me to a deeper understanding, 663 00:37:25,560 --> 00:37:27,839 Speaker 1: or trying to understand myself, or like myself, a little 664 00:37:27,920 --> 00:37:30,719 Speaker 1: better in my ethnicity. And I know that's a whole 665 00:37:30,800 --> 00:37:34,480 Speaker 1: other conversation, but it truly has made me feel a 666 00:37:34,520 --> 00:37:37,960 Speaker 1: little more connected to a community I felt so ostracized 667 00:37:38,000 --> 00:37:42,560 Speaker 1: by, through TikTok, and I cannot imagine. I'm sure there'll 668 00:37:42,560 --> 00:37:44,480 Speaker 1: be other platforms. I'm not gonna sit here and say 669 00:37:44,520 --> 00:37:46,320 Speaker 1: this is going to be the end-all if it 670 00:37:46,400 --> 00:37:49,319 Speaker 1: goes away, but it does seem very targeted for something 671 00:37:49,360 --> 00:37:51,080 Speaker 1: that I think has done a lot of good, I 672 00:37:51,200 --> 00:37:54,240 Speaker 1: know for me and for a lot of people. Whether 673 00:37:54,360 --> 00:37:56,520 Speaker 1: it is sending out my information to everybody in China, 674 00:37:56,560 --> 00:37:57,799 Speaker 1: I assure you that you can have it, I'm sad 675 00:37:57,880 --> 00:38:01,879 Speaker 1: and lonely in itself, but, like, it really does. There's 676 00:38:01,920 --> 00:38:05,759 Speaker 1: this level of me being a marginalized person, and being 677 00:38:05,840 --> 00:38:09,240 Speaker 1: more marginalized in a community that's already marginalized, being adopted 678 00:38:09,320 --> 00:38:12,319 Speaker 1: and being very isolated in that, that felt so much more 679 00:38:12,400 --> 00:38:16,560 Speaker 1: connected through creators on TikTok that are willing to share 680 00:38:16,640 --> 00:38:19,840 Speaker 1: their life so that I feel more a part of 681 00:38:19,920 --> 00:38:23,600 Speaker 1: that community.
That makes me sad. Yeah. And I mean, 682 00:38:23,719 --> 00:38:25,840 Speaker 1: I think that you've really said it. I 683 00:38:25,960 --> 00:38:29,880 Speaker 1: think that all social media platforms have their ups and downs, 684 00:38:30,280 --> 00:38:34,080 Speaker 1: but we can't discount the experiences like the one 685 00:38:34,120 --> 00:38:37,000 Speaker 1: that you just shared about the way that this particular 686 00:38:37,080 --> 00:38:40,680 Speaker 1: app has enabled folks to really build community, explore their 687 00:38:40,760 --> 00:38:44,960 Speaker 1: identities and who they are. And if we just blanket 688 00:38:45,040 --> 00:38:47,840 Speaker 1: get rid of it, where will that conversation happen? I 689 00:38:47,920 --> 00:38:50,879 Speaker 1: don't see it happening on Twitter. I don't necessarily see 690 00:38:50,880 --> 00:38:54,879 Speaker 1: it happening on Facebook, right? I'm with you. I think 691 00:38:54,960 --> 00:38:59,319 Speaker 1: that marginalized people are always able to make a way 692 00:38:59,440 --> 00:39:03,160 Speaker 1: out of no way, like, build power and communities 693 00:39:03,239 --> 00:39:06,000 Speaker 1: and feel seen online even in places that are hostile 694 00:39:06,080 --> 00:39:08,720 Speaker 1: to us. But it takes time. It doesn't happen overnight. 695 00:39:08,800 --> 00:39:11,800 Speaker 1: And I think we're in a weird social media landscape 696 00:39:11,800 --> 00:39:14,600 Speaker 1: where I wonder, like, where would those kinds of conversations 697 00:39:14,640 --> 00:39:16,920 Speaker 1: that you just described take place if we didn't have TikTok? 698 00:39:17,160 --> 00:39:20,560 Speaker 1: You know, it would certainly take a long time to 699 00:39:20,719 --> 00:39:22,839 Speaker 1: rebuild them, at a time when a lot of these 700 00:39:22,880 --> 00:39:26,080 Speaker 1: communities don't really have a ton of time, because they're 701 00:39:26,160 --> 00:39:28,440 Speaker 1: being attacked. Yeah, and I think those are 702 00:39:28,480 --> 00:39:30,600 Speaker 1: great points to make, because I know we talked about 703 00:39:30,640 --> 00:39:33,040 Speaker 1: this in the most recent episode on what's going on 704 00:39:33,120 --> 00:39:37,000 Speaker 1: with Twitter. A lot of people will tell me, well, 705 00:39:37,040 --> 00:39:40,120 Speaker 1: I don't use it, and I think, well, good for you. But 706 00:39:40,760 --> 00:39:43,200 Speaker 1: it's, like, very, very meaningful for a lot of people. 707 00:39:43,360 --> 00:39:47,080 Speaker 1: And especially if you're in any way marginalized, 708 00:39:47,160 --> 00:39:49,479 Speaker 1: or in a small town for instance, and you don't 709 00:39:49,520 --> 00:39:51,400 Speaker 1: have a lot of people you can talk to, you 710 00:39:51,520 --> 00:39:55,239 Speaker 1: go to these social media platforms and you find that, 711 00:39:55,360 --> 00:39:57,760 Speaker 1: and there's something very powerful about that. And these movements 712 00:39:57,760 --> 00:40:00,360 Speaker 1: have happened on there, information has spread on there, that 713 00:40:00,520 --> 00:40:03,040 Speaker 1: has shifted how people think. That has shifted movements. So 714 00:40:03,200 --> 00:40:07,120 Speaker 1: it's like, I just encourage people always, like, don't just 715 00:40:07,280 --> 00:40:09,200 Speaker 1: say, oh, I don't use it, so it doesn't matter. 716 00:40:09,960 --> 00:40:12,239 Speaker 1: It matters to real people.
It matters to a lot 717 00:40:12,320 --> 00:40:15,919 Speaker 1: of people, and it is powerful. That's such a good point. 718 00:40:16,080 --> 00:40:20,360 Speaker 1: I've only been on TikTok for a couple, like, two years. 719 00:40:21,280 --> 00:40:25,000 Speaker 1: In those two years, I have learned more useful information 720 00:40:25,160 --> 00:40:29,080 Speaker 1: that I didn't know than in my entire time on Facebook 721 00:40:29,160 --> 00:40:32,000 Speaker 1: and Twitter. Absolutely, hands down. I'm like, did you know 722 00:40:32,000 --> 00:40:35,040 Speaker 1: you're supposed to be boiling your wooden spoons? Like, 723 00:40:35,120 --> 00:40:37,480 Speaker 1: that's the way to clean them? Who 724 00:40:37,520 --> 00:40:43,040 Speaker 1: knew this? Like, so, I don't want 725 00:40:43,040 --> 00:40:45,040 Speaker 1: to sound super biased, but I do love TikTok, and 726 00:40:45,120 --> 00:40:47,680 Speaker 1: I have personally seen the ways that it 727 00:40:47,760 --> 00:40:50,920 Speaker 1: can be a really important place for discourse and 728 00:40:51,000 --> 00:40:55,400 Speaker 1: information and communication and community. And I think that, given 729 00:40:55,480 --> 00:40:58,840 Speaker 1: all of that, having people who don't necessarily understand it, 730 00:40:58,840 --> 00:41:02,239 Speaker 1: who've never necessarily used it, making a blanket ban 731 00:41:02,400 --> 00:41:06,239 Speaker 1: of it, coming from a place of fearmongering and, 732 00:41:07,360 --> 00:41:10,480 Speaker 1: you know, in some cases xenophobia, is not the move. 733 00:41:10,880 --> 00:41:14,200 Speaker 1: I am all for regulating social media. I am all for, 734 00:41:14,920 --> 00:41:18,880 Speaker 1: you know, meaningful, comprehensive legislation that protects the 735 00:41:18,920 --> 00:41:22,919 Speaker 1: American public's privacy online. Yes, give it to me, we'll 736 00:41:23,000 --> 00:41:26,239 Speaker 1: take it. Let's have that conversation. I don't see this 737 00:41:26,560 --> 00:41:29,400 Speaker 1: as a conversation that will bear that fruit that I 738 00:41:29,480 --> 00:41:33,520 Speaker 1: know that we need. I see this as posturing, scaremongering, 739 00:41:33,960 --> 00:41:37,640 Speaker 1: fearmongering, making a boogeyman of one platform, when what 740 00:41:37,760 --> 00:41:42,040 Speaker 1: we need is something much more meaningful and much more comprehensive. Right, 741 00:41:42,200 --> 00:41:46,080 Speaker 1: I completely agree. There's so much confusion, because you do 742 00:41:46,280 --> 00:41:51,200 Speaker 1: partially agree that things need to change on TikTok, but 743 00:41:51,320 --> 00:41:54,160 Speaker 1: that partial agreement extends to all of social media. 744 00:41:54,280 --> 00:41:56,839 Speaker 1: Like you said, unless you say that you're doing 745 00:41:56,920 --> 00:42:00,680 Speaker 1: this for all of social media and the internet and all 746 00:42:00,719 --> 00:42:03,320 Speaker 1: of that and our phones, then we're not going to 747 00:42:03,440 --> 00:42:05,640 Speaker 1: believe that you're doing this for the well-being of 748 00:42:05,800 --> 00:42:09,640 Speaker 1: a nation. You're doing this for your profit. Exactly. Abbie 749 00:42:09,680 --> 00:42:13,240 Speaker 1: had a point. She said, data privacy, misinformation, hate speech:
750 00:42:13,320 --> 00:42:15,480 Speaker 1: you got to care about it on all platforms, not 751 00:42:15,600 --> 00:42:17,440 Speaker 1: just TikTok, not just the one that you can make 752 00:42:17,560 --> 00:42:20,680 Speaker 1: a foreign boogeyman out of. Right. And I would 753 00:42:20,719 --> 00:42:23,520 Speaker 1: love to keep, like, pitching these ideas that we'll probably 754 00:42:23,560 --> 00:42:25,279 Speaker 1: never do, but I would love to come back to 755 00:42:25,520 --> 00:42:29,000 Speaker 1: talk about that. Also, it's interesting to me that they're 756 00:42:29,000 --> 00:42:32,240 Speaker 1: talking about this when we've had so many high-profile 757 00:42:32,320 --> 00:42:37,920 Speaker 1: instances lately of, like, our whole flight system falling apart 758 00:42:37,960 --> 00:42:40,560 Speaker 1: here in the US because we haven't updated our technology 759 00:42:40,600 --> 00:42:50,080 Speaker 1: in years. It genuinely scares me. Right, there's so much infrastructure, 760 00:42:50,400 --> 00:42:53,719 Speaker 1: but also tech and digital infrastructure, where we've just accepted 761 00:42:53,760 --> 00:42:57,160 Speaker 1: it, like, yeah, it's janky. Yeah, well, basically can't 762 00:42:57,280 --> 00:43:01,320 Speaker 1: use it. I don't know, I'll just be stuck in 763 00:43:01,440 --> 00:43:05,719 Speaker 1: the airport for a couple of days. Good luck. Yeah, 764 00:43:06,000 --> 00:43:08,600 Speaker 1: we just, we deserve better. We deserve better. We do, 765 00:43:09,520 --> 00:43:13,480 Speaker 1: we do. Oh well, thank you so much as always, 766 00:43:13,520 --> 00:43:19,879 Speaker 1: Bridget, for coming on and helping us to understand this whole thing. Oh, 767 00:43:20,040 --> 00:43:22,320 Speaker 1: the pleasure is all mine. Thanks for helping me. I 768 00:43:22,400 --> 00:43:24,000 Speaker 1: feel like I was very ranty, but thank you for 769 00:43:24,080 --> 00:43:26,719 Speaker 1: helping me understand it better too. I always feel like 770 00:43:26,760 --> 00:43:30,520 Speaker 1: I get clarity from connecting with these issues with you all. 771 00:43:31,000 --> 00:43:33,080 Speaker 1: I feel like that too. I feel like 772 00:43:33,160 --> 00:43:39,399 Speaker 1: we have good conversations. Yes, it's good in these episodes. Yes. Well, 773 00:43:39,640 --> 00:43:41,839 Speaker 1: where can the good listeners find you, Bridget? You can 774 00:43:41,920 --> 00:43:44,160 Speaker 1: find me on my podcast, There Are No Girls on 775 00:43:44,200 --> 00:43:46,640 Speaker 1: the Internet, on iHeartRadio. You can find me on Twitter 776 00:43:46,800 --> 00:43:48,760 Speaker 1: at Bridget Marie. And you can find me on Instagram 777 00:43:48,800 --> 00:43:50,640 Speaker 1: at bridget Marie DC. And you can find me on 778 00:43:50,719 --> 00:43:59,359 Speaker 1: TikTok at Bridget makes podcasts. Easy enough. Like I said before 779 00:43:59,440 --> 00:44:01,960 Speaker 1: on these, it's so strange. I mean, it's important to 780 00:44:02,040 --> 00:44:03,719 Speaker 1: be critical of the things that you use, right, but 781 00:44:03,719 --> 00:44:05,200 Speaker 1: it's always so strange at the end to be like, 782 00:44:05,320 --> 00:44:11,680 Speaker 1: and you can find me on... Exactly. Yes, well, thanks 783 00:44:11,719 --> 00:44:14,759 Speaker 1: again as always, Bridget. Can't wait for next time.
In 784 00:44:15,000 --> 00:44:17,040 Speaker 1: the meantime, listeners, if you would like to contact us, 785 00:44:17,120 --> 00:44:20,120 Speaker 1: you can. Our email is Stephania mom Stuff at iHeartMedia dot com. 786 00:44:20,239 --> 00:44:22,200 Speaker 1: You can find us on Twitter at mom Stuff Podcast, 787 00:44:22,600 --> 00:44:25,279 Speaker 1: or on Instagram and TikTok at Stuff Mom Never Told You. 788 00:44:25,600 --> 00:44:28,880 Speaker 1: Thanks as always to our super producer Christina, our executive 789 00:44:28,880 --> 00:44:32,480 Speaker 1: producer Maya, and our contributor Joey. Yes, thank y'all, and 790 00:44:32,840 --> 00:44:34,680 Speaker 1: thanks to you for listening. Stuff Mom Never Told You 791 00:44:34,719 --> 00:44:36,919 Speaker 1: is a production of iHeartRadio. For more podcasts from iHeartRadio, 792 00:44:36,920 --> 00:44:38,920 Speaker 1: you can check out the iHeartRadio app, Apple Podcasts, or wherever 793 00:44:38,920 --> 00:44:39,960 Speaker 1: you listen to your favorite shows.