Annie: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio.

Samantha: Okay, Annie, I have to ask you for this Monday Mini, because I kind of already know the answer, I believe, but you can tell me if I'm wrong. What is your favorite source of social media, slash, medium of communicating on social media, though?

Annie: Uh, I guess it's Instagram. I really don't use social media that much, but people contact me through Instagram pretty regularly, and so I'll go in and answer messages on there. I used to love a Tipsy Gram, which was one of my favorites: you get kind of tipsy and then you record a story, and you cannot re-record it; you have to post it as is. And I've gotten some gold on that. In my opinion, I think Twitter can be great, but it is generally not great, but it can be. It has some really nice areas that I do love.

Samantha: I'd say Twitter gives me the news faster than the news, like, immediately. I read one random tweet yesterday and was told, within fifteen minutes, of Jason Momoa's divorce, or split, and I was like, good God, what's happening? And I got it from a person on Twitter who was already on top of things.

Annie: Yes, yes, Twitter is definitely very, very quick to report things. But also, you know, every time I see something or someone I love trending on Twitter, it feels like I'm dying inside, because it's almost always bad. So far it's been fine, but that aspect I hate, I hate. And it also plays into the algorithm, because I get a bunch of Star Wars stuff or Marvel stuff, because I love Star Wars and Marvel. That makes sense, but it means that every time I see one trending, like, I just saw Mark Hamill trending, I'm like, oh my god, it's ruined. But it was fine. It's just popping up because they know I've liked those tweets before, right?
Samantha: Yeah. I think for me too, I will say I like Twitter because, like I said, people will tell me things so quickly, and it gives me kind of a good way to gauge, just because of the people I'm following or the people who are following me, you know, and all that stuff. Instagram I've done less and less on, because they have now put in the whole "suggested posts you might like" in between everything, and I'm like, who are these people? I'm very confused about what is happening; these are things that I don't want to see anymore. And it becomes an ad, too: you should follow this, and you should follow this. And I'm thinking, I don't. I don't for a reason. Why is this happening? I don't want these pictures, and some of it is really sad, and I'm like, no, I don't. I don't like this for a reason. Why are you doing this? And then, if they think you've already seen your old posts, they give you all new ones, which, again, I'm baffled: who are these people? What has happened? What is happening to my feed? So that platform has moved lower on my list. And of course, I've kind of talked about the fact that my mother is now on Instagram, so I post a lot less on Instagram because of that. Just so y'all know why I've kind of dropped off. But my newest favorite, and you know my newest favorite because I've been talking about it so much, is TikTok. And there may be a bit of irony in the fact that I've just talked about not knowing who is on my, you know, list, and all of a sudden getting all these new things, which is kind of the whole point of TikTok: to share feeds with you that might interest you that you may not know about. Which is the For You page, but we'll talk about that in a minute.
But before we start, yes, we wanted to get into a little bit about what TikTok is doing, how big of an influence it is, and how big it is when it comes to movements, activism, and women. When it comes to social media, there's a lot to talk about, and we just wanted to go ahead and let you know: we're not going to go too in depth on some of these things, but we will mention sexual assault or violence, harassment, eating disorders, fat shaming, trauma experiences, and all those kinds of conversations, because when it comes to movements and activism, there's usually something being fought against or fought for. So we'll go ahead and give you that content warning.

Annie: Right. Again, we're not going to get too deep into it, but we'll kind of give you some examples and, um, some of the people who are doing the activism on TikTok.

Samantha: Yeah. And if you've been listening, like I said, I have been caught up in the world of TikTok. And y'all, I avoided it for so long, because I knew, I knew. Well, one, it felt like one more thing to take up my phone and my data and my space, and I've had to delete so many things recently because I'm like, I don't have enough room on my phone. And then also, I thought maybe TikTok was a bit too young for me, because it kind of came from the world of Snapchat and all of those things, which I really thought were for teenagers who like to share videos with each other, dances and all that. And like, yeah, I'm hitting forty; do I really need to be in another social media world? I was wrong, but just saying. Um, and of course, there was fear of what I would be diving into. There are so many things where I'm like, what, what, what is this? But I wasn't wrong about those fears. I wasn't wrong, to the point that, like I said, I'm so caught up, I've had to place a timer setting, honestly, so that it will check me on how much time I'm spending on, or consuming, this app.
Because there is a setting that will make you do a passcode: it stops you, tells you that you've spent an hour on here, and asks, do you want to continue? You must enter this PIN. So I did that for myself, because I literally stayed up for three hours. I got on at about eleven o'clock, and after watching all of these videos on TikTok, I looked down and it was two thirty a.m. I had spent that much time without realizing it.

Annie: Yeah, it's a time suck for sure if you're not careful.

Samantha: And I'm not the only one, by the way. Annie, I know you're not into it, and I've had to send you some of the videos.

Annie: Smart, smart, smart.

Samantha: But I'm not the only one. According to some statistics, there are currently one billion monthly active TikTok users. In fact, the app has over three billion downloads, which is about one third of all social media users, point blank. And they did that in four years; it took Facebook and Instagram almost three times that long, about ten years, to grow that many users, but TikTok did it within four. And that includes a supposed ban that was going to happen at one point, which actually skyrocketed the number of downloads.

Annie: That was a conversation, I recall. So yeah, I remember everyone was panicking, and I'm like, do you really want TikTok? Like, no, but what if I do later? And then there was also the fact that, uh, they had a lot of ads out at one point, right during the time they said it might be banned.

Samantha: Yes. And also, just to add on, you probably saw, but SNL, with Billie Eilish, did a sketch about spending all that time on TikTok. Probably too real.

Annie: Well, it was very funny.
It was a dad trying to get his, I'm assuming, daughter to take out the trash, and she was like, oh yeah, yeah, yeah, and then just went on TikTok, TikTok, TikTok. And finally he joined TikTok to try to send her a message, like, take out the trash.

Samantha: And did it turn out that he actually got to be a part of it and got stuck in it too?

Annie: Probably, probably. I have to say, that's my least favorite part. I like getting TikToks from people, but I hate that it autoplays the next TikTok. I hate autoplay with a burning passion. No matter where it is, I hate it. Do not make me watch your next video.

Samantha: Um, so in twenty twenty-one, TikTok was the most downloaded non-game app for the first six months of the year, and it has been one of the most popular apps, if not the most popular, since twenty nineteen. And yes, it even passed Google as the most popular, quote, "online destination," which was tracked through a web traffic monitor called Cloudflare Radar.

Annie: Right, and this report went out everywhere, because it made such big news that it outdid Google.

Samantha: Yeah. And of course, TikTok has its issues and problems, whether we're talking about the continued shadow banning and non-crediting of Black creators, and you can go listen to the episode that we did with Bridget to learn more about that, or the continued harassment and trolling of content creators, or even the fatphobia on the app. So there's a lot to watch out for, just as there is on pretty much every social media platform. Also, and we will talk a little more about some of the specifics, but just an FYI: there has been a big debate about whether or not TikTok is actually bad for you, as in your mental health. But there's also a lot of conversation about the fact that TikTok makes you aware of things that you might not know of.
I actually watched, in real time, someone diagnose another woman's thyroid issue. They were like, hey, this is really off the topic of whatever she was talking about, but I noticed your thyroid; you may want to go get it checked out, because of this and this. And it turned out, yeah, there was an issue. So content like this is really interesting, because there's a lot of hyper-awareness. There's a whole side of TikTok called ADHD TikTok. Yeah, I got caught up in that. Anyway, so there's a lot to be said. And yes, we could probably also have a conversation about the fact that TikTok and apps like it create a shorter attention span, and whether that is good for you or not. I'm gonna go with not, as someone who has a very short attention span. Uh, and with it being a new medium, this is a great opportunity to start off on the right foot, or on equal footing at the very least. Meaning, at this time, women content creators outnumber men by about seven and a half percent, with women making up fifty-three point seven nine percent while men make up about forty-six point one percent, as of August of twenty twenty-one. And the gap is significantly larger when it comes to users: women make up sixty-one percent of users, while men make up thirty-nine percent. And if you're wondering about the age breakdown, nineteen-to-twenty-nine-year-olds have the most users, with thirty percent of users being in that age group, followed by eighteen and younger at about twenty-eight percent; thirty-nine-plus is at nineteen percent, and thirty to thirty-nine at eighteen percent, which I found surprising, honestly.
However, a few reports show that the under-eighteen user base may actually be larger than that number represents, um, whether it's because they're lying about their age or whatnot, because some of the other, uh, statistics show they are the largest age group using TikTok. Which, again, is also wild; this feels like it's way too young for me. But it's not just the overall demographics that have me interested; it's more to do with the actual content. Um, and if you look at my For You page, or FYP, as the hashtag goes, which I still don't quite understand (apparently it makes you more popular, I don't know, because someone was like, "I don't have this on my thing, and if you're seeing this, it's been 'for you,'" and I'm like, is it? Is it? Wait, is it? So I'm very confused about this), you will see that, for me, it's many a dog, a cat, maybe a raccoon, maybe the emu account that I follow. It's probably cute. Um, but with that, there are also many different activists who have caught my attention.

Annie: Okay. Um, well, I see. I've seen the emu. I like an emu. I'm just surprised, that's all.

Samantha: But I guess it's not so surprising to see the large amount of misinformation, unfortunately, that the platform allows. The good news is, there are those who have gone, and continue to go, viral with factual information to counter the damaging content.

Annie: I remember we talked about this with the Dungeons and Dragons account, right?

Samantha: Yes. Um, whether it's debunking medical misinformation or talking politics, there are several creators out there bringing scientific information in one-to-three-minute snippets. But with that, there is also a lot of drama and some real danger. The word "doxxing" has been flying around the social media world, and just in case you don't know, that's publishing personal information, like an address, online.
Annie: Uh, and TikTok is a part of that; it's a top contender for usage of the word. And not too surprisingly, if something or someone is caught with enough ferocity, people have been swept up in firings, harassment, and swatting.

Samantha: Right. There were actually a couple of incidents recently where two content creators got swatted, and, uh, there are whole lawsuits happening and such. So there's a lot going on within this world. There's a lot of drama.

Annie: Yeah, you've been telling me. Anyway, um, there are many who criticize the monitoring, or how TikTok is moderating the app, questioning what is considered a violation and what is allowed. Recently, a popular TikToker was arrested for the murder of his partner (um, I think at that point they were ex-partners), and his account, which highlighted his relationship (they had made it a, like, couple's account during that time), was allowed to remain up after his arrest. They didn't do anything with it, and it all stayed up. Uh, it had a pretty high number of followers. And we talked not too long ago about the continued highlighting of white TikTok creators over Black creators, who had to go on strike to even be heard; seemingly, very little has changed to even out that disparity. But there are movements being brought to the forefront, and for the better, such as climate change activism. A group known as EcoTok, made up of young influencers posting videos about environmental and climate topics, which began in July of twenty twenty, has been able to have some much-needed conversations thanks to this app.

Samantha: Yes, and the group has worked to make viral videos to bring awareness to the problems happening around the world. Members like Carissa Cabrera, a marine biologist, have been using TikTok to bring scientific content into a fun, learning-resource type of environment, and she has garnered hundreds of thousands of views for her videos.
Um, and she's not the only one; hashtags like climate change and sustainable have billions of views throughout TikTok. Other movements, such as abortion rights movements, have also been able to use TikTok to educate and bring awareness. TikToker Alex Krato, at Alex the Feminist, for example, has had over four million views on one of her posts alone, which showed her confronting anti-abortion protesters. And this works not only to push back on the protesters and educate viewers, but also to protect the rights of women coming to the centers.

And then there are people like Drew Afualo (I'm so sorry if I said that wrong; A-F-U-A-L-O), who has her own way of turning the tables on misogynists. Um, and I do follow her as well; she's very entertaining. Everybody knows her by her laugh, just in case you were wondering. Uh, and she has a whopping three point six million followers and has built her followership on her unique content of criticizing and critiquing sexist content. So she'll literally "stitch" their videos, as they call it, which is, again, accompanied by her very identifiable laugh (it's quite cute), and she doesn't hold back. I would say she is an absolute powerhouse. She makes sexist men shrink into oblivion with her jokes and comebacks. And she has been criticized for her harsh jokes and for pointing out physical characteristics, but as she says of those who push back, quote, "All of a sudden, I'm the meanest girl ever. You make a joke, and I make a joke, but somehow mine's mean and yours isn't." Okay, okay. So essentially, she's talking about the fact that whenever she sees a video of men making fun of women, she comes right back at them and lets them know: hey, I'm here. I mean, yeah, it is pretty harsh, but it is quite funny. I enjoy it. Again, of course, I may not do this type of thing myself; I don't think I have the gumption to do it the way she does.
But she has definitely gained a lot of traction, to the point that it's almost hilarious how often you'll see her name tagged in these videos, ready to go. And a lot of men, in response, will delete their videos immediately, or delete their accounts. So it's pretty interesting to see. Women like her who do these things are coming in with a whole other level of, hey, we're done, we're done playing. And she has even talked about the fact that, you know, it's a mistake to assume that every woman is just going to take the high road, which is what we've been taught for way too long. So, stuff like that. But with the good does come the bad, and recently Vice reported an alarming amount of incel content on TikTok as well. Recent data from Ciarán O'Connor shows a large amount of misogynistic hate speech on the platform, and oftentimes it's ignored by TikTok. One big movement highlighted was MGTOW, or Men Going Their Own Way, a movement that often ties misogyny together with white supremacy. Not surprising. And some of the videos shared within these hashtags and groups have included videos made by Elliot Rodger, an incel who murdered seven people before killing himself. One of these videos includes him specifically talking about women and how they have, quote, "starved" him of sex and pleasure, so obviously very much the incel manifesto. And these, by the way, stayed up until this report was shared with TikTok recently.

Annie: Yeah. And though TikTok has banned many of the hashtags and some accounts related to the movement, it doesn't seem to be going away, and TikTok also seems to be moving, um, pretty slowly to take some of these items down. Hashtags like "hypergamy" and "red pill" were still active when Vice released their article in November of twenty twenty-one. And why would they allow so many tags and accounts to continue unchecked?
Samantha: Well, Frazer Heritage, a professor at Birmingham City University, says, quote, "One of the issues online platforms face is that swathes of society are misogynistic, or at least built on misogynistic ideas, so there is a difficulty in drawing a," quote, "line in the sand." Right? And I thought that was a really important line: that this is the idea, and it's so accepted that it allows for it. And I can't tell you how many videos I have actually seen, um, that didn't necessarily equate to being incel, but that would go into blaming women for situations. I've seen several great videos of women pushing back on men who are harassing them or not leaving them alone, and within the comments, men are so adamant that the women are in the wrong, being too aggressive, the way they dress. And the number of comments flooding in like that was kind of shocking, because I'm thinking, with this new content, maybe people have changed perspectives and have understood and have grown, but it doesn't seem like that. Um, and I think, with all those comments and all those conversations happening within the videos, none of them are really checked or seen, from what I can gather, and that's concerning. And I think a part of that is just the attitude of, well, it is what it is, and you knew what you were getting into. So, um, with platforms like TikTok, which have dramatically grown in popularity, it may be harder for workers to keep up with the ever-changing, ever-growing amount of content. Um, so many are hopping on with a new dream of being a TikTok influencer, not realizing how it can be good or bad. You never know. And for those who do make it, they are making bank, as creators can make up to two hundred thousand dollars for one branded post. Like, I was like, wait, what? Um, and if you have a certain number of followers, you could possibly make from two hundred to five thousand dollars a month.
Of course, you would need to accrue over a hundred thousand followers to even be considered for any of this pay, just a reminder. And some of the top influencers today include Charli D'Amelio (I'm sorry if I'm butchering these names), who has a hundred thirty-two point seven million followers; Khaby Lame, at a hundred twenty-four point nine million; and Bella Poarch, at eighty-seven million. So there are a lot of people, and there seems to be, like, a top five that just kind of rotate through different rankings, and a lot of them have made careers out of this. Bella Poarch, I think, was already a star, um, and a singer, but now her career has grown even more in popularity. And then, uh, Khaby Lame, who is known, he doesn't even really speak in his videos. I don't know if you know who this is, um, but he's a content creator who literally just points out things that are obvious. So there may be some gadget that helps you, uh, get keys off the ground, but it's so random; he just, like, shows you: picks them up, like, pick them up. And that's the thing. Super cute. I enjoy it. It's fun. But he's made a career off of this and has been touring around the country and around the world, modeling and getting sponsorships. And then Charli D'Amelio, she's been through a lot of things; her and her sister are both pretty viral, and I believe they're getting their own reality show through this. So there are a lot. Addison Rae is also another big name in the TikTok world, and she has done a movie already, a Netflix movie, I believe. So there's a lot to be said: if you can make it, wonderful. Of course, some of these content creators have also been accused of stealing content from others, and oftentimes it's women of color, people of color, that they've been stealing from.
So there are a lot of questions we have in that as well, because, yes, this can change people's lives; this can be a career. So it does make a difference on that level. And I'm not even talking about using sounds. So, on TikTok, you can take someone's original content, whatever they're saying, and put it onto yours, like an overlay. And that's a whole different level that I'm not sure about, but I do know that a lot of the time it's stolen and people don't even credit it, when it's obviously someone else's content. Where did it come from? Why wouldn't you say? And the problem is when people are making money off of that content without acknowledging, and/or giving credit to, where it came from to begin with. TikTok is a whole new world; I'm interested to see how copyright and all of that will work. Someone has actually said, and we want to dig into this later, that some of the sounds could soon be NFTs. And I'm like, what?

Annie: Oh, there's a lot. What a world. Imagine trying to explain this to ourselves twenty years ago. I'm confused, and today it exists.

Samantha: But yeah, there's a lot happening in the TikTok world. Obviously, I didn't even hit on some of my favorite content creators; I just kind of pulled in some names that I've seen, um, as we were researching. But there's a lot to be said, and there's a lot to look into when it comes to social media, especially its newer forms: why it's so popular and, again, the opportunities it can afford to women, if we truly, truly are given that opportunity.

Annie: Yeah, yeah. Well, Samantha, you and I just need to get one thousand followers.

Samantha: Yeah, yeah. I have one video out, and you don't. It's not even in my name, and I got six views. I'm so...

Annie: Hey, hey, that's not bad. That's pretty good.
Samantha and I did actually, in what feels like forever ago, go to a movie theater, and we saw a commercial for somebody's TikTok account, and I was like, wow, okay. Um, well, clearly there's a lot we could dig into here, and listeners, we would love to hear about any of your favorite TikTok activists or accounts, or what you're doing on TikTok, or anything you think we should be talking about. About anything, really. You can email us at Stuff Media Mom Stuff at iHeartMedia dot com. You can find us on Twitter at Mom Stuff Podcast, or on Instagram at Stuff Mom Never Told You. We're not quite on TikTok yet, but who knows, maybe one day. Thanks as always to our super producer, Christina. I wonder if she's on TikTok; she does make good content. Do you? Do you? We'll find out. Thanks to you for listening. Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.