Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. And today we are continuing our tech kick, which feels like it's never ending, to be honest. We keep adding things.

You're right, we do, we do, but it's all fascinating stuff. And I will say this one, which is about YouTube, was very difficult to research, because Google owns YouTube. So when you type in, like, "hey, YouTube women," all you get is videos from YouTube with women in them, right? So it was tricky.

Yeah, you know what, TikTok is similar to that. When you try to find information on TikTok, it takes you to TikTok and, like, things about that subject on TikTok.

Mm hmm. Speaking of, we have an SMNTY TikTok now, don't we?

Right, yes we do, thanks to Joey, who has been on the show, who is amazing and patient. You can go check us out there at Stuff Mom Never Told You. Yes, yes, yes, so yeah, we're there as well.

Do you use YouTube a lot, Samantha?

So this is the thing. I do use YouTube a lot, but it's very specific things. I have used it to listen to music. So way back when, before licensing was a thing — and people were really, like, and YouTube got really aware of that — I would listen to songs that I couldn't find. Now, of course, Spotify has more songs, but some songs you couldn't find on Spotify. Beyonce songs were not available there, so I would go to YouTube and listen to them there. At night, I like their deep sleep sounds. I set that on a timer, and they have a lot of dark screens. So I use it like that. My partner uses it nonstop. But that's it. Like, I don't do things outside of that, huh.
I use YouTube quite a bit, actually. It's a thing that I have a love-hate relationship with, and we're going to talk about that. Because I do feel like, as a viewer who doesn't pay for, like, the expensive — I don't know about expensive, but I don't pay for the service to not have ads — I feel like they time the ads at the worst specific place on purpose. And I've even looked up, like, on Reddit: does YouTube do this purposefully? There's no real answer yet.

Yeah, I think — I know it's gotten longer. You used to be able to skip. Now you don't have that option as much. I do want to make a correction, because I do watch Hot Ones.

Oh yeah.

And that is a series I've watched a lot on there, because that's where they originated.

Yes. So I love that series, as well as Binging with Babish. Those are the two shows I will watch.

Yes, those are pretty great ones. There's a lot of great stuff on YouTube. I watch a bunch of stuff, actually. It's one of my biggest — like, probably every morning I watch a bunch of YouTube videos, which is not the case for a lot of people, I've learned. Yeah, that is like my go-to. That's my morning routine.

Content warning before we get into this one: general grossness, the same as in our past tech episodes. If you're, like, a woman or marginalized person on the internet, you know what I'm talking about. Brief discussion of very disturbing content: pedophilia, sexual assault. We're not going to get too much into detail, but just so you know. You can see our past episodes on Airbnb, Twitch, Evan Rachel Wood and tech accountability, YouTube beauty gurus, single-use hate accounts and Meghan Markle, the Amber Heard trial. It's involved in this — where, Samantha knows, I have a very hilarious, spicy outline coming up about DC Phantom.

Oh yeah. Oh, it's gonna be great.
Also, we typically try to focus on intersectional feminist issues in our episodes, for what I hope are obvious reasons, and I think pretty much everything could be that. But YouTube has a lot of issues, so we're going to discuss some of them. But just so you know, there's a lot going on here that we could get into. We didn't have time for all of it, but there's a lot. It was one of those ones where I think I was like, oh, I'll get this done in two days, and I start opening tabs, and I'm like, oh no.

Been there.

And I did want to start with a pretty brief explanation of my YouTube experience, because a good chunk of my job at — in my current — it's so confusing, because we've been acquired by so many companies, but I've essentially been working with the same group of people for a while — was YouTube-based, and I ran the Stuff Mom Never Told You YouTube. And so I'm going to be inserting a lot of personal experience in this one.

Off the top, I will say that my boss — some of my bosses — treated this much more seriously than I thought it had any warrant to be treated. Kristen, who was in the videos — and I was filming them and editing them; well, she filmed a lot of them, I was editing them primarily — she was told to, like, wear makeup and get her hair done, all of this kind of stuff. We were instructed on, like, the best thumbnails to use, which were generally, like, quote, pretty-looking thumbnails or something, so people would click on them. We did run into a couple of copyright issues, which I'll talk about more later.

As you know — and probably a lot of people know by now, because it's become sort of a conversation topic — there's a lot of videos you watch that look like it's just vlogging, where they edit and do it themselves, that are not. There's, like, a huge team behind it, and the team doesn't get credited. I was never credited on anything, I don't think. And that's just, like, how it is.
I had to go through YouTube training every year, and it's the most boring thing you can imagine, and you can't skip it.

Oh my gosh.

But I did get an award — I won a YouTube award. I still have it. It's pretty cool. I mean, it looks pretty cool. On the not-so-fun side, I did experience a lot of harassment. One of my very favorite comments I've ever gotten, as I've said before, was, "I know the producer and she's a slut." Yeah, whatever. And also, I did get doxxed. I got doxxed based on a YouTube video that we posted that wasn't that inflammatory.

And then accessibility was a thing I was really passionate about when I was doing those videos. So I would — you would listen to the video and type out the closed captions, and that was something I kind of did and didn't get compensated for, but it was really important to me to do it. But that is a big issue with YouTube, and that's a big issue with the show. And we've tried to get transcripts with middling success, because for a while we had them — before you and I were hosts, there were transcripts, and now there are not — so that's an issue as well. But yeah, that's kind of my brief encapsulation of my time. Like I said, I have a bunch of comments throughout this one.

But okay, let's start with a very quick rundown of the history of YouTube, which I found kind of surprising. So YouTube — which is an online video sharing platform that allows for commenting, liking and disliking, sharing, making playlists, all kinds of things; I feel like you know what YouTube is — was founded by Chad Hurley, Steve Chen, and Jawed Karim in early two thousand five. A little over a year later, Google purchased YouTube for a staggering one point six five billion dollars. And it started as a dating website.

Yeah. So the idea was users would upload videos of themselves talking about their dream partners.
The slogan was "Tune In, Hook Up," and I was struck once again by how humans' — and especially dudes' — desire for sex really runs so much stuff. It's wild.

The dating aspect failed, even after they offered women twenty dollars to upload videos, but the video uploading system was excellent, so they decided to open it up to any video. And they decided this in part because, after the two thousand four Janet Jackson–Justin Timberlake Super Bowl incident, they couldn't find video of it anywhere. So they thought, hey, we could be the place people put videos like that, and you can watch them. "Me at the Zoo," a short video of one of the creators at the zoo, was YouTube's first official video.

Yeah. Several studies have found that the reasons people use YouTube are, quote, information seeking, sharing information, status seeking — like that whole first thing — social interaction, and entertainment. Just by the way.

So yeah, YouTube is huge. Behind Google, it is the second most popular website in the world. As early as two thousand six, the site was getting about twenty million monthly visits. There's one billion hours of content consumed daily, with viewership about fifty-six percent male to female. That same year, the first targeted ad campaigns launched by major companies took off, and Time featured YouTube as Person of the Year. And in twenty eighteen, YouTube made twice the amount of revenue of any major television network. Numbers from twenty nineteen found that five hundred hours of content were uploaded each minute, and the company makes an estimated fifteen billion dollars in revenue annually, although a lot of that is supposed to go back to the creators. Twenty-seven percent of Americans rely on it as a news source.

Yes. And honestly, there are so many numbers we could throw at you, so many awards we could throw at you, and moments in history about YouTube. Like, did you know Rickrolling started in two thousand eight? Gangnam Style was the first video to pass a billion views.
But the basic takeaway here is, it's a big deal and it makes a lot of money. It has also been the source of a lot of conversation and controversy since its founding. One of the big things that has hounded YouTube is copyright issues: how they handle copyright, what you can upload, what will get you in trouble or will get your videos taken down. When I was working there, it was a three-strike system, but it was very Wild West in what you could get away with and what random thing would get you in trouble. Sometimes you would get mistakenly flagged, or a video would be taken down without warning or any clear reason, which may or may not have had to do with their massive copyright library. So basically, they have, like, this huge library that just, like, searches for "oh, you're playing this copyrighted item, take it down." But it does make mistakes — in my experience, it made some. Probably it's improved, I should say. It's been a good — about, like, seven years since I've been doing this job, so I bet a lot of things have changed since I've been doing it.

Another big problem YouTube has has to do with their recommended videos function, because there have been reports about it pushing things like conspiracy theories or lies, promoting violent or sexual content to children, pedophilic content, and things like we discussed in Bridget's episode about Evan Rachel Wood — that YouTube is making money off of a video she alleges depicts her sexual assault, and they won't take it down, even though she said, like, hey.

In twenty nineteen, YouTube announced they would recommend fewer videos that, quote, could misinform users in harmful ways. This came after rising concerns that platforms like YouTube were being linked to offline violence, death, and radicalization. This, of course, led to heated discussion around free speech, and confirmation in some conspiracy theorists' minds that they were being censored — like, see, here's the proof.

Right, right.
So this was after a Google engineer posted a ten-page manifesto in twenty seventeen criticizing the company's diversity initiatives, claiming it discriminated against white men. That led to the YouTube CEO's daughter asking, "Mom, is it true that there are biological reasons why there are fewer women in tech and leadership?" But many pointed out that the engineer may have been in part radicalized by YouTube and its algorithm. He was fired and became an alt-right hero.

As far as the disturbing content served to children goes, these videos often feature beloved children's cartoon characters in violent and/or sexual situations. One such video featured a woman with a Minnie Mouse head getting stuck in an escalator and bleeding profusely. It got millions of views in one day and could be viewed in the kid-friendly mode. Other videos depict Peppa Pig tricked into eating bacon, or the suicide attempt of a Paw Patrol character. A lot of these videos are discovered through autoplay or the recommended videos sidebar, so after you've watched legitimate content, you find these videos. Some tests found that a toddler went from viewing one of his favorite legitimate videos to a video depicting stomach parasites, eye gouging, and kids setting each other on fire in only a few clicks. And that video had twenty million views. And that's part of the problem: these videos get pushed because they have so many views, and that's how YouTube's algorithm works, in part.

YouTube hosts millions of hours of children's entertainment, and much of this content makes a lot of money. YouTube claims it's difficult to change their algorithm, and they largely remove these videos on a case-by-case basis, which isn't going to fix the problem. It also puts more pressure on the guardian to view the media with the child when, you know, that's something you often do to keep a child busy while you do something else.
On top of that, it's kind of complicated reporting a video on kids' YouTube, and there's no real way to make sure it doesn't show up again. You can delete it from history, you can do all kinds of things, and it still might show up again. YouTube has been releasing improved reporting tools and methods for video blocking in an attempt to improve the situation. There are plenty of articles out there about what steps you can take on your own if this is something you're worried about, but again, that's sort of putting the impetus on you to do that.

In twenty nineteen, several outlets reported on pedophilic content running alongside ads for major companies. The videos showed children in their underwear and/or their genitals, with what appeared to be pedophiles time-stamping content in the comments — basically so other users could click the time stamp to go to the part in the video where nipples were exposed or something like that — and then recommending other similar videos, sometimes exchanging numbers to trade more videos with each other. These videos have thousands, if not millions, of views, hundreds of comments, and yes, they are being monetized. You probably heard about this, because the advertisers were not happy to learn their ads were running on this content.

But to be clear, a lot of these videos, though not all of them, are pretty innocent — like girls playing Twister — and confused girls who uploaded the videos will respond to comments. "How old are you?" They'll answer, or ask a commenter what "grow" means. Many of the comments are about the child in question's beauty, or claim that the commenter is in love with them, and sometimes they request specific lighting or outfits. For a while, searching "twister girl" on YouTube autocorrected to "little girl twister in skirt," so the recommendation system was part of the issue too, serving up other videos seemingly enjoyed by pedophiles.
On top of that, while some channels have been taken down over child abuse, there are plenty — as of writing this, still up — dedicated to, quote, preteen models, girls bathing or doing stretches or yoga or swimming, things like that. YouTube enacted a policy disabling comments on videos where the comments in question are overwhelmingly inappropriate, but even so, the algorithm still would serve up these videos alongside others teeming with pedophilic comments. A lot of the comments aren't in English, as a way to get around the disabling, but several of them are.

Yes. Yeah.

Another issue: advertising. Because YouTube has a changing and not very transparent policy on what type of content and videos can be monetized on their platform, which is primarily how creators make money.

Yeah. YouTube claims it's effective at not running an ad over inappropriate content, but advertisers have pulled their ads after they ran against videos containing things like rape apologism, anti-Semitism, and terrorism. In twenty seventeen, many big advertisers told YouTube they would end their relationship if the platform didn't fix the issue. The solution decided upon was that YouTube would work more directly with advertisers to make sure their ads were only placed on the desired content.

Yes. And a quick aside here: my experience was you had very little control over the ads on your videos as a creator. You could flag topics you didn't want running, like cigarettes, but what was served on your video could be anywhere from fairly random to downright offensive, given the content. And this is an issue we encounter pretty frequently as a feminist podcast, with what advertisers think of when it comes to women. But basically, it was pretty random, with a very difficult reporting system, especially if you didn't have a specific deal with a sponsor. On top of that, this was how you made money: from a random pool of ads.
Again, if you didn't have a specific sponsor. And plenty of advertisers didn't want to run ads on feminist videos talking about abortion — or, honestly, feminist videos in general. Anyway, YouTube announces it's going to work more closely with advertisers to prevent things like the ones we've been talking about from happening again. One of their largest creators, PewDiePie, who had had preferred advertising status, lost it after he posted a video with anti-Semitic language and imagery in it. And basically, preferred advertising is, like, oh, this is top tier, you want to advertise with this creator.

On the flip side, creators were worried about how this policy and demonetization would impact their own revenue. Advertisers could opt out of specific videos, meaning if a channel about news wanted to talk about a tragedy, ads could opt out, which is a financial incentive not to talk about darker things that we need to talk about. I do understand that it's strange — like the whole Applebee's Ukraine invasion thing, where CNN was running disturbing imagery and then there was this very in-your-face Applebee's commercial. It is weird. And we've had these discussions ourselves about our darker episodes, where we don't want specific ads playing in episodes about sexual assault, for instance. But at the same time, you have to make money, so it's just, like — it's a strange situation.

Creator Philip DeFranco reported an eight percent drop in revenue after the policy was enacted, and argued that new creators would feel the worst of it. To address the issue, YouTube attempted to make it more transparent to creators which videos were being monetized and which ones weren't. There is a process for requesting a review about why a video isn't being monetized, but it's tedious and not without human bias, because it requires a human to watch the video and then kind of decide, like, I don't know. On top of that, these reviews were prioritized for bigger creators.
YouTube did update the algorithm later, and the update decreased demonetization by thirty percent. And this is something I remember from working with YouTube: every time a major company, not just YouTube, would change their algorithm or policy, we would have to have a very serious meeting about it. Like, we would sit down and be like, what does this mean? What do we have to do with our keywords? All of these things.

And also, yeah, we just kind of experienced a meeting like this, because Sminty may or may not be uploading episodes to YouTube soon — like, all our episodes — and yeah, we had to talk about it. We had to have a whole breakdown of what we're afraid of and what we want to avoid, including about having control over ads. Because we try really hard to monitor that, whether we want to or not, because there are some things that we really miss out on — like, oh dang, I really loved that sponsorship, you know what I mean. But we want to be very aware, and it's hard to do that on something like YouTube when it's so — it has its own thing. God, each social media platform is so hard to learn, because it's all so different.

Yeah, yeah. But, I mean, it's so different, but a lot of the same issues, right?

Right, the same issues, but, like, different standards. Weird. Okay.

But then the reports about pedophilic content broke, and the company was once again in crisis mode to keep advertisers from bailing. Then Logan Paul — yes, one of the most popular vloggers — posted a video of a man who had killed himself, and he was stripped of his, quote, preferred advertising status. And in the wake of this, the platform announced new policies detailing which creators were even eligible to make money. And by the way, he's still one of their top money makers, and just had a whole controversy happen because a pig that he had for clout was shown to have been severely abused.
Anyway. Creators now had to have over four thousand hours of watch time over the course of a year and a thousand subscribers to receive ads. Small creators were understandably hurt and outraged, and there was a whole host of emotional videos posted about the decision. A YouTube official said of the policy that, quote, changes will affect a significant number of channels, but that those affected were making less than a hundred dollars per year in the last year, and earning less than two dollars and fifty cents in the last month. Creators called the day the policy went into effect Demonetization Day.

Yeah, that makes sense.

Smaller accounts waited up to half a year to have their accounts reviewed to see if they could get ads. Multi-channel networks dropped huge numbers of small creators. Numerous creators quit. A lot of the smaller or demonetized content is, or was, created by marginalized folks or around less commercial topics, and many argue that YouTube is losing what made it YouTube.

Yeah, because, like, it's different now. But when it started — like, when I was reading the history of YouTube, they noted the first time a trailer from a company appeared on YouTube. It didn't used to be what it is now. It used to be, like, a much more kind of hodgepodge, random assortment of videos from various creators, and now, you know, it's got a lot of, like, movie trailers or clips from these big companies, music videos, things like that. Which I don't think there's anything necessarily wrong with, but if you're losing all of these small creators, then it's just like the scale is tipping one way.

And just for the record, Sminty had hundreds of thousands of subscribers, tens of thousands, if not millions, of views on every video, and we made less than a hundred dollars a year.

Right. I think you still have that. We still have, like, around that many subscribers, which is hilarious, because there's not been a video we posted since, I think, Emily introduced themselves as a new host.
I think that's the last video. Yeah. It's been a whole process finding out who owns that channel now — that's a whole different story.

It is, it is. But I just wanted to put that in there, because, like, we had the backing of a pretty big company — there was a whole team dedicated to this, pretty much — and we got a lot of views, and we didn't make that much money. So, mm hmm.

But yeah. Okay, so a lot of the things we've been talking about do impact women, but let's talk about women and YouTube specifically. There are conflicting numbers when it comes to gender differences in viewership on YouTube. One source reports sixty-two percent of users are male and seventy-eight percent of men use it in the US, compared to thirty-eight percent and sixty percent for women. Google's own numbers claim that it's more like fifty-fifty, or at least closer to that. Younger people are more likely to use YouTube, and location — urban versus rural — is a big factor as well. YouTube CEO Susan Wojcicki said one of the reasons for tech's lack of women is its reputation as being, quote, a geeky male industry — which did not go over well.

Yeah. One of the ways these moves around advertising have been theorized to impact women is around beauty. Both the pressure on the creator to put on makeup and wear nice, perhaps revealing clothes — which are things that are time consuming and often expensive — but also to make videos about beauty, like makeup tutorials, because they know those can be monetized. They might be pressured to use a sexualized thumbnail, like we were. There's nothing inherently wrong with these videos, these makeup tutorials — I know people who love them. But if women are being pressured into doing them for financial reasons and being pressured to look a certain way, that's an issue. We've discussed before how the beauty industry has pushed standards to sell products, telling women they have to look a certain way, and that is partially at play here.
Yeah, because, you know, makeup can be great. We just need to be clear — we need a clear picture of what's going on here, because a lot of women reported feeling like they had to do this, because it's the only way they could get ads. Many women creators report criticism around their appearance: their weight, clothes, and makeup. Further, a lot of the top channels created by women have to do with stereotypical, more feminine topics, like cooking, and there's a bunch of research about this. And again, it seems to be that they feel there's nothing wrong with it, but they feel like this is the only way they can make money — which, I think there is something wrong with that.

Right, right. Although, I mean, we could come back and have the conversation about how, also, if you're a woman creator, you can only be for women.

Yes. So that's — I mean, we run into that a lot. So that's a whole other conversation.

And by the way, women are more likely to be subjected to trolling, comments, and harassment, even stalking and threats, and it increases with every intersection: sexual orientation, gender identity, disability — and having pain dismissed, not looking sick enough, not looking pretty enough.

Wait, there was even a study specifically on YouTube comments that showed gender differences in how we talk about intoxication — women are often sexualized in fail videos. Women creators have reported fear after accounts have posted hate videos about them and called for their followers to attack them, and this travels onto other platforms, by the way. I've seen this. It's really interesting.

In twenty nineteen, YouTube rolled out updated harassment policies, but creators say they haven't mitigated the issue. A twenty twenty-two report found that harassment against women is not only alive and well on YouTube, it's flourishing. Women creators who have gone viral have described the deluge of harassment they encounter.
Some even cited the Amber Heard–Johnny Depp trial as emboldening misogynist creators and allowing them to amass huge numbers of followers, and said that it helped normalize a toxic level of hate towards women. And I don't think we talked about this yet, but the whole thing with Megan Thee Stallion — that also brought up a whole lot of misogynistic trolls that got some notoriety. And similarly, YouTube started de-ranking Meghan Markle hate channels. People like men's rights activist Andrew Tate have amassed millions in revenue, though he was recently banned — but, by the way, he's been making a lot of money on the backs of harassing women, and just recently got banned. Women creators feel that because YouTube is monetizing these accounts and not disciplining their top male creators who post misogynistic content, it empowers commenters to harass women. So many women have left because of this. The report concluded: misogyny is alive and well on YouTube, and videos pushing misinformation, hate, and outright conspiracies targeting women are often monetized. And this is, like, a within-the-last-year study. Of note, all the studies mentioned that there was a lack of data and research around non-binary folks, so that is something that's missing.

And I want to include this. These are actual guidelines from Google's anti-harassment policy. Quote: Here are some examples of content that's not allowed on YouTube. Repeatedly showing pictures of someone and then making statements like "look at this creature's teeth, they're so disgusting," with similar commentary targeting intrinsic attributes throughout the video. Targeting an individual based on their membership in a protected group, such as by saying, "look at this filthy [slur targeting a protected group], I wish they'd just get hit by a truck."
Targeting an individual and making claims they are involved in human trafficking in the context of a harmful conspiracy theory, where the conspiracy is linked to direct threats or violent acts. Using an extreme insult to dehumanize an individual based on their intrinsic attributes — for example, "look at this dog of a woman. She's not even a human being. She must be some sort of mutant or animal." Depicting an identifiable individual being murdered, seriously injured, or engaged in a graphic sexual act without their consent. Accounts dedicated entirely to focusing on maliciously insulting an identifiable individual.

But it has to be the entire account. Yeah.

Well, and so I wanted to include this, because it feels like you're trying to teach a child how to, like, behave — like, we have to say, hey, here's an example. I was reading it like, whoa. Okay. I mean, there's a part of you that's like — has there ever been anybody who read these things and went, oh, I see, I'll stop doing that? But anyway, okay.

When it comes to money making: in twenty twenty-one, only one creator on YouTube's top-ten earners list was not a man, and that was seven-year-old Nastya. She made an estimated twenty-eight million dollars. And this is in keeping with the trends of recent years for YouTube: one woman on the list, or no women at all.

Wait, she's a seven-year-old? What is her content?

Yeah. Because it's definitely not other seven-year-olds watching it. Well, maybe — I don't know, because there's also a ten-year-old boy on there, I think. I don't know. Yeah.

You've also probably heard of the credibility gap when it comes to women and science or news content. Many women creators who are scientists have posted the mansplaining comments they get — I'm sure you've seen some of them or heard some of them — or the sexualization or outright harassment that they receive. Viewership on STEM-related videos skews male.
In some cases pretty drastically. This impacts interest in, and potential pursuit of, STEM topics and careers, because several studies have found that engaging with STEM content, especially at a young age, can ultimately lead to a STEM career. Studies have also found that, compared to an equally scientifically curious man, women are twenty-six percent less likely to watch a science video on YouTube.

Science creator Emily Graslie famously published a YouTube video called "Where My Ladies At?", where she just read some of the comments her videos received. And if you've never watched her videos, they're pretty sweet — she works in a museum and she talks about things in the museum — and it was just all of this very sexualized, hateful stuff she receives.

Oh no. Yeah. I was thinking about one of the YouTube channels that my partner really loves — like, gets me to watch: Simone Giertz. Someone's going to correct me and tell me I'm completely wrong, because she's really, really famous. And her whole thing is trying to create unusual hacks in her house with whatever tools she's gotten. She's gotten huge, so she's gotten really cool tools. And I think one she did was a chair for her dog to sit next to her at the — like, it's a desk chair, so something big. And then creating something with tampons — I forget already. But, like, she has a lot, and she is technically a tech slash STEM creator who has amassed millions of viewers, so I would be interested to see what her comments look like with her popularity.

Yeah.
I think one of 565 00:35:52,160 --> 00:35:54,880 Speaker 1: the things that really disheartened me in this research was 566 00:35:54,960 --> 00:35:56,799 Speaker 1: that back in the early days when Cristen and I 567 00:35:56,840 --> 00:36:00,440 Speaker 1: were doing YouTube, we worked with a lot of big 568 00:36:00,480 --> 00:36:04,200 Speaker 1: women on YouTube, because that's like how you kind of 569 00:36:04,200 --> 00:36:06,759 Speaker 1: cross promoted and got more viewers, and a lot of 570 00:36:06,800 --> 00:36:09,239 Speaker 1: the women we worked with who had like millions of 571 00:36:09,320 --> 00:36:13,520 Speaker 1: views and followers left YouTube. Like, they would show up 572 00:36:13,680 --> 00:36:15,200 Speaker 1: in a lot of the articles I was reading, more 573 00:36:15,239 --> 00:36:17,960 Speaker 1: like, here's why I left. And they're like big 574 00:36:18,120 --> 00:36:24,520 Speaker 1: creators, and so I can only imagine what it's like for the smaller creators. Um. 575 00:36:24,600 --> 00:36:28,680 Speaker 1: And also, didn't you show me that video, Girlfriend Reviews? Yes, 576 00:36:29,160 --> 00:36:33,359 Speaker 1: and they had to like respond to a lot 577 00:36:33,400 --> 00:36:38,440 Speaker 1: of hate. Yeah. Yeah, they had to respond 578 00:36:38,440 --> 00:36:41,080 Speaker 1: to a lot, a lot of hate in general. But 579 00:36:41,120 --> 00:36:44,000 Speaker 1: apparently they're doing huge stuff on Twitch. But the whole 580 00:36:44,239 --> 00:36:47,240 Speaker 1: premise is pretty much, the girlfriend is watching her boyfriend 581 00:36:47,320 --> 00:36:49,839 Speaker 1: play these games and does a commentary, and they grew 582 00:36:49,920 --> 00:36:53,640 Speaker 1: into a huge success and are, I think, living 583 00:36:53,680 --> 00:36:57,040 Speaker 1: off of the content as content creators, because they have 584 00:36:57,200 --> 00:37:00,640 Speaker 1: made sponsor deals. Um. But yeah, they had to 585 00:37:00,640 --> 00:37:04,279 Speaker 1: go through some things over several video games, including The 586 00:37:04,320 --> 00:37:07,640 Speaker 1: Last of Us Part II, because they loved it. Mm, much 587 00:37:07,840 --> 00:37:13,080 Speaker 1: like you. We always have to bring it back to 588 00:37:13,120 --> 00:37:17,319 Speaker 1: The Last of Us or Star Wars here. So 589 00:37:17,360 --> 00:37:20,600 Speaker 1: a part of the issue is production budget and staff. Statistically, 590 00:37:20,800 --> 00:37:24,360 Speaker 1: male-run shows are able to secure more funds and staff. 591 00:37:25,239 --> 00:37:28,280 Speaker 1: Another is that women who are scientists often have families 592 00:37:28,320 --> 00:37:30,920 Speaker 1: and have more of the share in taking care of children, 593 00:37:30,960 --> 00:37:36,040 Speaker 1: which means, yes, less time for actually making content and videos. Yeah, 594 00:37:36,080 --> 00:37:40,879 Speaker 1: which is something we heard during the pandemic with why 595 00:37:40,920 --> 00:37:45,040 Speaker 1: there were fewer scientific papers turned in by women. Um. 596 00:37:45,080 --> 00:37:47,640 Speaker 1: And also, have you ever heard of the saying, never 597 00:37:47,719 --> 00:37:56,800 Speaker 1: read the comments? Well, 598 00:37:57,320 --> 00:38:01,440 Speaker 1: researcher Inoka Amarasekara, and I apologize if I 599 00:38:01,480 --> 00:38:04,640 Speaker 1: butchered your name, I could not find a pronunciation anywhere,
uh, 600 00:38:05,080 --> 00:38:10,680 Speaker 1: but um, they did read the comments and came out the 601 00:38:10,760 --> 00:38:14,080 Speaker 1: other side with a published paper about sexism on YouTube. 602 00:38:15,280 --> 00:38:19,160 Speaker 1: They looked at over twenty three thousand YouTube comments, specifically 603 00:38:19,200 --> 00:38:21,480 Speaker 1: comparing the treatment of men and women when it came 604 00:38:21,520 --> 00:38:25,760 Speaker 1: to science content, and no surprise, the women were treated 605 00:38:25,800 --> 00:38:31,080 Speaker 1: more harshly, with a higher share of negative comments versus six percent for men. Um, 606 00:38:31,200 --> 00:38:33,040 Speaker 1: I've said it a million times on the show, 607 00:38:33,200 --> 00:38:35,960 Speaker 1: but we once did this experiment at work where we 608 00:38:36,040 --> 00:38:39,640 Speaker 1: had a male host and a female host do the 609 00:38:39,680 --> 00:38:44,000 Speaker 1: exact same video, and we compared the comments, and it 610 00:38:44,080 --> 00:38:47,279 Speaker 1: was very clear, very stark, which one they 611 00:38:47,320 --> 00:38:51,400 Speaker 1: believed was smarter, and all of the physical comments that 612 00:38:51,440 --> 00:38:54,080 Speaker 1: the woman got. And yeah, the women in the study 613 00:38:54,120 --> 00:38:57,640 Speaker 1: got a lot more sexually charged comments or comments about 614 00:38:57,640 --> 00:39:01,960 Speaker 1: their appearance in general. And the way YouTube comments work 615 00:39:02,440 --> 00:39:05,440 Speaker 1: is the most controversial ones rise to the top. And 616 00:39:05,480 --> 00:39:09,520 Speaker 1: in fact, I never personally would do this, but Cristen 617 00:39:09,600 --> 00:39:12,280 Speaker 1: would sometimes be like, let's do something really controversial, because 618 00:39:12,280 --> 00:39:15,279 Speaker 1: it will generate, like, it'll make you go to 619 00:39:15,360 --> 00:39:17,799 Speaker 1: the top faster and make you get more comments. And 620 00:39:17,840 --> 00:39:21,760 Speaker 1: that's just how YouTube's algorithm works, which, as I believe 621 00:39:22,160 --> 00:39:26,200 Speaker 1: one big YouTuber put it, does not facilitate healthy, 622 00:39:26,560 --> 00:39:31,759 Speaker 1: helpful conversation. Right. I feel like that's, that's for everything, 623 00:39:31,800 --> 00:39:36,680 Speaker 1: including, uh, news, like getting the worst, the most controversial 624 00:39:36,680 --> 00:39:38,680 Speaker 1: stuff out there first, because that's going to get the attention, 625 00:39:39,000 --> 00:39:43,360 Speaker 1: which sucks. Yes. So YouTube is trying to combat this, 626 00:39:43,520 --> 00:39:47,120 Speaker 1: to varying degrees of success. In twenty eighteen, they launched the 627 00:39:47,160 --> 00:39:50,000 Speaker 1: first hashtag Women to Watch as part of the Next 628 00:39:50,120 --> 00:39:53,080 Speaker 1: Up program. In the company's own words, this was the 629 00:39:53,160 --> 00:39:57,399 Speaker 1: first time they had focused on empowering female voices. Um, 630 00:39:57,440 --> 00:40:01,360 Speaker 1: in twenty sixteen, they formed a year-long partnership between women 631 00:40:01,440 --> 00:40:05,759 Speaker 1: creators and the UN to advocate for gender equality. Um, 632 00:40:05,840 --> 00:40:08,000 Speaker 1: and other organizations have been trying to fight this as well.
633 00:40:08,040 --> 00:40:13,320 Speaker 1: A coalition, Uplift, formed to combat sexual abuse, 634 00:40:13,400 --> 00:40:19,319 Speaker 1: emotional manipulation, and other forms of violence in the YouTube community. Yeah. Yeah, 635 00:40:19,360 --> 00:40:23,799 Speaker 1: I mean, with every tech topic we do, because of 636 00:40:23,800 --> 00:40:25,759 Speaker 1: the show we are, we focus on the negative. There 637 00:40:25,760 --> 00:40:27,680 Speaker 1: are a lot of positives about YouTube. It can be 638 00:40:27,719 --> 00:40:30,200 Speaker 1: a great place to find community, can be a great place 639 00:40:30,239 --> 00:40:34,960 Speaker 1: to share ideas, to educate, all kinds of things. But 640 00:40:35,080 --> 00:40:41,160 Speaker 1: there's just so much fluctuation. I can't 641 00:40:41,200 --> 00:40:43,880 Speaker 1: say for sure the situation is getting better. Like, people 642 00:40:44,560 --> 00:40:49,839 Speaker 1: are talking about it, but, um, it also kind 643 00:40:49,840 --> 00:40:53,560 Speaker 1: of annoys me in so many of these tech 644 00:40:53,640 --> 00:40:56,600 Speaker 1: episodes where it's like, the creators are the ones making 645 00:40:56,680 --> 00:41:04,279 Speaker 1: YouTube money, and they're not getting paid anything and are 646 00:41:04,360 --> 00:41:07,319 Speaker 1: forced to leave, or, I don't know. It's just like 647 00:41:08,040 --> 00:41:09,760 Speaker 1: there's a lot of things that need to be figured 648 00:41:09,800 --> 00:41:14,200 Speaker 1: out in these situations. I think in general, content creation and 649 00:41:14,239 --> 00:41:16,919 Speaker 1: content creators are such a new thing. We've talked about 650 00:41:16,920 --> 00:41:19,719 Speaker 1: it before. We've talked about influencers and the good and 651 00:41:19,760 --> 00:41:22,160 Speaker 1: the bad. But yeah, there's a lot of 652 00:41:22,160 --> 00:41:26,280 Speaker 1: things that the law, policies, and just human rights protections 653 00:41:26,360 --> 00:41:30,440 Speaker 1: haven't caught up with as fast as this medium has grown. Right, 654 00:41:31,080 --> 00:41:34,880 Speaker 1: it is complicated. Like, to YouTube's credit, I get that 655 00:41:34,960 --> 00:41:37,680 Speaker 1: it's complicated, because, you know, you don't want an advertiser 656 00:41:38,440 --> 00:41:40,360 Speaker 1: to be angry about what their ad is running on, 657 00:41:40,400 --> 00:41:43,880 Speaker 1: but you also don't want a creator to not be 658 00:41:43,960 --> 00:41:47,000 Speaker 1: able to make money and leave. But right now it's 659 00:41:47,000 --> 00:41:51,879 Speaker 1: not working, at least that's my view of it. So 660 00:41:52,160 --> 00:41:56,400 Speaker 1: I would love it if anybody listening has more recent experience 661 00:41:56,400 --> 00:42:01,160 Speaker 1: with YouTube, has any thoughts or resources or numbers, anything 662 00:42:01,200 --> 00:42:03,560 Speaker 1: like that, because this was one where I kind of 663 00:42:03,560 --> 00:42:06,680 Speaker 1: got overwhelmed. There's a lot more we could talk about 664 00:42:06,719 --> 00:42:09,560 Speaker 1: with this, and a lot is changing. Some of those 665 00:42:09,600 --> 00:42:12,680 Speaker 1: surveys I mentioned, even though they were recent, I bet 666 00:42:12,719 --> 00:42:15,160 Speaker 1: it's shifted. I bet the conversation has shifted since then.
667 00:42:16,080 --> 00:42:20,880 Speaker 1: Um, so if there's anything like that, uh, you can 668 00:42:20,920 --> 00:42:23,879 Speaker 1: email us at Stuff Media Mom Stuff at iHeartMedia 669 00:42:23,960 --> 00:42:26,399 Speaker 1: dot com. You can find us on Twitter at Mom 670 00:42:26,440 --> 00:42:30,600 Speaker 1: Stuff Podcast, or on Instagram and TikTok at Stuff Mom 671 00:42:30,680 --> 00:42:34,560 Speaker 1: Never Told You. Thanks as always to our super producer Christina, 672 00:42:34,960 --> 00:42:37,840 Speaker 1: thank you, and thanks to you for listening. Stuff Mom 673 00:42:37,920 --> 00:42:39,400 Speaker 1: Never Told You is a production of iHeartRadio. For 674 00:42:39,480 --> 00:42:41,120 Speaker 1: more podcasts from iHeartRadio, you can check out 675 00:42:41,120 --> 00:42:43,760 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite 676 00:42:43,760 --> 00:42:44,080 Speaker 1: shows.