Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland, an executive producer with iHeartRadio, and how the tech are ya? So recently, the YouTube content moderation AI made a notable mistake. Artist and writer Kris Straub, whom I once interviewed many years ago for an article about how webcomics work, has made a lot of different stuff over the years, and one of his creations is a video series called Local 58, which is described on the YouTube channel as, quote, "analog horror at four seventy-six megahertz," end quote. The series relies on a lo-fi VHS aesthetic, which has, you know, been a popular style in the horror realm for the better part of a decade now, and I'm sure you've seen plenty of examples, such as the game series Five Nights at Freddy's or the V/H/S horror anthology series. There are tons of these kinds of throwbacks to, like, nineteen-eighties-era looks and aesthetics for horror. Anyway, one of the episodes Straub created has the title Show for Children, and the description on the video reads "not for children." The whole shtick here is that Straub has gone for a beyond-generic title for this short. In fact, in the context of the Local 58 universe, it's that the television station Local 58 has Show for Children listed on its lineup, and that's just the title. Like, it's kind of a commentary on this sort of very generic, low-budget children's program. And the short itself follows a cute little cartoon skeleton walking through a cemetery at night with the moon overhead as a warbly soundtrack plays in the background. The skeleton comes upon an open grave, looks into it, and sees a less cartoonish skeleton inside it, then continues on its way, and gradually the music fades out. We get the sound of wind as this cartoon skeleton looks into other open graves.
Speaker 1: It sees a little weird skeletal creature in one that kind of mewls like a cat, and sees a bottomless hole in another, and apparently goes inside it and is walking in, like, a cave with its arms folded as if the skeleton is freezing, and it comes upon another open grave within this cave and lies down inside of it, looking up through the grave to the moon, and then transforms into a more realistic skeleton. And that's the end. Now, on the whole, it's a pretty tame but creepy short video. But Straub knew what audience he was aiming for, and it wasn't kids. And so even though the title read Show for Children, the description said otherwise. And moreover, Straub went a bit further. He used the content settings in YouTube to tag the cartoon as an eighteen-and-up video, and that was that for a while. Anyway, flash forward to this week in twenty twenty-two, when Straub discovered that YouTube's content moderation AI had mysteriously changed this video's setting from eighteen and up to being suitable for children. Now, that also means that his short could potentially appear in the YouTube Kids app. More on that later. Now, clearly that's not what Kris Straub wanted. He went on to Twitter to reveal this problem, including screenshots of some of the issues. He had intentionally flagged his video as being for mature audiences. The YouTube moderation AI reversed that setting without Straub's input or consent. Moreover, Straub found that he couldn't change it back. His ability to switch the video away from being flagged as suitable for all ages had been grayed out. He could not switch it to eighteen and up again, so his only option was to file an appeal with YouTube, which he did. Now, obviously there are a lot of questions that pop up because of this incident. So why would an automated content moderation tool be able to reverse an age-gated restriction on a YouTube video?
Speaker 1: You know, it makes sense if the moderation algorithm detected that a video with no age restriction should have one, right? Like, if a video detection system determines this video has got some objectionable material in it, it makes sense that it could flag that video and change the setting so it's not listed as appropriate for all ages. You could understand that maybe the content creator could say, oh no, this is a mistake, review it, you'll see that this content is in fact appropriate for everyone, and maybe get that decision reversed. You would think that that would be the only way this kind of AI would make an error, right? Like, it would be overprotective, rather than taking a video that had been flagged as eighteen and up and relisting it as suitable for all ages. So what is going on here? Well, to get to the bottom of all that, we're gonna have to talk about YouTube's shoddy history with content for kids, which goes back pretty much to the founding of YouTube. We're also gonna have to talk about the Children's Online Privacy Protection Act, also known as COPPA, C-O-P-P-A. In fact, that's going to be the meat of most of this episode. And we'll talk about the ridiculous tendency for lawmakers to deem anything that's animated as intended for kids. That seems to be kind of a go-to, which you would think by now people would realize is just not the case. In fact, I mean, I'm going to talk about this again later, but animation has a very long history of being a medium for different age groups. I mean, there are old cartoons that were never meant to be shown to kids. And we're also going to talk about the business of online video, because that plays a big part in it too. So let's start with COPPA, because that actually predates the launch of YouTube. You know, I'm sure a lot of people heard about COPPA around twenty nineteen and twenty twenty, because that's when it was really having a huge impact on YouTube.
Speaker 1: But in fact, COPPA was initially passed in nineteen ninety-eight and went into effect on April twenty-first, two thousand, so that's five years before YouTube came out. And COPPA was intended to protect children online using various measures, including restrictions on how sites could harvest and use personal data, as well as requiring sites to seek verifiable consent from a parent or a guardian before a child was allowed to use them. The law specifically protects kids under the age of thirteen. So if you've ever wondered why so many social network sites require users to be thirteen or older, this is why. It's because creating a service that complies with COPPA is really darn hard to do. It requires a lot of oversight. I mean, just having a verifiable way for parents to consent is tough on its own, right? Like, how can you verify that it was in fact a guardian or parent who gave consent for a child to access that material? If you've ever encountered any site that asked you if you were of the appropriate age, you've probably noticed that all it takes is clicking the yes button, or the I'm-over-eighteen button, or whatever it might be. That's not verifiable, right? That's just taking your word for it. Anyone could click an affirmative answer even if they didn't meet the criteria. There's nothing stopping them. So building an age gate that verifies answers, or at least is capable of verifying answers, is way more difficult. And it also means that you potentially are reducing the amount of traffic to that site or service. So a lot of companies just opt out of doing it entirely and say, hey, our stuff is not meant for kids under thirteen, so don't even try. Doesn't mean that kids can't try, and it doesn't mean that those sites prevent kids from trying, but, you know, they're at least trying to take on the appearance of not being intended for younger children, in an effort to not be lumped in as something that would be covered under COPPA.
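To make that concrete, here's a minimal sketch in Python. It's purely illustrative, not any real site's code; the function names and the consent methods listed are assumptions for the example. It just shows why a click-through age gate verifies nothing, while COPPA-style verifiable consent requires some out-of-band proof.

```python
# Illustrative sketch only -- not any real site's implementation.

def click_through_gate(claims_to_be_old_enough: bool) -> bool:
    # A self-reported checkbox: nothing stops a nine-year-old from clicking "yes".
    return claims_to_be_old_enough

def verifiable_parental_consent(consent_record: dict) -> bool:
    # Hypothetical placeholder for the hard part: some out-of-band proof that a
    # parent or guardian actually gave consent (a signed form, a card check, etc.).
    acceptable_methods = {"signed_form", "payment_card_check", "video_call"}
    return (consent_record.get("method") in acceptable_methods
            and consent_record.get("reviewed", False))

# Anyone passes the first gate; only a documented, reviewed record passes the second.
print(click_through_gate(True))                              # True, no matter who clicked
print(verifiable_parental_consent({"method": "checkbox"}))   # False
print(verifiable_parental_consent({"method": "signed_form", "reviewed": True}))  # True
```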
Speaker 1: You could have a very long discussion there. It's almost like theater, right? Like, it's like security theater: giving the appearance of propriety, in this case, but not actually being proper. We could have a week-long discussion about that, but, you know, just keep it in the back of your mind. So COPPA also specifically focuses on sites and services that are specifically directed to children. Moreover, the courts have interpreted COPPA to treat user-generated channels and user-generated content as if the person who created the content owned and operated the site it was on. So, in other words, if you made a channel that a reasonable person would interpret as having content directed to children, well, then your channel would be subject to COPPA, and you could be treated as if you owned and operated the site, though the site itself could also be facing its own issues, which YouTube has faced numerous times in the past. Now let's talk about YouTube in particular. As I'm sure we all know, YouTube, and Google in general, makes most of its revenue through selling ads against content, and we probably all know that the most valuable advertisements online are targeted advertisements. Those just have a higher chance of making an impact on a consumer. Here's a very simple example. If you happen to know that the person who's visiting your channel and watching a particular video has been shopping for shoes recently, serving up an ad about shoes could have a huge payoff. The person might respond to that ad, and that ad becomes more valuable. But to target ads, you have to learn about the audience first. You can't target an audience with advertising unless you know more about them. You need to know stuff like where they're living, because you don't want to serve them ads for things that aren't available in their area. You need to know what they like and what their habits indicate about them.
Speaker 1: You might also want to know stuff like how old they are, what gender they identify with, that kind of thing. Now, it's one thing to track personal data that belongs to adults; it's an entirely different matter when we're talking about children. That's when COPPA comes into play, and that's when you really need that verifiable adult consent. Okay, we're gonna take a quick break. When we come back, we're gonna talk more about COPPA and YouTube.

Speaker 1: Okay, so let's talk about COPPA and YouTube. While a channel creator might make stuff directed to children, that channel creator isn't necessarily fully responsible for tracking the data of that audience, right? Like, you can get some insights on your audience through the creator side of YouTube. Like, when you actually own a channel and you're uploading videos, you get insights into your audience to some degree, but it's not like you have all the tools available to you to make use of that information the way YouTube does. Like, YouTube really takes the reins on that, right? It's the platform's responsibility to secure advertising against the content. They're better qualified to do it and able to do it on a much larger scale, so you aren't fully responsible for that. However, you know, if you have comments or whatever enabled on your video, people might be leaving information that reveals personal data about themselves, like their name, or where they're from, or any sort of information about themselves, and you could be keeping track of that, right? So it wouldn't necessarily just be YouTube; you could be tracking that too. So if a content creator has enabled ads against their content, if they're trying to monetize their videos, that brings up a ton of red flags when you view it in light of COPPA. But hey, what determines if a site or service or channel is directed to children in the first place? Like, what is it that makes us say, yes, this channel is meant to be shown to children?
Speaker 1: Well, there's no cookie-cutter answer for that. Ultimately, if it came down to it, the FTC, or Federal Trade Commission, would have to rule on whether something is actually directed to children, using various criteria, and those criteria can include things like the subject matter involved. Right? Like, if the video channel is all about home appraisals, chances are the FTC is gonna say, well, clearly this isn't directed to children, even if you're using cartoons to illustrate how home appraisals work. The subject matter alone is not going to appeal to children. It's not directed to them. It's not really a question. But the visual style of the content does matter, and that includes whether or not there's any use of animation. Animation is one of the criteria. So again, I know that my animation fans out there all feel this way: there's tons of animation that's not meant for children. It's not necessarily incredibly offensive or whatever, but there's tons of it. Animation is like any other medium, right? There's animation out there that's for different audiences, including audiences that don't include children. However, animation is one of the criteria that the FTC might look at when determining whether a channel is directed to children or not. The type of music being used in a video can also be a factor, right? So if the music is very jaunty and fun and there are animated characters, those things could be elements that the FTC takes into consideration to say, like, well, you know, even if you don't intend for this to be seen by children, it's designed in such a way as to appeal to children. So it gets kind of gray and vague. If your videos have the presence of children in them, or people posing as children in them, that can also be a factor. There are a lot of different elements at play here. But hang on a minute, I hear you say, isn't this too broad of a brush? And yeah, each of these taken by itself definitely would be too broad a brush.
Speaker 1: If only one of these checkboxes had to be checked for the FTC to say, well, clearly this channel is directed to children, or this website, or this service, or whatever, then that would be ludicrous. It would be way too much overreach, right? And so you would have to definitely dig further down. There have to be multiple factors here, and it has to be reasonable to assume that the material itself was specifically created in order to appeal to children and attract children to it. So this is not a new thing. This is something that, you know, has been an issue for ages, and for some cycles of content it's more prevalent than others. Right? Like, you can look at stuff that has been created over time that, you know, has like a retro appeal or a nostalgic feel to it, where you could, on one hand, say, oh, I can see how this could be considered directed to children. I'm thinking of things like Robot Chicken, a series where toys and animated figures are used to tell jokes that are not meant for kids. Like, the material in those shows is not meant for kids, and it's meant to appeal to an age group of people who grew up playing with, like, action figures and stuff. But the fact that it is animated and that it's using toys, those are things that the FTC might look at and say, you know, it's hitting some of the criteria we consider for material that was meant to be directed to children. So yeah, there's a real sticky situation there. And you can see how, by using some of the criteria that the FTC uses to determine if something is in fact directed to children, a short like Kris Straub's might get lumped in there, right? It had a cute cartoon skeleton and some fun music in the beginning. Never mind that the short is sort of a dark parody of those kinds of cartoons, and it is in fact leveraging that retro aesthetic to convey an unsettling and creepy story.
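Just to illustrate the logic here, and not the FTC's actual rules: the agency doesn't publish a scoring formula, and the factor names, weights, and threshold below are all invented for the example. The point is simply that no single signal, like animation, should be decisive on its own; it takes several together before "directed to children" becomes a reasonable call.

```python
# Toy illustration only -- the FTC does not use a published scoring formula.
# Factor names, weights, and the threshold here are invented for the example.

FACTOR_WEIGHTS = {
    "animation": 1,
    "jaunty_childlike_music": 1,
    "children_or_child_actors_present": 1,
    "child_oriented_subject_matter": 2,
    "marketed_or_advertised_to_kids": 2,
}

def looks_directed_to_children(traits: set, threshold: int = 4) -> bool:
    # Add up whichever signals are present; require several before concluding
    # the content targets kids, so one factor alone can never tip the scale.
    score = sum(w for factor, w in FACTOR_WEIGHTS.items() if factor in traits)
    return score >= threshold

# A dark parody like "Show for Children": animated, with jaunty music up front,
# but adult subject matter and no kid-targeted marketing.
print(looks_directed_to_children({"animation", "jaunty_childlike_music"}))        # False
print(looks_directed_to_children({"animation", "jaunty_childlike_music",
                                  "child_oriented_subject_matter",
                                  "marketed_or_advertised_to_kids"}))             # True
```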
Speaker 1: You could argue that just because something looks similar to material produced for children, it should qualify as intended for children. I don't think that's a good argument. I don't think you can argue that just because something resembles something else, that makes it the same thing. I think that's a dangerous argument to make. But you can start to see where YouTube's AI started to make mistakes. Now, if you're someone like Straub, there's a very good reason you would want your video to not be flagged as appropriate for all ages if in fact the video isn't. See, if the FTC determines that your service or site or whatever is directed to children but is violating COPPA on some level, then you could face a maximum fine of more than forty thousand dollars per violation. That's a heck of a fine if you're a content creator, right? Like, you get hit with that because your video was deemed to be inappropriate for children and yet designated as appropriate for children. That explains why Straub is so frustrated that YouTube has overruled his own designation that the video is meant for older audiences. Straub could be on the hook for a large fine if a complaint were brought against his channel. And now let's talk about YouTube's own lousy history of curating content for children. It is no surprise that there is a lot of content on YouTube that's unsuitable for kids. I would argue there's a lot of content on YouTube that's unsuitable for anyone. I see examples of it all the time. But we're gonna put that aside for now, because that's just grumpy-old-man-yells-at-cloud material. So YouTube launched back in two thousand five, and it did not take very long for folks to find stuff that was not appropriate for kids and yet appeared to be aimed at kids. Now, that pretty much has been part of YouTube's landscape throughout its entire history, but for a very long time that issue was largely ignored, or at least not talked about very much.
Speaker 1: In two thousand fifteen, Google released an app for Android and iOS called YouTube Kids. The app was meant to give parents more control over what their kids could watch on various devices like tablets or smartphones or smart TVs, and the content was curated. The idea was that only content suitable for children under the age of thirteen would make it through a first-pass filter into YouTube Kids, so that parents could be sure that whatever their kids were watching on this particular version of YouTube would not be inappropriate. But on top of that, parents would actually have additional parental controls so that they could restrict it further if they wanted to. The curated-content issue really came under the spotlight in two thousand seventeen with the onset of what was later called Elsagate. That name references the character Elsa in Disney's Frozen series, and "gate" is shorthand for a scandal, and they used Elsa because the character of Elsa would figure prominently in tons of videos that were being created for YouTube and YouTube Kids. Some of those videos were live action; in fact, a lot of them were just someone in an Elsa costume and wig. Some of the shorts were animated. Sometimes the animation was actually not terrible; most of it was super cheap and limited and shoddy, and it was like a fire hose of content had just been dumped on YouTube all at once. The consequences of that would end up being pretty serious, both for YouTube and for content creators. I'll explain more after we come back from this quick break.

Speaker 1: So some of the videos that were uploaded during Elsagate were just weird but otherwise mostly innocent. Others, you know, just had a bunch of people in cheap costumes running around doing weird sketches, often without any dialogue at all. But other videos included material that was really unsuitable for kids, including sexually suggestive content, violence, other disturbing content, you know, cruelty, that kind of thing.
Speaker 1: And the videos would smash together tons of popular characters, and there was no question whatsoever that the people who were making these videos weren't, you know, bothering to actually license those characters. So this was a lot of folks infringing upon intellectual property, like, at a truly absurd level. And you would have videos with, you know, characters like Spider-Man talking to a pregnant Elsa while you see the Joker running around in the background. It was just like a smorgasbord of, you know, popular characters that kids would recognize. And like I said, some of these videos were just bizarre, with no real rhyme or reason to them, and they had, like, you know, background music, but they wouldn't have dialogue. And obviously, leaning on popular children's characters while not using words meant that those videos could become popular all around the world. There was no language barrier to overcome, so you didn't have to worry if the kids in, you know, whatever region couldn't understand, say, English or French or whatever, because there was no speaking or language at all. And many of the videos became more bizarre, not necessarily out of a desire to warp kids' minds, but rather driven by data. See, content creators were paying attention to topics that were popping up in search terms and videos that were getting popular. So if topics were starting to gain in popularity, you would start seeing videos that somehow tied into those search terms. It was kind of like the old days of the web, where people would just hide a massive text dump on a web page so that search engines would pick up on that page and index it, even if the page itself had nothing to do with the search term. It was just that, you know, buried somewhere on the page was that term, but the page itself had nothing to do with it.
Speaker 1: Like I said, there have been some channels on YouTube that created this kind of weird content directed at kids for years, and you might wonder why. And again, if you can monetize your content, your goal is to get as many views on as many videos as possible. You want to get that view count skyrocketing. Kids are a great audience to have that happen with, because kids tend to fixate on content. They like to experience the same thing over and over. It becomes kind of like a ritual. I'm pretty sure every parent out there is familiar with the experience of having to read the same bedtime story for the hundredth time to their kids, including doing all the voices. So for me, when I was a kid, it was The Monster at the End of This Book, and then it was Hamlet. I progressed pretty quickly. And you might think I'm joking, but I actually once found a cassette tape labeled "For Jonathan" that my dad made for me that included him reading Hamlet and doing different voices for the characters. Anyway, kids would rewatch videos, and that would drive up the watch count. That, in turn, would prompt YouTube's recommendation algorithm to promote that video to other people, the idea being that if something is gaining popularity, then with a little help it could go totally viral. And because YouTube makes most of its revenue through advertising, and advertisers will pay more for videos with more views, keeping the viral machine going is what pays the bills. Plus, it's proven to keep people on the platform. The other big goal of YouTube is to keep you there as long as it possibly can, just like any other online platform that depends heavily on ads for its revenue. Facebook does the same thing, right? The idea being, let's eliminate the desire to go anywhere else and just keep people here.
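Here's a rough sketch of that feedback loop in Python. It's an assumption-laden toy model, not YouTube's actual recommendation system; the numbers and the promotion rule are invented just to show how rewatching inflates watch counts, which drives promotion, which drives more watching.

```python
# Toy model of the loop described above -- not YouTube's real algorithm.
# All numbers and the promotion rule are invented for illustration.

def simulate_feedback_loop(initial_viewers: int, rewatches_per_kid: int, rounds: int) -> int:
    viewers = initial_viewers
    total_watches = 0
    for _ in range(rounds):
        # Kids rewatch the same video many times, inflating the watch count.
        total_watches += viewers * rewatches_per_kid
        # A heavily watched video gets promoted to a bigger audience next round.
        viewers = int(viewers * 1.5) if total_watches > 1000 else viewers
    return total_watches

# A small audience of repeat viewers snowballs quickly once promotion kicks in.
print(simulate_feedback_loop(initial_viewers=200, rewatches_per_kid=10, rounds=5))
```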
Speaker 1: So these channels started to turn out content at a crazy pace, and kids would watch and rewatch the videos, and YouTube would continue to promote them. And in twenty sixteen, so just one year after YouTube Kids launched, The Guardian, the newspaper, ran a story about one of the channels creating bizarre and sometimes disturbing content directed toward children. American news outlets would begin to cover the story in two thousand seventeen. Early on it was, like, the tech journals that were covering it. By the end of the year you had mainstream newspapers and magazines covering it, and they started listing out the dozens of channels known for blasting out odd and sometimes unsettling videos, all marketed to children, most of which were popping up on YouTube Kids, many of which you would say were not appropriate for that. The media attention cast a pretty critical light on YouTube, asking how a platform could allow this sort of content to proliferate, particularly in the parts set aside specifically to curate appropriate content for children, and YouTube didn't really have any good answers for that. Now, one answer is that YouTube has way too much content uploaded to it per minute for humans to keep track of it all. Back in two thousand seventeen, YouTube claimed that users were uploading around three hundred hours' worth of videos every single minute of the day. So let's say you're hired by YouTube to review content that's been uploaded to make sure it doesn't violate any rules. You watch a ten-minute video, and by the end of that video, three thousand hours' worth of material have been added to YouTube. There's just no way for humans to keep up with that kind of content fire hose. You could employ tens of thousands of people and you still wouldn't get through all the material that's going up every single minute of the day. So one reason this stuff was getting past YouTube is that there was just too much of it.
Speaker 1: YouTube depends heavily on users following rules, so kind of the honor system there, or on users flagging content that violates the rules, so it becomes like a self-policing community, and then YouTube employees, or more likely contract workers, can review the flagged videos and determine whether they actually violate any policies. Companies that hold a lot of intellectual property, like the various music labels and movie studios and TV studios out there, tend to be a lot more proactive in seeking out videos that violate their IP, and they are known for flagging those videos to get strikes against the channel, or they will, you know, use an option that demonetizes the video for the channel owner but directs all monetization to them, because they own the IP. But while there are search algorithms that can seek out video and audio that violates copyright, it gets way harder when you're talking about just a bunch of people in cheap costumes posing as licensed characters in an unlicensed video. That's a lot harder to detect, you know, with a regular search algorithm. YouTube did respond to the issue, however. The company moved to demonetize videos that were deemed offensive or controversial that were directed to children. It also shut down lots of channels, like around fifty channels at one point, and it turned off comments on thousands of videos, because people recognized that comments sections could include opportunities to prey upon children, like they could include sexual predators, for example, trying to reach out and lure children. So YouTube said, all right, we're going to discontinue comments sections on any videos that are determined to be directed to children, as a safety measure. However, that still wasn't enough.

Speaker 1: In twenty nineteen, the New York Attorney General's Office and the FTC would sue YouTube slash Google for violating COPPA. The lawsuit accused YouTube of collecting the personal data of children without first getting parental consent, and that this came from children-directed channels posting videos and YouTube monetizing those videos by pairing them with targeted advertising. So this wasn't necessarily stuff that was showing up on YouTube Kids, although some of it was, but rather YouTube in general. The argument was: there are channels generating videos meant for children, which means that children are overwhelmingly the audience watching them, and you haven't put any protections in place to prevent the personal data of these children from being harvested and then exploited. So Google would eventually settle out of court for a whopping one hundred seventy million dollars. Now, in the grand scheme of things for Google, anyway, that's not that much money. But I can assure you that stakeholders would much rather see that hundred seventy million dollars stay with the company rather than get siphoned off to pay fines. The settlement also required YouTube to commit to, quote, "develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform, so that YouTube can ensure it is complying with COPPA," end quote. That's from the FTC itself. And so YouTube made a renewed effort to stave off future COPPA violations. So it was in twenty nineteen and early twenty twenty that YouTube really began to push harder to force channels that had children-directed content to be compliant with COPPA. One big change was that channels that were determined to be children-directed would no longer be able to run targeted ads against their videos. Those videos wouldn't have a comment section either, and some other community features would similarly be eliminated or turned off.
Speaker 1: This would have the consequence of severely restricting ad revenue for those types of channels, again because targeted ads are so valuable compared to other types of advertising. YouTube creators would have to designate their channels as being for kids or not for kids. From a monetary standpoint, choosing "not for kids" made the most sense, right? Because that's when you could use targeted advertising. But if YouTube determined that the channel was in fact directed to children, then YouTube might override the user's choice. You can't just say this cartoon about lollipops playing games with peppermint candies isn't meant for kids just so you can have targeted ads run against your video. Now, this brings us back to Kris Straub. His video is clearly not meant for children. If you watch it, you would agree. Not that it's particularly dark or disturbing. It's a little dark, but it's not super gruesome or anything, but it's meant to play on adult sensibilities. And this is where I really have issues with YouTube's AI. I think it is terribly irresponsible to allow an automated system to determine that content is suitable for all ages. If you run it the other way, that makes sense. If AI determines that a video that has previously been set as being suitable for all ages isn't, then changing that video so it's age-restricted makes sense. The content creator can appeal the decision, but more importantly, a potentially offensive or disturbing piece of content gets removed from circulation among kids, which I think is the most important element. But to take a video that's been expressly flagged by the creator as being inappropriate for children and then flip that switch without the creator's consent, that seems irresponsible to me. Now, I can see some justification to review Straub's video, right? Like, I can see why the AI might flag it to say, someone check this out and make sure that, in fact, it should be listed as eighteen and above.
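To put the argument in concrete terms, here's a minimal sketch of the kind of policy being described: an automated system that is allowed to tighten an age restriction on its own, but never to loosen a creator-set restriction without a human looking at it first. This is an assumption about how such a system could work, not a description of YouTube's actual moderation pipeline, and all the names in it are hypothetical.

```python
# Sketch of a one-directional override policy, as argued for above.
# Not YouTube's real system; the class and function names are hypothetical.

from dataclasses import dataclass

@dataclass
class Video:
    creator_set_18_plus: bool   # what the creator chose in the content settings
    currently_18_plus: bool     # what the platform is actually enforcing

def queue_for_human_review(video: Video) -> None:
    print("Queued for manual review; creator's restriction left in place.")

def apply_auto_moderation(video: Video, classifier_thinks_kid_safe: bool) -> Video:
    if not classifier_thinks_kid_safe:
        # Tightening is always allowed: err on the side of restricting.
        video.currently_18_plus = True
    elif video.creator_set_18_plus:
        # Loosening a creator's explicit restriction is never automatic.
        queue_for_human_review(video)
    else:
        video.currently_18_plus = False
    return video

# Straub's case: creator flagged 18+, classifier thinks it's fine for kids.
v = apply_auto_moderation(Video(creator_set_18_plus=True, currently_18_plus=True), True)
print(v.currently_18_plus)  # True -- the 18+ flag survives the automated pass
```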
Speaker 1: The video does feature an appealing, cute cartoon skeleton character, you know, at least cute as far as skeletons go anyway, and early on it has some fairly jaunty music playing, so you could see how it might at first glance look like content directed to children. But if you watch it all the way through, you realize, okay, this is not meant for kids, and Straub took steps to show that. Like, he designated that video as not being for kids. So to have an automated system reverse this decision, and moreover to prevent Straub from being able to reset the age restriction, that's not a good look for YouTube. Early in twenty twenty there was a pretty big kerfuffle in multiple YouTube communities about the impact of COPPA and YouTube's new policies. Channels that covered topics like animation, video games, and toys particularly came under close scrutiny, as some of those channels might have been, you know, kind of borderline cases. Some of them were clearly directed to children; a lot of them had kids as hosts. But a lot of the channels weren't directed to children. They were clearly not meant for kids. They were expressly directed to older audiences. And it still speaks to a lot of false preconceptions held by lawmakers, who typically are, you know, a little out of touch, let's say, which is a way of saying many of them are old and don't really have a deeper appreciation for this. And those false preconceptions include stuff like, you know, thinking collectibles, video games, and animation are all expressly the domain of children. They're not, and they haven't been. In cases like animation, that's never been the case. It's never been just for kids, but that frequently is how it gets viewed. The kerfuffle on YouTube has died down a little bit since. I mean, obviously we got Kris Straub's recent incident, but, you know, as a whole topic, it kind of died down, and there are some reasons for that. We had a global pandemic, which really took over the headlines.
the headlines. Like, 573 00:35:47,480 --> 00:35:50,920 Speaker 1: the issues with YouTube and COPPA were starting to 574 00:35:50,920 --> 00:35:55,800 Speaker 1: come to a head early on, but by March, that 575 00:35:56,120 --> 00:35:59,239 Speaker 1: seemed like a quaint worry compared to other things that 576 00:35:59,280 --> 00:36:01,560 Speaker 1: were going on in the world. Plus, since then, we've 577 00:36:01,560 --> 00:36:05,920 Speaker 1: had some truly enormous events, ranging from political insurrections to 578 00:36:06,200 --> 00:36:09,560 Speaker 1: war in Eastern Europe and more. But we're still seeing 579 00:36:09,560 --> 00:36:12,120 Speaker 1: the effects of YouTube's shift. On the one hand, you 580 00:36:12,120 --> 00:36:14,919 Speaker 1: can understand why the platform is taking such a drastic step. 581 00:36:15,400 --> 00:36:20,800 Speaker 1: It's protecting itself from future litigation. It's trying to fulfill 582 00:36:21,080 --> 00:36:24,680 Speaker 1: the obligation it has to the FTC from when it made 583 00:36:24,680 --> 00:36:28,320 Speaker 1: that settlement in the COPPA case. You can also understand 584 00:36:28,560 --> 00:36:31,640 Speaker 1: this is a monumental task when you consider how much 585 00:36:31,680 --> 00:36:35,160 Speaker 1: content is being uploaded to YouTube every single minute of the day. 586 00:36:35,239 --> 00:36:38,280 Speaker 1: Now we're talking about more than five hundred hours per minute. 587 00:36:38,920 --> 00:36:41,239 Speaker 1: But you can also see how an automated system can 588 00:36:41,280 --> 00:36:44,799 Speaker 1: make changes that unfortunately can have the opposite of the 589 00:36:44,880 --> 00:36:49,120 Speaker 1: intended effect. It can actually end up flagging content that 590 00:36:49,400 --> 00:36:54,000 Speaker 1: is expressly not for kids as being suitable for kids, and 591 00:36:54,600 --> 00:36:59,120 Speaker 1: that is clearly not something that YouTube wants to promote. 592 00:36:59,680 --> 00:37:02,080 Speaker 1: That's just going to be asking for another COPPA case 593 00:37:02,160 --> 00:37:03,920 Speaker 1: down the road. And if it's something like 594 00:37:03,960 --> 00:37:07,680 Speaker 1: this, where you've got the documentation, where Straub shows I 595 00:37:07,800 --> 00:37:10,919 Speaker 1: flagged this as being inappropriate when I uploaded it, it 596 00:37:10,960 --> 00:37:14,799 Speaker 1: was never meant to be shown to kids, YouTube overruled 597 00:37:14,800 --> 00:37:18,000 Speaker 1: me and then prevented me from fixing it, that's a 598 00:37:18,080 --> 00:37:21,160 Speaker 1: really bad look for YouTube. Anyway, I hope you learned 599 00:37:21,200 --> 00:37:24,359 Speaker 1: something in this episode. Obviously we could talk a lot 600 00:37:24,440 --> 00:37:29,720 Speaker 1: more about COPPA and the unintended consequences of that legislation. Again, 601 00:37:29,760 --> 00:37:33,040 Speaker 1: I think that the legislation itself came from a good place, 602 00:37:33,640 --> 00:37:37,640 Speaker 1: but as we frequently see, when we talk about, you 603 00:37:37,680 --> 00:37:41,520 Speaker 1: know, the intent of a law and then we see what 604 00:37:41,640 --> 00:37:45,719 Speaker 1: happens when we actually enforce that law, there can be 605 00:37:45,760 --> 00:37:50,319 Speaker 1: a disconnect there. I do also think that we 606 00:37:50,320 --> 00:37:53,520 Speaker 1: should be seeing a lot more data privacy protections in 607 00:37:53,560 --> 00:37:56,120 Speaker 1: place for everyone, not just for kids.
I think for 608 00:37:56,200 --> 00:37:59,360 Speaker 1: kids it's absolutely crucial, but I would love to see 609 00:37:59,400 --> 00:38:02,680 Speaker 1: that become more of a thing for everybody. You know, 610 00:38:03,280 --> 00:38:06,960 Speaker 1: Europe has certainly made great strides toward that. In America, 611 00:38:07,080 --> 00:38:09,719 Speaker 1: we're starting to see at least some discussion around it. 612 00:38:09,800 --> 00:38:12,840 Speaker 1: I don't know that it's ever going to turn into 613 00:38:12,880 --> 00:38:18,000 Speaker 1: actionable items, but here's hoping. And that's it. That's all 614 00:38:18,000 --> 00:38:21,000 Speaker 1: I've got. So we're gonna wrap up this episode. If 615 00:38:21,040 --> 00:38:23,760 Speaker 1: you have any suggestions for future episodes of Tech Stuff, 616 00:38:24,239 --> 00:38:25,600 Speaker 1: there are a couple of different ways you can reach 617 00:38:25,640 --> 00:38:28,000 Speaker 1: out to me. One is that you can send me 618 00:38:28,120 --> 00:38:30,799 Speaker 1: a message on the I Heart Radio app. It's free 619 00:38:30,840 --> 00:38:33,680 Speaker 1: to download. You just navigate over to the Tech Stuff page. 620 00:38:34,080 --> 00:38:36,879 Speaker 1: You'll see there's a little microphone icon. If you click 621 00:38:36,920 --> 00:38:38,759 Speaker 1: on that, you can leave a voice message up to 622 00:38:38,800 --> 00:38:41,920 Speaker 1: thirty seconds in length. Send it my way. Let me 623 00:38:41,960 --> 00:38:43,440 Speaker 1: know if you want me to use the audio in 624 00:38:43,480 --> 00:38:46,200 Speaker 1: an upcoming episode, because that would be fun to do. 625 00:38:46,560 --> 00:38:48,919 Speaker 1: But I only do opt in; I don't do opt out. 626 00:38:49,760 --> 00:38:52,560 Speaker 1: And then, if that's not your style, you can also 627 00:38:52,640 --> 00:38:54,440 Speaker 1: reach out to me on Twitter. The handle for the 628 00:38:54,480 --> 00:38:57,920 Speaker 1: show is TechStuffHSW, and I'll talk to 629 00:38:57,960 --> 00:39:07,080 Speaker 1: you again really soon. Tech Stuff is an I 630 00:39:07,200 --> 00:39:10,720 Speaker 1: Heart Radio production. For more podcasts from I Heart Radio, 631 00:39:11,040 --> 00:39:14,200 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 632 00:39:14,280 --> 00:39:15,840 Speaker 1: you listen to your favorite shows.