1 00:00:10,520 --> 00:00:11,680 Speaker 1: People all over the world. 2 00:00:11,880 --> 00:00:14,480 Speaker 2: They log onto OnlyFans. They think they're speaking to 3 00:00:14,600 --> 00:00:17,160 Speaker 2: a model, or I guess the proper term is a creator, 4 00:00:17,640 --> 00:00:20,040 Speaker 2: but they're actually speaking to a Filipino. 5 00:00:21,600 --> 00:00:24,480 Speaker 3: Michael Beltran is a journalist based in the Philippines, and 6 00:00:24,520 --> 00:00:26,600 Speaker 3: he's been talking to the people behind the scenes of 7 00:00:26,600 --> 00:00:29,240 Speaker 3: OnlyFans, but maybe not in the way you think. 8 00:00:29,800 --> 00:00:32,360 Speaker 2: I spoke to a single mom; this is their main 9 00:00:32,400 --> 00:00:33,199 Speaker 2: source of income. 10 00:00:33,400 --> 00:00:35,760 Speaker 1: I spoke to a 11 00:00:34,960 --> 00:00:38,360 Speaker 2: college dropout who says this is what feeds their entire family, 12 00:00:39,520 --> 00:00:42,600 Speaker 2: and to be fair, compared to a call center job, 13 00:00:43,120 --> 00:00:45,040 Speaker 2: it pays a lot better. 14 00:00:45,560 --> 00:00:47,960 Speaker 3: OnlyFans is what happens when someone figures out how 15 00:00:48,000 --> 00:00:51,560 Speaker 3: to disrupt the traditional porn industry. It's a platform with 16 00:00:51,600 --> 00:00:54,560 Speaker 3: an app where a creator can share pornographic pictures or 17 00:00:54,640 --> 00:00:58,000 Speaker 3: videos with their subscribers right from their phone, and as 18 00:00:58,000 --> 00:01:01,400 Speaker 3: a subscriber, there's an option to chat directly with the creator. 19 00:01:02,080 --> 00:01:05,240 Speaker 3: What subscribers don't realize is that a lot of times 20 00:01:05,319 --> 00:01:08,280 Speaker 3: they're not talking to a model; they're actually talking to 21 00:01:08,360 --> 00:01:11,600 Speaker 3: what's called a chatter. For example, someone like Tony.
22 00:01:11,959 --> 00:01:16,200 Speaker 2: Tony is a friend of a friend, he's around my age, 23 00:01:16,240 --> 00:01:19,839 Speaker 2: he's a dude. He used to have a regular job. 24 00:01:20,080 --> 00:01:22,440 Speaker 3: Tony became an OnlyFans chatter as a way to make 25 00:01:22,480 --> 00:01:24,920 Speaker 3: more money. The pay was a lot better than what 26 00:01:24,959 --> 00:01:27,440 Speaker 3: he could get at a call center job. Being an 27 00:01:27,480 --> 00:01:30,720 Speaker 3: OnlyFans chatter can be a really tough job for reasons 28 00:01:30,720 --> 00:01:33,399 Speaker 3: we're going to get into later. But now there's even 29 00:01:33,440 --> 00:01:34,080 Speaker 3: more pressure. 30 00:01:34,480 --> 00:01:37,480 Speaker 2: His company straight up told him that all your chats 31 00:01:37,560 --> 00:01:41,400 Speaker 2: are being fed into an AI. The company didn't say 32 00:01:41,440 --> 00:01:44,960 Speaker 2: it was to eventually replace you, but basically, we're using your 33 00:01:45,040 --> 00:01:48,840 Speaker 2: chats to feed an AI and train it for the future. 34 00:01:49,680 --> 00:01:52,960 Speaker 3: In this episode, we're getting into the hidden labor behind OnlyFans, 35 00:01:53,320 --> 00:01:56,320 Speaker 3: the outsourced workers behind the scenes who prop up the 36 00:01:56,360 --> 00:02:03,440 Speaker 3: digital sex work economy. Kaleidoscope and iHeart Podcasts. 37 00:02:04,440 --> 00:02:04,800 Speaker 4: Connect. 38 00:02:06,960 --> 00:02:08,000 Speaker 3: This is Kill Switch. 39 00:02:08,320 --> 00:02:34,720 Speaker 5: I'm Dexter Thomas. I'm Sarah. 40 00:02:36,080 --> 00:02:47,800 Speaker 4: I'm sorry, goodbye. 41 00:02:52,680 --> 00:02:54,840 Speaker 3: How did you come to start reporting on OnlyFans? 42 00:02:55,680 --> 00:02:56,160 Speaker 1: One of my 43 00:02:56,160 --> 00:02:59,160 Speaker 2: friends had a friend who was working on something like this, 44 00:02:59,840 --> 00:03:02,080 Speaker 2: and I didn't know that.
I was like, come again? 45 00:03:02,120 --> 00:03:04,799 Speaker 2: Like, I'd never heard of anything like that. 46 00:03:05,000 --> 00:03:05,280 Speaker 3: Right? 47 00:03:05,400 --> 00:03:08,120 Speaker 2: And so I started looking online, first off on Reddit. 48 00:03:08,200 --> 00:03:10,200 Speaker 2: There are a lot of communities. There are a lot of 49 00:03:10,720 --> 00:03:15,080 Speaker 2: Facebook groups and subreddits just dedicated to OnlyFans chatting, 50 00:03:15,200 --> 00:03:18,239 Speaker 2: hundreds and thousands of accounts talking about this every day. 51 00:03:19,160 --> 00:03:22,040 Speaker 2: Some chatters, they call themselves professional catfishers. 52 00:03:23,560 --> 00:03:26,240 Speaker 3: Imagine a job at a call center, but instead of 53 00:03:26,280 --> 00:03:29,400 Speaker 3: helping people troubleshoot their new toaster oven, you're taking 54 00:03:29,440 --> 00:03:31,520 Speaker 3: requests for nudes via DM. 55 00:03:31,760 --> 00:03:36,080 Speaker 2: Each chatter is employed by an OnlyFans chatting company. 56 00:03:37,080 --> 00:03:39,880 Speaker 2: So a chatter, just like any other job, they have 57 00:03:39,920 --> 00:03:43,800 Speaker 2: an eight-hour shift, they clock in, and so throughout 58 00:03:43,800 --> 00:03:47,200 Speaker 2: their eight-hour shift they're handling maybe, on average, I'd 59 00:03:47,200 --> 00:03:50,720 Speaker 2: say, five to eight different models at a time. Each 60 00:03:50,760 --> 00:03:53,920 Speaker 2: model has a different personality, a different brand. Maybe one 61 00:03:54,040 --> 00:03:56,840 Speaker 2: is the girl next door, another is like a model, 62 00:03:56,960 --> 00:04:01,760 Speaker 2: another is like a very sort of conservative kind of girl. 63 00:04:02,000 --> 00:04:04,720 Speaker 2: People like a lot of different things on OnlyFans. 64 00:04:04,560 --> 00:04:08,360 Speaker 3: Right, people do like a lot of different things.
Yeah, 65 00:04:08,480 --> 00:04:09,320 Speaker 3: that's an understatement. 66 00:04:09,800 --> 00:04:13,320 Speaker 2: So they have to switch between these different personalities throughout 67 00:04:13,360 --> 00:04:17,760 Speaker 2: their shifts, and each model for that shift has 68 00:04:17,800 --> 00:04:20,920 Speaker 2: like a quota. So the main goal of each 69 00:04:21,000 --> 00:04:22,400 Speaker 2: chatter is to sell content. 70 00:04:23,680 --> 00:04:26,200 Speaker 3: Let me get this straight. So basically an OnlyFans model 71 00:04:26,279 --> 00:04:28,800 Speaker 3: or creator, I guess, they get big enough, they have 72 00:04:28,920 --> 00:04:32,159 Speaker 3: enough fans where they can't handle all the messages at once, 73 00:04:32,800 --> 00:04:35,640 Speaker 3: and then do they reach out to a company that 74 00:04:35,720 --> 00:04:41,440 Speaker 3: then hires people as essentially freelancers to handle the 75 00:04:41,440 --> 00:04:43,000 Speaker 3: messages for them? Is that how this works? 76 00:04:43,160 --> 00:04:44,720 Speaker 2: It could be that way, but from what I found, 77 00:04:44,800 --> 00:04:46,600 Speaker 2: it could also be that the companies are reaching out 78 00:04:46,640 --> 00:04:50,080 Speaker 2: to models and saying like, hey, you give us a 79 00:04:50,120 --> 00:04:53,120 Speaker 2: portion of your content sales, and we can give you 80 00:04:53,200 --> 00:04:57,159 Speaker 2: like ten chatters and they can handle the many people 81 00:04:57,600 --> 00:04:58,800 Speaker 2: for this amount of time. 82 00:04:59,160 --> 00:05:01,640 Speaker 3: On top of a monthly subscription, the fastest way a 83 00:05:01,720 --> 00:05:04,919 Speaker 3: creator can make money is custom content. These are the 84 00:05:05,000 --> 00:05:08,520 Speaker 3: pictures or videos that are custom-made for just one person.
85 00:05:09,000 --> 00:05:11,920 Speaker 3: A model obviously can charge a premium for that, and 86 00:05:12,000 --> 00:05:15,479 Speaker 3: so chatters are encouraged to sell as much custom content 87 00:05:15,600 --> 00:05:18,479 Speaker 3: as they can for each of the five to eight 88 00:05:18,560 --> 00:05:21,360 Speaker 3: models they might be working with at a time. This 89 00:05:21,640 --> 00:05:25,960 Speaker 3: is a lot to juggle. If you're talking about, in 90 00:05:26,000 --> 00:05:31,280 Speaker 3: a single shift, you're chatting for eight different models, what, 91 00:05:31,360 --> 00:05:33,640 Speaker 3: do you just have eight different windows open on your 92 00:05:33,640 --> 00:05:36,359 Speaker 3: computer screen, and you're just going through each one and 93 00:05:36,400 --> 00:05:38,520 Speaker 3: then trying to keep track every time? How do you 94 00:05:38,520 --> 00:05:39,080 Speaker 3: even do that? 95 00:05:39,200 --> 00:05:39,400 Speaker 1: Yeah? 96 00:05:39,400 --> 00:05:42,760 Speaker 2: Pretty much. They toggle between all of these different windows 97 00:05:42,880 --> 00:05:44,280 Speaker 2: on their interface. 98 00:05:44,000 --> 00:05:46,680 Speaker 3: And sometimes chatters don't work on their own. Sometimes they'll 99 00:05:46,680 --> 00:05:49,240 Speaker 3: group up and work together to try to get repeat 100 00:05:49,360 --> 00:05:52,479 Speaker 3: or VIP customers. They refer to these as whales, to 101 00:05:52,520 --> 00:05:55,440 Speaker 3: try to get them to spend as much money as possible. 102 00:05:55,839 --> 00:05:59,880 Speaker 2: Sometimes there are teams too, especially when there are big spenders. 103 00:06:01,440 --> 00:06:05,920 Speaker 2: They can identify which fans or subscribers are the big spenders, 104 00:06:06,000 --> 00:06:08,680 Speaker 2: and when they log on, they can group into teams 105 00:06:08,720 --> 00:06:12,040 Speaker 2: and chat with them simultaneously.
And while these teams are 106 00:06:12,080 --> 00:06:15,520 Speaker 2: chatting with these big spenders, their coaches, their managers, 107 00:06:15,560 --> 00:06:18,039 Speaker 2: are coaching them through Discord. It's kind of like an 108 00:06:18,200 --> 00:06:25,479 Speaker 2: esports team, you know, but flirting. Draw that connection for me. 109 00:06:25,920 --> 00:06:27,640 Speaker 2: You know esports. You know, they have the teams and 110 00:06:27,720 --> 00:06:30,760 Speaker 2: you can see that they're being coached by some person 111 00:06:31,080 --> 00:06:34,159 Speaker 2: in the background. They all go left or go right 112 00:06:34,560 --> 00:06:36,440 Speaker 2: or whatever. And that's the same for OnlyFans. So 113 00:06:36,520 --> 00:06:39,000 Speaker 2: they're speaking to a big spender, a team of maybe 114 00:06:39,040 --> 00:06:42,520 Speaker 2: three people speaking to one person, so the chat is 115 00:06:42,520 --> 00:06:45,520 Speaker 2: more seamless, and one person may be like, okay, get 116 00:06:45,520 --> 00:06:48,400 Speaker 2: this picture of the boobs ready, send it in like in 117 00:06:48,400 --> 00:06:50,960 Speaker 2: a few minutes, okay, and then somebody else, think of 118 00:06:51,000 --> 00:06:53,679 Speaker 2: something we need to say, and then we'll send that soon. 119 00:06:53,960 --> 00:06:57,200 Speaker 3: I'm actually starting to realize how complicated this could get, 120 00:06:57,279 --> 00:07:00,840 Speaker 3: because keeping up the illusion just sounds difficult. I mean, 121 00:07:00,880 --> 00:07:02,800 Speaker 3: if I'm on the other end, if I'm the customer, 122 00:07:03,200 --> 00:07:06,159 Speaker 3: and I think I'm sexting with somebody and I say, hey, 123 00:07:06,600 --> 00:07:08,240 Speaker 3: I want you to take off your shirt and send 124 00:07:08,279 --> 00:07:12,120 Speaker 3: me a selfie.
If it's a normal person, that's fairly 125 00:07:12,200 --> 00:07:17,680 Speaker 3: easily done, right? But the actual model, hey, if 126 00:07:17,680 --> 00:07:20,480 Speaker 3: I'm not actually talking to them, like, how does that 127 00:07:20,560 --> 00:07:23,920 Speaker 3: even work? If I'm a chatter in the Philippines, do 128 00:07:23,960 --> 00:07:26,800 Speaker 3: I then ping the model and say, ayo, this customer 129 00:07:26,840 --> 00:07:29,600 Speaker 3: on the other line, they need a picture, could you 130 00:07:29,640 --> 00:07:31,920 Speaker 3: wake up real quick? And I know it's two a.m. 131 00:07:31,960 --> 00:07:35,560 Speaker 3: for you, but emergency SOS, like, take off your shirt 132 00:07:35,600 --> 00:07:37,520 Speaker 3: and send me your picture right quick? How does this work? 133 00:07:37,680 --> 00:07:40,040 Speaker 3: Is there a library of stuff that's already banked? Like, 134 00:07:40,040 --> 00:07:40,880 Speaker 3: how does this work? 135 00:07:41,400 --> 00:07:41,600 Speaker 4: Yeah. 136 00:07:41,720 --> 00:07:44,960 Speaker 2: Actually, I saw a demo of this happen. And yes, 137 00:07:45,040 --> 00:07:48,000 Speaker 2: there is a library of pictures and content and video 138 00:07:48,160 --> 00:07:51,040 Speaker 2: that a chatter can go to when speaking to 139 00:07:51,080 --> 00:07:54,520 Speaker 2: a fan. One chatter was like speaking to this person, 140 00:07:54,560 --> 00:07:57,600 Speaker 2: and this person's like, hmm, innocently enough, you know, just 141 00:07:57,680 --> 00:07:59,720 Speaker 2: like, hey, what are you up to? And then the 142 00:07:59,800 --> 00:08:01,720 Speaker 2: chatter says, oh, I have like a bunch 143 00:08:01,720 --> 00:08:03,080 Speaker 2: of things that I can go to. I have a 144 00:08:03,120 --> 00:08:05,720 Speaker 2: picture of this model eating pizza. I have a picture 145 00:08:05,760 --> 00:08:07,760 Speaker 2: of this model getting out of the shower.
I have 146 00:08:07,800 --> 00:08:11,720 Speaker 2: a picture of this model preparing dinner, watching TV, whatever. 147 00:08:11,840 --> 00:08:14,000 Speaker 2: So you just choose from that. So she chooses the 148 00:08:14,040 --> 00:08:17,120 Speaker 2: picture with the pizza, and so, I just ordered some pizza, 149 00:08:17,160 --> 00:08:19,400 Speaker 2: I'm chilling at home. So there's a picture of the 150 00:08:19,440 --> 00:08:20,480 Speaker 2: pizza and it looks legit. 151 00:08:22,240 --> 00:08:26,240 Speaker 3: As a platform, OnlyFans had some pretty incredible numbers. Last year, 152 00:08:26,240 --> 00:08:29,200 Speaker 3: they announced there are three hundred and five million fans 153 00:08:29,320 --> 00:08:32,720 Speaker 3: using the platform and their payments set a record revenue 154 00:08:32,800 --> 00:08:38,559 Speaker 3: of six point six billion dollars. Meanwhile, for the chatters... 155 00:08:38,400 --> 00:08:40,480 Speaker 2: Most chatters getting into it right now, I'd say they 156 00:08:40,559 --> 00:08:44,400 Speaker 2: get one dollar to two dollars per hour. 157 00:08:44,720 --> 00:08:47,959 Speaker 3: One to two dollars an hour. Now, some chatters can 158 00:08:47,960 --> 00:08:50,640 Speaker 3: get a commission on top of that, for example one percent, 159 00:08:51,000 --> 00:08:54,160 Speaker 3: but not everyone gets even that. Plus, a shift can 160 00:08:54,160 --> 00:08:58,319 Speaker 3: be up to twelve hours long. This is not an 161 00:08:58,320 --> 00:09:03,400 Speaker 3: easy job. What are the customers, the clients, whatever you 162 00:09:03,440 --> 00:09:05,200 Speaker 3: want to call them, what are they asking for? 163 00:09:06,400 --> 00:09:08,480 Speaker 2: I can pull up the manual if you want. 164 00:09:09,679 --> 00:09:13,120 Speaker 3: Yes, you have the manual? There's a manual? 165 00:09:13,360 --> 00:09:14,360 Speaker 1: Yeah, there's a manual.
166 00:09:14,520 --> 00:09:18,120 Speaker 2: One common DM is like, I'm so horny for you, 167 00:09:18,240 --> 00:09:22,560 Speaker 2: here's my dick. And the chatter would say, oh my god, 168 00:09:22,600 --> 00:09:24,760 Speaker 2: I can't see the photo, can you subscribe to my 169 00:09:24,920 --> 00:09:29,640 Speaker 2: VIP so I can see? One of the extreme DMs would 170 00:09:29,679 --> 00:09:32,720 Speaker 2: be like, I want to fuck you. Then the instruction says: 171 00:09:32,800 --> 00:09:37,040 Speaker 2: attach a lingerie-focused pic, thirty-nine dollars. 172 00:09:37,520 --> 00:09:39,920 Speaker 2: One of the other DMs says, 173 00:09:39,800 --> 00:09:42,080 Speaker 1: are you, are you really the model? Or is this 174 00:09:42,200 --> 00:09:42,800 Speaker 1: like a chatter? 175 00:09:43,120 --> 00:09:45,679 Speaker 2: And the response is usually like, don't 176 00:09:45,400 --> 00:09:46,480 Speaker 1: you believe me, baby? 177 00:09:47,720 --> 00:09:51,040 Speaker 3: This can be grueling work. I mean, imagine juggling multiple 178 00:09:51,040 --> 00:09:54,640 Speaker 3: personalities for twelve hours, staying focused on what the customers 179 00:09:54,640 --> 00:09:58,400 Speaker 3: want and trying to maintain some kind of sanity while 180 00:09:58,440 --> 00:10:00,000 Speaker 3: people say whatever they want to you. 181 00:10:01,080 --> 00:10:05,880 Speaker 2: It can be physically taxing because, I mean, imagine sexting 182 00:10:05,960 --> 00:10:12,440 Speaker 2: with dozens of people using eight different personalities for eight hours. 183 00:10:12,440 --> 00:10:15,000 Speaker 2: It weighs heavy on your brain, and 184 00:10:15,080 --> 00:10:18,800 Speaker 2: psychologically speaking, I didn't consult with a psychologist on this, 185 00:10:19,880 --> 00:10:24,280 Speaker 2: but normalizing these sort of transactional interactions in a very 186 00:10:24,320 --> 00:10:29,480 Speaker 2: sexual manner can fuck you up.
I spoke to a 187 00:10:29,520 --> 00:10:33,760 Speaker 2: woman who says that it's affected her dating life because, 188 00:10:34,200 --> 00:10:37,800 Speaker 2: whenever she, you know, she meets somebody new, she starts 189 00:10:37,840 --> 00:10:40,520 Speaker 2: texting with them, you know, doesn't have to be sexual, 190 00:10:40,600 --> 00:10:44,439 Speaker 2: but the moment it starts to get a little flirtatious, 191 00:10:44,480 --> 00:10:47,559 Speaker 2: she feels like she's at work, and she has these 192 00:10:47,840 --> 00:10:51,280 Speaker 2: sort of mannerisms through texts from OnlyFans that have filtered 193 00:10:51,280 --> 00:10:54,760 Speaker 2: into her dating life, and so she can't enjoy 194 00:10:55,600 --> 00:11:00,520 Speaker 2: a playful conversation with somebody who she's attracted to. But 195 00:11:00,600 --> 00:11:02,600 Speaker 2: for a lot of the women that I spoke to, it's 196 00:11:02,679 --> 00:11:06,720 Speaker 2: really sort of driven them to resent a lot of 197 00:11:06,760 --> 00:11:12,760 Speaker 2: the heterosexual males in the world, because, according to them, 198 00:11:13,040 --> 00:11:15,760 Speaker 2: they're treated like pieces of meat for eight hours a day, 199 00:11:16,120 --> 00:11:19,640 Speaker 2: every day, and they have to like it. I think 200 00:11:19,640 --> 00:11:23,040 Speaker 2: that's also why a lot of the OnlyFans chatters' 201 00:11:23,040 --> 00:11:25,679 Speaker 2: mental health suffers a lot. It puts a strain on 202 00:11:25,720 --> 00:11:29,800 Speaker 2: some marriages.
For example, not everybody can tell their families 203 00:11:29,840 --> 00:11:33,640 Speaker 2: that it's their job to send explicit pictures and messages 204 00:11:33,679 --> 00:11:36,560 Speaker 2: on OnlyFans for eight hours a day, or tell their 205 00:11:36,600 --> 00:11:38,680 Speaker 2: families that this is what puts food on the table. 206 00:11:39,520 --> 00:11:44,640 Speaker 2: For a predominantly Catholic country like the Philippines, this can 207 00:11:44,720 --> 00:11:48,040 Speaker 2: put some strain on a lot of cultural and familial 208 00:11:48,160 --> 00:11:49,920 Speaker 2: values that people grow up with. 209 00:11:50,600 --> 00:11:53,680 Speaker 3: And for Tony, Michael's friend of a friend, things started 210 00:11:53,679 --> 00:11:56,240 Speaker 3: to get really bad when a customer said something that 211 00:11:56,440 --> 00:11:58,440 Speaker 3: reminded him of his real life. 212 00:11:58,880 --> 00:12:03,040 Speaker 2: What happened with Tony is that these regularized, or sort 213 00:12:03,040 --> 00:12:08,280 Speaker 2: of mechanical, sexual transactions day in, day out, twelve hours 214 00:12:08,280 --> 00:12:11,040 Speaker 2: a day were really getting to him. He had recently 215 00:12:11,040 --> 00:12:13,840 Speaker 2: lost his grandma. He was really close with his grandmother. 216 00:12:15,000 --> 00:12:16,920 Speaker 2: And then one of the people he was speaking with 217 00:12:17,160 --> 00:12:20,680 Speaker 2: was telling him about their grandmother, who was not going 218 00:12:20,760 --> 00:12:23,079 Speaker 2: to make it, and so they really sort of had 219 00:12:23,080 --> 00:12:27,600 Speaker 2: a really emotional moment on OnlyFans.
And 220 00:12:27,640 --> 00:12:31,720 Speaker 2: it's quite strange, because this person in America is losing 221 00:12:31,760 --> 00:12:36,040 Speaker 2: his grandmother and he's opening up, he's being vulnerable to 222 00:12:36,200 --> 00:12:39,680 Speaker 2: the model, but it's actually Tony who lost his grandmother 223 00:12:40,000 --> 00:12:40,880 Speaker 2: a few months back. 224 00:12:41,120 --> 00:12:41,640 Speaker 3: Wow. 225 00:12:41,880 --> 00:12:44,680 Speaker 2: And then Tony wasn't able to answer for a 226 00:12:44,720 --> 00:12:48,520 Speaker 2: few minutes. And then their boss, monitoring on Discord, you know, 227 00:12:48,920 --> 00:12:51,640 Speaker 2: with the quotas, the get-to-the-sale thing, pings him, and 228 00:12:51,679 --> 00:12:54,560 Speaker 2: it's like, what's taking so long? Why aren't you answering? 229 00:12:54,720 --> 00:12:57,960 Speaker 2: After that, he was just telling me, what the fuck, man, 230 00:12:58,240 --> 00:13:00,560 Speaker 2: I made you ten thousand dollars last month, I can't 231 00:13:00,559 --> 00:13:03,199 Speaker 2: take thirty minutes to grieve my loved ones? 232 00:13:05,679 --> 00:13:07,720 Speaker 3: Tony's kind of unique because he has a few years 233 00:13:07,720 --> 00:13:10,440 Speaker 3: of experience. A top-level chatter like him could earn 234 00:13:10,480 --> 00:13:12,960 Speaker 3: around ten thousand dollars a year. But if you have 235 00:13:13,040 --> 00:13:17,120 Speaker 3: less experience, you get less money. Meanwhile, the chatter companies 236 00:13:17,160 --> 00:13:20,200 Speaker 3: and the top OnlyFans creators are raking it in. But 237 00:13:20,360 --> 00:13:22,920 Speaker 3: what happens if fans find out that they're not talking 238 00:13:22,960 --> 00:13:41,680 Speaker 3: to the real model?
That's after the break. An Only 239 00:13:41,720 --> 00:13:44,600 Speaker 3: Fans chatter like Tony can make, for example, thirty dollars 240 00:13:44,600 --> 00:13:47,640 Speaker 3: for a twelve-hour shift; the companies hiring them are 241 00:13:47,679 --> 00:13:51,600 Speaker 3: making significantly more. And on top of that, they monitor 242 00:13:51,760 --> 00:13:53,240 Speaker 3: everything their chatters do. 243 00:13:53,640 --> 00:13:57,000 Speaker 2: They assess your response: to this fan, maybe you should 244 00:13:57,000 --> 00:14:00,120 Speaker 2: try it this way. You know, they're constantly trying to 245 00:14:00,160 --> 00:14:04,760 Speaker 2: improve their metrics on how to best flirt and consequently 246 00:14:04,920 --> 00:14:06,120 Speaker 2: sell more content. 247 00:14:06,400 --> 00:14:09,679 Speaker 3: And compare that to the creators themselves, to the performers 248 00:14:09,720 --> 00:14:13,040 Speaker 3: on OnlyFans who these chatters are pretending to be. Bhad 249 00:14:13,080 --> 00:14:16,200 Speaker 3: Bhabie is one of the top earners on OnlyFans. You 250 00:14:16,280 --> 00:14:19,080 Speaker 3: might remember her from a twenty sixteen Doctor Phil appearance 251 00:14:19,120 --> 00:14:21,480 Speaker 3: that went viral and turned into a meme that you 252 00:14:21,720 --> 00:14:22,440 Speaker 3: might have heard of. 253 00:14:22,920 --> 00:14:23,440 Speaker 4: Catch me outside? 254 00:14:23,800 --> 00:14:25,920 Speaker 1: How about that? Catch you outside? 255 00:14:27,320 --> 00:14:27,960 Speaker 3: What does that mean? 256 00:14:29,600 --> 00:14:30,360 Speaker 4: What I just said. 257 00:14:30,880 --> 00:14:33,760 Speaker 3: Bhad Bhabie joined OnlyFans in twenty twenty one. Within 258 00:14:33,800 --> 00:14:37,320 Speaker 3: the first six hours, she made a million dollars. Let 259 00:14:37,360 --> 00:14:39,640 Speaker 3: me just say that again.
Within the first six hours, 260 00:14:39,640 --> 00:14:42,440 Speaker 3: she made one million dollars, and over the next three 261 00:14:42,520 --> 00:14:45,480 Speaker 3: years she went on to make another fifty-six million. 262 00:14:46,120 --> 00:14:48,560 Speaker 3: There's absolutely no way she's chatting to every single person 263 00:14:48,560 --> 00:14:50,160 Speaker 3: who thinks they're chatting to her. 264 00:14:50,520 --> 00:14:53,120 Speaker 1: Yeah, I spoke to the chatter who chats for Bhad Bhabie. 265 00:14:54,080 --> 00:14:59,000 Speaker 3: Really? Yo, do people think they're chatting to her? 266 00:14:59,400 --> 00:15:00,480 Speaker 1: They do. They do. 267 00:15:00,880 --> 00:15:02,760 Speaker 3: Really, they think they're chatting to her? 268 00:15:03,160 --> 00:15:08,160 Speaker 2: I think on some level there's already a suspension of disbelief. Sure, like, 269 00:15:08,560 --> 00:15:10,800 Speaker 2: I know that I might not really be speaking to 270 00:15:11,440 --> 00:15:15,600 Speaker 2: Bhad Bhabie, but what difference does it really make to them? 271 00:15:15,760 --> 00:15:18,200 Speaker 2: There was one chatter who told me that there's one 272 00:15:18,200 --> 00:15:21,760 Speaker 2: Filipino chatter. She was chatting away to a fan, and 273 00:15:22,480 --> 00:15:24,920 Speaker 2: then suddenly she just said, excuse me, I have to 274 00:15:24,960 --> 00:15:27,720 Speaker 2: go use the CR, which is what we say for 275 00:15:27,760 --> 00:15:31,200 Speaker 2: a toilet. And so this fan was kind of confused, like, wait, 276 00:15:31,280 --> 00:15:34,360 Speaker 2: is this really you? Are you a Filipino? And then, 277 00:15:34,440 --> 00:15:38,280 Speaker 2: so basically she was outed by the fan, and later 278 00:15:38,320 --> 00:15:41,080 Speaker 2: on the fan didn't really care.
They just carried on 279 00:15:41,200 --> 00:15:44,480 Speaker 2: as if nothing happened, and to the chatter it was like, see, 280 00:15:44,480 --> 00:15:45,720 Speaker 2: it doesn't really matter to them. 281 00:15:46,200 --> 00:15:49,440 Speaker 3: Wow. And then they just kept talking. They didn't stop 282 00:15:49,520 --> 00:15:49,960 Speaker 3: the chat. 283 00:15:50,320 --> 00:15:53,920 Speaker 2: No. They were really worried, like, I'm going to lose 284 00:15:53,920 --> 00:15:56,120 Speaker 2: my job or something like that. The chatter was pretty 285 00:15:56,120 --> 00:15:59,560 Speaker 2: happy that they just kept on talking as if nothing happened. 286 00:15:59,360 --> 00:16:04,080 Speaker 3: Wow, and continuing the fantasy, I guess. 287 00:16:04,080 --> 00:16:06,080 Speaker 2: What I got from a lot of the chatters I spoke with, 288 00:16:06,600 --> 00:16:10,160 Speaker 2: I'd say eighty percent of the people who they 289 00:16:10,160 --> 00:16:12,960 Speaker 2: speak with are just looking for the content they want 290 00:16:13,000 --> 00:16:16,840 Speaker 2: to see, you know, naked people, et cetera. But a 291 00:16:16,840 --> 00:16:20,800 Speaker 2: lot of them are also just really, really lonely. This 292 00:16:20,880 --> 00:16:24,480 Speaker 2: whole industry is sort of capitalizing on what 293 00:16:24,480 --> 00:16:27,120 Speaker 2: a lot of people would call the loneliness epidemic.
But 294 00:16:27,160 --> 00:16:29,280 Speaker 2: at the same time, on the other end, which is, 295 00:16:29,280 --> 00:16:31,920 Speaker 2: I guess, the hidden, sort of unseen end of it all, 296 00:16:32,120 --> 00:16:36,200 Speaker 2: this whole industry is capitalizing on the divide between economies, 297 00:16:36,880 --> 00:16:40,560 Speaker 2: the Global South, some would say the exploited Global South, 298 00:16:42,120 --> 00:16:46,600 Speaker 2: because of the wages we get and the wages we're 299 00:16:46,760 --> 00:16:51,160 Speaker 2: supposed to appreciate, you know, even if it's part of 300 00:16:51,160 --> 00:16:56,040 Speaker 2: this whole chain of exploitation. And so, you know, with the internet 301 00:16:56,160 --> 00:16:59,920 Speaker 2: and with borderless industries, sometimes you don't have to bring 302 00:17:00,120 --> 00:17:02,280 Speaker 2: laborers into the country anymore. You just need a good 303 00:17:02,320 --> 00:17:03,120 Speaker 2: internet connection. 304 00:17:03,880 --> 00:17:07,560 Speaker 3: Chatters are critical in the economy of OnlyFans, period. One 305 00:17:07,560 --> 00:17:10,879 Speaker 3: study found that private messages drive almost seventy percent of 306 00:17:11,040 --> 00:17:13,720 Speaker 3: all OnlyFans revenue. And if you want to increase 307 00:17:13,760 --> 00:17:18,280 Speaker 3: those margins, the Philippines is a logical choice. There's infrastructure, 308 00:17:18,600 --> 00:17:22,240 Speaker 3: and for chatters, this is a job, when those can 309 00:17:22,280 --> 00:17:23,080 Speaker 3: be hard to find. 310 00:17:24,119 --> 00:17:26,639 Speaker 2: I think one of the reasons that OnlyFans chatting is 311 00:17:26,680 --> 00:17:29,119 Speaker 2: such a big deal here is that the Philippines is 312 00:17:29,160 --> 00:17:31,960 Speaker 2: a country, it's a former US colony, with a really, 313 00:17:32,000 --> 00:17:36,320 Speaker 2: really high unemployment rate. There aren't a lot of available jobs.
314 00:17:36,960 --> 00:17:41,000 Speaker 2: I think in Southeast Asia we are regularly, if not 315 00:17:41,080 --> 00:17:45,280 Speaker 2: the first, we're usually competing with Indonesia in terms of unemployment, 316 00:17:46,160 --> 00:17:48,879 Speaker 2: but in terms of percentages, we had the highest jobless 317 00:17:49,000 --> 00:17:51,479 Speaker 2: rate for a lot of the pandemic, along with the highest 318 00:17:51,480 --> 00:17:57,440 Speaker 2: inflation rates. So economically, OnlyFans chatting, despite this one 319 00:17:57,480 --> 00:18:01,440 Speaker 2: dollar, two dollars an hour rate, is still a pretty 320 00:18:01,480 --> 00:18:04,320 Speaker 2: big prospect, I mean, relatively speaking. 321 00:18:04,760 --> 00:18:07,440 Speaker 3: And there's a history here that's also relevant. 322 00:18:07,560 --> 00:18:12,080 Speaker 2: Call centers were, I guess, the original outsourcing boom here 323 00:18:12,119 --> 00:18:14,440 Speaker 2: in the Philippines. I think I was a teenager when 324 00:18:14,440 --> 00:18:17,080 Speaker 2: my older brother got a job in a call center, 325 00:18:17,119 --> 00:18:19,440 Speaker 2: and suddenly all his friends had to speak like they 326 00:18:19,440 --> 00:18:24,119 Speaker 2: were from southern California. But in terms of salaries, Only 327 00:18:24,160 --> 00:18:27,800 Speaker 2: Fans chatting is definitely a lot bigger. I did meet 328 00:18:27,840 --> 00:18:30,919 Speaker 2: one chatter; she transferred from a call center job to 329 00:18:31,080 --> 00:18:35,080 Speaker 2: an OnlyFans chatting job. It does pay, I'd say, 330 00:18:35,760 --> 00:18:39,800 Speaker 2: at least twice as much. But I think sometimes a 331 00:18:39,800 --> 00:18:43,240 Speaker 2: lot of people are sort of blindsided by it: it's such 332 00:18:43,240 --> 00:18:45,600 Speaker 2: a big salary and we're just flirting with people, it 333 00:18:45,640 --> 00:18:49,119 Speaker 2: should be easy. But it's actually a really, really taxing job.
334 00:18:49,520 --> 00:18:53,199 Speaker 2: It's competitive flirting, basically, because you're competing to keep your 335 00:18:53,240 --> 00:18:56,840 Speaker 2: quotas up and to keep your job. It says in 336 00:18:57,080 --> 00:19:01,639 Speaker 2: their contracts that they're pretty expendable. The moment that they 337 00:19:01,920 --> 00:19:04,879 Speaker 2: drop from their quotas, it's pretty easy to terminate them 338 00:19:04,880 --> 00:19:05,560 Speaker 2: from their jobs. 339 00:19:07,359 --> 00:19:10,240 Speaker 3: And in a landscape where people expect digital labor to 340 00:19:10,240 --> 00:19:12,840 Speaker 3: be cheap, it's only a matter of time before AI 341 00:19:13,000 --> 00:19:16,560 Speaker 3: enters the picture. These workers now aren't just doing invisible labor; 342 00:19:16,600 --> 00:19:19,680 Speaker 3: they're being threatened with replacement by it. Companies are 343 00:19:19,720 --> 00:19:22,760 Speaker 3: now using AI to generate messages to the fans. 344 00:19:23,240 --> 00:19:26,760 Speaker 2: Well, a human Filipino chatter maybe handles five to eight models, 345 00:19:26,800 --> 00:19:29,280 Speaker 2: but an AI can handle fifty at a time, 346 00:19:30,000 --> 00:19:33,200 Speaker 2: and you know, the AI doesn't have shifts. But a 347 00:19:33,240 --> 00:19:35,320 Speaker 2: lot of the AI companies, of course, they still say 348 00:19:35,359 --> 00:19:36,920 Speaker 2: that they're not replacing humans. 349 00:19:37,720 --> 00:19:40,760 Speaker 1: They're just, you know, making humans more efficient. 350 00:19:41,240 --> 00:19:45,080 Speaker 3: Even if they aren't replacing human chatters yet, this has 351 00:19:45,119 --> 00:19:47,840 Speaker 3: put new stress on them.
Earlier this year, the company 352 00:19:47,880 --> 00:19:50,520 Speaker 3: Tony works for informed his team that their chats were 353 00:19:50,520 --> 00:19:53,720 Speaker 3: being used to train AI and that this AI would 354 00:19:53,760 --> 00:19:56,520 Speaker 3: eventually start to replace some of them. They just told 355 00:19:56,520 --> 00:19:57,919 Speaker 3: them straight up; they didn't need to hide it. 356 00:19:58,320 --> 00:20:00,720 Speaker 2: And then on the other end, there are companies and 357 00:20:00,800 --> 00:20:03,160 Speaker 2: chatters who are saying that AI could never do my job. 358 00:20:04,080 --> 00:20:07,600 Speaker 2: It's better to foster these genuine connections with fans, and 359 00:20:07,640 --> 00:20:09,080 Speaker 2: that's actually more profitable. 360 00:20:09,440 --> 00:20:13,040 Speaker 3: Genuine connections, when they're not genuinely speaking to the model 361 00:20:13,040 --> 00:20:15,879 Speaker 3: they think they're speaking to. That's a wild statement. Okay, 362 00:20:16,160 --> 00:20:19,760 Speaker 3: genuine connections, all right. And AI's role in OnlyFans 363 00:20:19,960 --> 00:20:23,480 Speaker 3: isn't just limited to the messages, it's also creating photos 364 00:20:23,480 --> 00:20:24,200 Speaker 3: and videos. 365 00:20:24,600 --> 00:20:27,639 Speaker 2: Custom content is like a big deal on OnlyFans. 366 00:20:27,800 --> 00:20:29,800 Speaker 2: It makes a lot of money and it drives a 367 00:20:29,840 --> 00:20:33,639 Speaker 2: lot of the sales. You can't really mass produce custom 368 00:20:33,680 --> 00:20:34,840 Speaker 2: content because it's custom. 369 00:20:35,720 --> 00:20:37,440 Speaker 1: But with these new AI tools, what they 370 00:20:37,440 --> 00:20:40,240 Speaker 2: can do is they promise to make them indistinguishable from 371 00:20:40,240 --> 00:20:43,680 Speaker 2: a professional photo shoot. 
And at the same time, because 372 00:20:43,680 --> 00:20:46,760 Speaker 2: it's AI, there's a world of possibility on what custom 373 00:20:46,880 --> 00:20:49,919 Speaker 2: can be. They can generate new pictures of that model 374 00:20:50,119 --> 00:20:53,960 Speaker 2: doing something completely different, in a completely different scenario. But 375 00:20:54,040 --> 00:20:59,240 Speaker 2: it's definitely sort of opening a new box of available 376 00:20:59,280 --> 00:21:04,840 Speaker 2: products that can be mass-produced and sold through this platform. 377 00:21:05,280 --> 00:21:07,760 Speaker 3: The CEO of OnlyFans has said that OnlyFans as 378 00:21:07,800 --> 00:21:12,720 Speaker 3: a platform doesn't allow entirely AI-created characters, but creators 379 00:21:12,800 --> 00:21:16,360 Speaker 3: can use AI to quote unquote enhance their content, as 380 00:21:16,400 --> 00:21:19,480 Speaker 3: long as they disclose the use of that AI. That 381 00:21:19,560 --> 00:21:22,280 Speaker 3: seems really open-ended to me and also kind of 382 00:21:22,280 --> 00:21:24,440 Speaker 3: hard to enforce. But let's get back to the chatters. 383 00:21:25,040 --> 00:21:27,120 Speaker 3: If this work is so hard on the humans who 384 00:21:27,119 --> 00:21:30,520 Speaker 3: are doing it, would replacing them with AI maybe not 385 00:21:30,720 --> 00:21:34,320 Speaker 3: be such a bad thing? We'll get into that after 386 00:21:34,359 --> 00:21:50,520 Speaker 3: the break. 
After all that you've said about this, especially 387 00:21:50,560 --> 00:21:53,439 Speaker 3: what the chatters go through, right, the people who are 388 00:21:53,560 --> 00:21:56,760 Speaker 3: essentially propping up this industry, as much as I want 389 00:21:56,760 --> 00:22:01,040 Speaker 3: them to get paid, I'm starting to think maybe, in 390 00:22:01,119 --> 00:22:05,560 Speaker 3: terms of, we're just talking mental health outcomes here, maybe 391 00:22:05,560 --> 00:22:07,359 Speaker 3: this is one of those industries where, you know what, 392 00:22:07,400 --> 00:22:11,000 Speaker 3: maybe AI should do this. Maybe it should be AI 393 00:22:11,680 --> 00:22:14,240 Speaker 3: talking to the lonely dude in Iowa or whatever who 394 00:22:14,240 --> 00:22:17,879 Speaker 3: thinks he's talking to a babe, like whatever vitriol, whatever 395 00:22:17,920 --> 00:22:21,680 Speaker 3: wild stuff he's gonna be saying at somebody, potentially violent things, 396 00:22:22,080 --> 00:22:25,400 Speaker 3: potentially abusive things: say that to the robot. I don't 397 00:22:25,440 --> 00:22:27,080 Speaker 3: want you saying that to a real human being. Like 398 00:22:27,200 --> 00:22:28,720 Speaker 3: I don't want somebody in the States 399 00:22:28,720 --> 00:22:29,840 Speaker 3: having to deal with that. I don't want somebody in the 400 00:22:29,880 --> 00:22:33,400 Speaker 3: Philippines having to deal with that. Give that to the robot. 401 00:22:33,640 --> 00:22:36,280 Speaker 2: Yeah, that's true, that would be an option. But you're 402 00:22:36,280 --> 00:22:39,639 Speaker 2: still taking away jobs from an untold number of 403 00:22:40,240 --> 00:22:43,520 Speaker 2: people who have no other available jobs. 404 00:22:43,560 --> 00:22:46,600 Speaker 2: So again, when we go back to the baseline, yeah, 405 00:22:46,720 --> 00:22:49,640 Speaker 2: what available jobs are there in the Philippines? 
It's maybe 406 00:22:49,680 --> 00:22:52,920 Speaker 2: another episode, but the Philippines, over 407 00:22:53,000 --> 00:22:55,840 Speaker 2: the last hundred years, has sort of been tailor-made for sending people 408 00:22:56,000 --> 00:22:59,720 Speaker 2: to work abroad for next to nothing, and it's 409 00:22:59,760 --> 00:23:04,200 Speaker 2: now been made for accommodating a lot of the 410 00:23:04,280 --> 00:23:07,879 Speaker 2: jobs that people don't want to do in other countries. 411 00:23:08,320 --> 00:23:11,080 Speaker 3: What do the prospects look like, though, say two years 412 00:23:11,080 --> 00:23:12,959 Speaker 3: from now? Do you think this industry is still going 413 00:23:13,000 --> 00:23:14,800 Speaker 3: to exist in the Philippines, or is it all going 414 00:23:14,840 --> 00:23:15,760 Speaker 3: to be robots? 415 00:23:16,080 --> 00:23:18,680 Speaker 2: It's hard to say. There's a race for the AI technology 416 00:23:18,760 --> 00:23:23,520 Speaker 2: to improve and to distribute it to as many companies 417 00:23:23,560 --> 00:23:27,000 Speaker 2: as possible, in as many different kinds of uses as possible, 418 00:23:27,320 --> 00:23:30,360 Speaker 2: and at the same time, there's a race for accessing 419 00:23:30,480 --> 00:23:35,119 Speaker 2: and tapping more cheap labor, and then cheaper labor after that. 420 00:23:36,000 --> 00:23:38,280 Speaker 2: And so I think in two years, or in several years, 421 00:23:38,320 --> 00:23:40,840 Speaker 2: I'm thinking, yes, this will definitely still exist. I don't 422 00:23:40,880 --> 00:23:45,240 Speaker 2: think that, one, the technology can exist without humans completely, 423 00:23:45,960 --> 00:23:48,840 Speaker 2: but there's definitely going to be some displacement along the way, 424 00:23:49,600 --> 00:23:53,120 Speaker 2: and it's kind of freaky to think what kind 425 00:23:53,119 --> 00:23:57,000 Speaker 2: of interactions this will breed in the future. 
Will it 426 00:23:57,000 --> 00:24:02,040 Speaker 2: be eighty percent, you know, robot-generated responses, or 427 00:24:02,119 --> 00:24:06,520 Speaker 2: will it still be a majority human? But whatever happens, 428 00:24:06,880 --> 00:24:11,440 Speaker 2: I think that there will still be more humans, 429 00:24:11,480 --> 00:24:16,960 Speaker 2: more Filipinos, looking for any kind of job available, and there 430 00:24:17,000 --> 00:24:20,480 Speaker 2: will still be more companies from the US or from 431 00:24:20,560 --> 00:24:24,880 Speaker 2: anywhere else in the world looking to make the most 432 00:24:24,880 --> 00:24:25,160 Speaker 2: of that. 433 00:24:25,720 --> 00:24:28,000 Speaker 3: And that AI race that Michael was talking about, for 434 00:24:28,080 --> 00:24:31,320 Speaker 3: cheaper labor and faster results, is already playing out in 435 00:24:31,320 --> 00:24:32,040 Speaker 3: the Philippines. 436 00:24:32,359 --> 00:24:35,359 Speaker 2: A lot of companies are figuring out what else they 437 00:24:35,359 --> 00:24:40,040 Speaker 2: can offshore, whether it's OnlyFans chatting or building automation 438 00:24:40,200 --> 00:24:44,000 Speaker 2: tools and data labeling. I spoke to some of the 439 00:24:44,000 --> 00:24:48,400 Speaker 2: companies that are building the AI tools that are being 440 00:24:48,440 --> 00:24:50,679 Speaker 2: deployed, and will be deployed more and more, on the 441 00:24:50,720 --> 00:24:54,320 Speaker 2: OnlyFans platform, and they said, oh, actually, some Filipinos 442 00:24:54,320 --> 00:24:58,400 Speaker 2: were doing the data labeling for our tech. So that's interesting. 443 00:24:58,480 --> 00:25:01,440 Speaker 2: So it's like Filipino labor doing the chatting and also 444 00:25:01,480 --> 00:25:04,520 Speaker 2: doing the data labeling that will be used for the 445 00:25:04,560 --> 00:25:06,560 Speaker 2: future of OnlyFans chatting, which is AI. 
446 00:25:07,320 --> 00:25:09,560 Speaker 3: And this brings us to the six-point-six-billion 447 00:25:09,560 --> 00:25:13,439 Speaker 3: dollar question: does the user care if they're sexting with AI? 448 00:25:14,119 --> 00:25:15,560 Speaker 1: I think a 449 00:25:15,480 --> 00:25:17,800 Speaker 2: lot of the AI companies are banking on the fact 450 00:25:17,880 --> 00:25:21,800 Speaker 2: that it won't matter to a lot of customers whether 451 00:25:21,840 --> 00:25:25,320 Speaker 2: they're talking to an AI or an actual human or 452 00:25:25,359 --> 00:25:28,359 Speaker 2: the model. When you think about it, even if you 453 00:25:28,359 --> 00:25:30,640 Speaker 2: are actually speaking to the model, the model is sort 454 00:25:30,640 --> 00:25:35,399 Speaker 2: of inhabiting themselves as a brand. They're tailoring themselves to 455 00:25:34,960 --> 00:25:37,960 Speaker 2: this brand that they have on OnlyFans, that I'm the 456 00:25:38,280 --> 00:25:42,000 Speaker 2: schoolgirl or I'm the hot neighbor or something, you know. 457 00:25:42,960 --> 00:25:47,040 Speaker 2: And so all of that can be distilled into 458 00:25:47,119 --> 00:25:50,119 Speaker 2: either the work of a chatter or an AI. And 459 00:25:50,200 --> 00:25:53,960 Speaker 2: so what commitments are really being held? How many ways 460 00:25:54,000 --> 00:25:55,159 Speaker 2: can this conversation really go? 461 00:25:56,040 --> 00:25:59,920 Speaker 3: Well, it's already an abstraction of your actual personality, 462 00:26:00,119 --> 00:26:05,000 Speaker 3: and then an abstraction of that abstraction. Everything's a caricature 463 00:26:05,320 --> 00:26:09,640 Speaker 3: of another caricature. So I want to underline one thing 464 00:26:09,680 --> 00:26:12,639 Speaker 3: real quick. 
I don't think this is a story so 465 00:26:12,800 --> 00:26:16,240 Speaker 3: much about sex, especially if you think about sex as 466 00:26:16,240 --> 00:26:19,560 Speaker 3: something that's separated from real life or from work. And 467 00:26:19,760 --> 00:26:22,520 Speaker 3: I think the existence of OnlyFans chatters is just one 468 00:26:22,560 --> 00:26:25,880 Speaker 3: example of how Silicon Valley and all the apps, it's 469 00:26:25,920 --> 00:26:29,159 Speaker 3: not just devices, it's people. And it's not just the 470 00:26:29,160 --> 00:26:31,680 Speaker 3: people you see, like in this case the attractive people 471 00:26:31,720 --> 00:26:34,720 Speaker 3: who you pay money to see, just with less clothes on. 472 00:26:35,240 --> 00:26:37,840 Speaker 3: It's the people who you don't see that make this 473 00:26:38,160 --> 00:26:42,919 Speaker 3: entire economy work. And when that economy demands more profits, 474 00:26:44,040 --> 00:26:47,199 Speaker 3: we're about to find out what happens next. And this 475 00:26:47,359 --> 00:26:49,639 Speaker 3: is probably a good place to say that if you 476 00:26:49,680 --> 00:26:52,760 Speaker 3: want to read Michael's full article on OnlyFans, it's on 477 00:26:52,800 --> 00:26:55,560 Speaker 3: a site called rest of world dot org, which has a lot 478 00:26:55,680 --> 00:26:58,399 Speaker 3: more really great reporting on stuff just like this, and 479 00:26:58,440 --> 00:27:00,560 Speaker 3: we'll have a link to that, as always, in the 480 00:27:00,600 --> 00:27:06,680 Speaker 3: show notes. Thank you so much for listening to another 481 00:27:06,720 --> 00:27:09,480 Speaker 3: episode of kill Switch. You can email us at kill 482 00:27:09,520 --> 00:27:13,800 Speaker 3: switch at kaleidoscope dot NYC, or we're on Instagram at 483 00:27:14,040 --> 00:27:17,080 Speaker 3: kill switch pod, and wherever you're listening, you know, maybe 484 00:27:17,160 --> 00:27:19,600 Speaker 3: leave us a review. 
It helps other people find the show, 485 00:27:19,920 --> 00:27:22,800 Speaker 3: which helps us keep doing our thing. And once you've 486 00:27:22,840 --> 00:27:24,960 Speaker 3: done that, you might not have known this, but kill 487 00:27:25,000 --> 00:27:27,560 Speaker 3: Switch is now also on YouTube. So if you've ever 488 00:27:27,560 --> 00:27:30,159 Speaker 3: been wondering what those Fisker cars look like, or what the 489 00:27:30,240 --> 00:27:33,520 Speaker 3: anime VTuber avatars look like, you can see it all there. 490 00:27:33,640 --> 00:27:36,199 Speaker 3: The link for that and everything else is in the 491 00:27:36,200 --> 00:27:39,760 Speaker 3: show notes. Kill Switch is hosted by me, Dexter Thomas. 492 00:27:39,960 --> 00:27:44,240 Speaker 3: It's produced by Sena Ozaki, Darlck Potts, anle Xanderfeld, and 493 00:27:44,359 --> 00:27:47,840 Speaker 3: Julian Nutter. Our theme song is by me and Kyle Murdoch, 494 00:27:48,040 --> 00:27:51,640 Speaker 3: and Kyle also mixes the show. From Kaleidoscope, our executive 495 00:27:51,680 --> 00:27:55,880 Speaker 3: producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. 496 00:27:56,040 --> 00:27:59,960 Speaker 3: From iHeart, our executive producers are Katrina Norvell and Nikki. 497 00:28:00,800 --> 00:28:03,400 Speaker 3: And one more thing: here's just one more story from 498 00:28:03,440 --> 00:28:06,800 Speaker 3: Michael about how the OnlyFans chatter Tony prides himself 499 00:28:06,840 --> 00:28:08,320 Speaker 3: on being really good at his job. 500 00:28:08,680 --> 00:28:12,000 Speaker 2: So for Tony, he goes to parties and he introduces 501 00:28:12,080 --> 00:28:14,160 Speaker 2: himself as a professional catfisher. 502 00:28:14,400 --> 00:28:16,720 Speaker 3: He says that at parties? Really? Yeah. 
503 00:28:16,520 --> 00:28:19,919 Speaker 2: As a joke, though. But I mean, everybody knows what 504 00:28:20,080 --> 00:28:23,960 Speaker 2: his job is, and to an extent, he was even 505 00:28:24,000 --> 00:28:26,399 Speaker 2: priding himself on being really, really good at it. He 506 00:28:26,480 --> 00:28:29,919 Speaker 2: said that, I'm a man, and I know what guys 507 00:28:29,960 --> 00:28:31,840 Speaker 2: want to hear when they chat with a woman. 508 00:28:32,560 --> 00:28:35,480 Speaker 3: If you've ever seen the David Cronenberg film M. Butterfly, 509 00:28:35,680 --> 00:28:37,720 Speaker 3: or you know the play that it's based on, you know 510 00:28:37,840 --> 00:28:40,880 Speaker 3: exactly why I think what Tony just said there is 511 00:28:40,920 --> 00:28:44,880 Speaker 3: so fascinating. But this is not a literature podcast. This 512 00:28:44,960 --> 00:28:47,800 Speaker 3: is kill Switch, so we're just gonna move on. Catch 513 00:28:47,800 --> 00:28:49,040 Speaker 3: you on the next one.