1 00:00:00,880 --> 00:00:05,080 Speaker 1: Something's been bothering me recently. Tell me, why is it 2 00:00:05,120 --> 00:00:10,000 Speaker 1: that when women start businesses online and build platforms and 3 00:00:10,039 --> 00:00:17,520 Speaker 1: become influencers and have branded deals and branded content and sponsorships, 4 00:00:18,160 --> 00:00:22,800 Speaker 1: why are those women criticized and derided and looked 5 00:00:22,800 --> 00:00:23,319 Speaker 1: down upon? 6 00:00:23,920 --> 00:00:25,800 Speaker 2: Why is it not seen as a real job? 7 00:00:26,320 --> 00:00:28,000 Speaker 3: Why is it not seen as a real job? 8 00:00:28,280 --> 00:00:32,159 Speaker 1: It's like fake work, or it's not as dignified 9 00:00:32,200 --> 00:00:35,040 Speaker 1: as working for a company or a corporation, like 10 00:00:35,080 --> 00:00:42,640 Speaker 1: in an office setting. But when athletes and superstars 11 00:00:43,040 --> 00:00:47,480 Speaker 1: and musicians launch their own products, they're like moguls. 12 00:00:49,600 --> 00:00:53,080 Speaker 4: Why is it devalued when women do it? But when 13 00:00:53,120 --> 00:00:56,360 Speaker 4: men do it, they're titans of industry. 14 00:00:56,800 --> 00:01:00,480 Speaker 1: People are lining up around the block to get their shoe, 15 00:01:01,120 --> 00:01:04,680 Speaker 1: the latest footwear with your favorite man's name and face 16 00:01:04,720 --> 00:01:05,039 Speaker 1: on it. 17 00:01:06,280 --> 00:01:07,039 Speaker 3: What is the deal? 18 00:01:07,400 --> 00:01:11,679 Speaker 2: Locatora Radio, let's talk about it. 19 00:01:11,800 --> 00:01:14,560 Speaker 1: I think we need to start this conversation by 20 00:01:14,600 --> 00:01:19,800 Speaker 1: talking about what is an influencer? What makes someone an 21 00:01:19,840 --> 00:01:22,399 Speaker 1: influencer in this day and age? Twenty twenty five, twenty 22 00:01:22,400 --> 00:01:23,360 Speaker 1: twenty six now.
23 00:01:23,760 --> 00:01:24,440 Speaker 2: Happy New Year. 24 00:01:24,480 --> 00:01:27,840 Speaker 4: By the way, it's twenty twenty six now, and I'm 25 00:01:27,880 --> 00:01:31,040 Speaker 4: Diosa and I'm Mala and we're the hosts of Locatora 26 00:01:31,040 --> 00:01:34,320 Speaker 4: Radio, and we want to talk about influencing 27 00:01:34,480 --> 00:01:38,360 Speaker 4: and what it's like to be a woman online, the 28 00:01:38,600 --> 00:01:44,240 Speaker 4: almost stigma that influencers carry, especially if they're women, and 29 00:01:44,800 --> 00:01:51,200 Speaker 4: the nuance of the creator economy and the ethics of it. 30 00:01:51,720 --> 00:01:56,040 Speaker 4: Sometimes there is questionable influencing that happens, but 31 00:01:56,160 --> 00:02:00,200 Speaker 4: there are also real women running their businesses who 32 00:02:00,240 --> 00:02:01,760 Speaker 4: also happen to be influencers. 33 00:02:02,080 --> 00:02:05,480 Speaker 1: It's true, and I think that there are a couple 34 00:02:05,480 --> 00:02:08,040 Speaker 1: different camps here, because I think that there are people 35 00:02:08,720 --> 00:02:15,000 Speaker 1: who create online with the intention of becoming influencers and 36 00:02:15,080 --> 00:02:20,080 Speaker 1: being influencers, which is its own job category now in 37 00:02:20,200 --> 00:02:23,360 Speaker 1: twenty twenty six. Being an influencer is. 38 00:02:25,400 --> 00:02:25,840 Speaker 3: A job. 39 00:02:25,919 --> 00:02:29,400 Speaker 4: It's a career, and it is a billion-dollar industry. 40 00:02:29,480 --> 00:02:29,799 Speaker 3: It is. 41 00:02:30,240 --> 00:02:35,680 Speaker 4: So think what you want about influencers and whether it 42 00:02:36,000 --> 00:02:38,480 Speaker 4: is a real job or not, but there are real 43 00:02:38,520 --> 00:02:42,160 Speaker 4: hours going into it and it's a real economic industry.
44 00:02:42,360 --> 00:02:44,880 Speaker 2: Oh, it continues to grow and it's not going anywhere. 45 00:02:45,000 --> 00:02:50,320 Speaker 1: Real contracts, real checks, real products, real traffic, real views. 46 00:02:50,919 --> 00:02:56,000 Speaker 1: And to quote a rather famous influencer, Bethenny Frankel: she 47 00:02:56,080 --> 00:02:58,000 Speaker 1: was on Call Her Daddy not too long ago, and 48 00:02:58,160 --> 00:03:02,120 Speaker 1: I've been following Bethenny since Housewives, and I come across 49 00:03:02,160 --> 00:03:06,880 Speaker 1: Bethenny's TikToks when she's like eating her bagels with 50 00:03:07,080 --> 00:03:10,880 Speaker 1: caviar and whatever else. She's very funny, and if you 51 00:03:10,919 --> 00:03:13,280 Speaker 1: all are not familiar with Bethenny Frankel, she also is 52 00:03:13,320 --> 00:03:18,200 Speaker 1: the CEO of Skinnygirl Margaritas. So Bethenny was 53 00:03:18,240 --> 00:03:21,920 Speaker 1: talking about how even though she was on The Apprentice 54 00:03:22,240 --> 00:03:25,440 Speaker 1: and she was on Housewives, she doesn't really do TV anymore. 55 00:03:25,680 --> 00:03:31,640 Speaker 1: But that's partially because, according to her, social media is TV. 56 00:03:32,360 --> 00:03:40,320 Speaker 1: This is television: TikTok, Reels, YouTube. There are sometimes more 57 00:03:40,520 --> 00:03:46,840 Speaker 1: eyes on the digital space than there are at movies 58 00:03:47,120 --> 00:03:50,920 Speaker 1: that are being released in theaters or programs that are 59 00:03:51,240 --> 00:03:52,520 Speaker 1: on the streaming platforms. 60 00:03:52,880 --> 00:03:54,720 Speaker 3: I mean, YouTube is huge. 61 00:03:54,760 --> 00:03:58,560 Speaker 1: It's like the biggest studio in the world, and TikTok 62 00:03:58,760 --> 00:04:02,960 Speaker 1: is not far behind, and Instagram is not going anywhere.
63 00:04:03,760 --> 00:04:06,960 Speaker 1: And my thought also is that, you know, Michelle Obama 64 00:04:07,040 --> 00:04:09,600 Speaker 1: is podcasting and posting her podcasting clips. 65 00:04:10,000 --> 00:04:13,280 Speaker 4: Michelle Obama has a production company. Yeah, Higher Ground. 66 00:04:13,440 --> 00:04:16,160 Speaker 2: They create some really incredible. 67 00:04:15,600 --> 00:04:19,680 Speaker 4: Podcasts and documentaries, and she also now hosts 68 00:04:19,680 --> 00:04:20,920 Speaker 4: a podcast. 69 00:04:20,520 --> 00:04:24,560 Speaker 1: Herself. So Michelle Obama is posting her podcasting clips. 70 00:04:24,640 --> 00:04:27,240 Speaker 1: If it's good enough for Michelle, it's good enough for me. 71 00:04:27,520 --> 00:04:29,960 Speaker 1: That's how I feel about it. So I think that 72 00:04:31,120 --> 00:04:38,919 Speaker 1: we kind of maybe have wandered into influencing, but never 73 00:04:39,120 --> 00:04:44,200 Speaker 1: outright set out with the goal of becoming influencers, because 74 00:04:44,200 --> 00:04:46,039 Speaker 1: that just wasn't the landscape when we started. 75 00:04:46,400 --> 00:04:48,880 Speaker 4: No, it's so different now. But I think that the 76 00:04:48,960 --> 00:04:52,760 Speaker 4: pandemic definitely forced us into it, and forced is not the right word, 77 00:04:52,800 --> 00:04:56,520 Speaker 4: but there was an opportunity for us to do branded 78 00:04:56,600 --> 00:05:00,520 Speaker 4: partnerships because so much was moving to the online 79 00:05:00,960 --> 00:05:03,960 Speaker 4: space by way of the pandemic, and so we were 80 00:05:03,960 --> 00:05:09,479 Speaker 4: doing a lot of, I guess, influencer campaigns, branded campaigns.
81 00:05:09,520 --> 00:05:12,880 Speaker 4: We worked with HBO Max and Warner Brothers, and there 82 00:05:12,960 --> 00:05:17,680 Speaker 4: were incredible opportunities, and it was our first time really exploring and 83 00:05:17,800 --> 00:05:20,560 Speaker 4: venturing into that realm. And then I think when we 84 00:05:20,600 --> 00:05:24,320 Speaker 4: signed with the network, we started doing less and. 85 00:05:24,360 --> 00:05:25,040 Speaker 2: Less of that. 86 00:05:25,760 --> 00:05:27,840 Speaker 4: So I don't think it was a conscious like, oh, 87 00:05:27,960 --> 00:05:30,080 Speaker 4: we don't want to be influencers. It's just the work 88 00:05:30,080 --> 00:05:36,000 Speaker 4: shifted and it pivoted, and running multiple shows also had 89 00:05:36,120 --> 00:05:40,640 Speaker 4: us pivot for very intentional reasons. And so I think 90 00:05:40,760 --> 00:05:44,400 Speaker 4: more what we're questioning in this episode today: we know 91 00:05:44,640 --> 00:05:49,600 Speaker 4: that there are a lot of women who are influencers. 92 00:05:49,800 --> 00:05:52,800 Speaker 2: And whenever there is a. 93 00:05:53,480 --> 00:05:59,040 Speaker 4: Job, a career that leans a certain way, meaning 94 00:05:59,040 --> 00:06:03,560 Speaker 4: more women make up that role, there's a different 95 00:06:03,640 --> 00:06:07,479 Speaker 4: level of scrutiny, and there's a different level of how 96 00:06:07,520 --> 00:06:11,120 Speaker 4: we view said work. It may be seen as frivolous or 97 00:06:11,160 --> 00:06:15,400 Speaker 4: less important or not a real profession, and 98 00:06:15,680 --> 00:06:19,480 Speaker 4: there are even pay disparities. And so just thinking, with all 99 00:06:19,520 --> 00:06:23,040 Speaker 4: that context in mind, why is there so much hate 100 00:06:23,080 --> 00:06:24,000 Speaker 4: for influencers? 101 00:06:25,320 --> 00:06:27,680 Speaker 1: I think you're absolutely right.
I think 102 00:06:27,680 --> 00:06:31,240 Speaker 1: that there's a lot of deep-seated misogyny that comes 103 00:06:31,279 --> 00:06:36,760 Speaker 1: with women working for themselves and women being visible publicly, 104 00:06:37,560 --> 00:06:40,839 Speaker 1: and all kinds of women. Like, you don't have to 105 00:06:40,920 --> 00:06:45,599 Speaker 1: be a super skinny, blonde-haired, blue-eyed, A-list 106 00:06:45,680 --> 00:06:49,120 Speaker 1: movie star type to have a huge following, to have 107 00:06:49,160 --> 00:06:54,760 Speaker 1: a successful business online and to be looked at and 108 00:06:55,040 --> 00:06:58,480 Speaker 1: followed and listened to. And I think that there are 109 00:06:58,520 --> 00:07:01,239 Speaker 1: a lot of men out there who are like sitting 110 00:07:01,279 --> 00:07:04,200 Speaker 1: in the shadows in their dwelling, wherever it is they live, 111 00:07:04,440 --> 00:07:10,960 Speaker 1: in their dungeons, and they're all crusty and upset and ornery, 112 00:07:11,560 --> 00:07:15,040 Speaker 1: and they're like watching women just pump out this content 113 00:07:15,320 --> 00:07:21,680 Speaker 1: that centers themselves and that is bringing them attention and wealth. 114 00:07:21,800 --> 00:07:23,480 Speaker 1: And I think that there are men out there that 115 00:07:23,560 --> 00:07:28,400 Speaker 1: are straight up hating because they don't know how to 116 00:07:28,480 --> 00:07:31,480 Speaker 1: do the same, or would be afraid to do the same, 117 00:07:33,200 --> 00:07:37,160 Speaker 1: or would maybe not be successful if they tried. And I 118 00:07:37,240 --> 00:07:39,960 Speaker 1: think that there's a lot of like people hating from 119 00:07:40,000 --> 00:07:43,240 Speaker 1: the bench. Yeah, is what this feels like to me.
120 00:07:43,880 --> 00:07:48,640 Speaker 1: Whereas there are men who are obsessed with LeBron James, 121 00:07:49,440 --> 00:07:53,160 Speaker 1: or obsessed with whoever their favorite baseball player is, and they're 122 00:07:53,160 --> 00:07:55,840 Speaker 1: going to buy the jerseys, they're gonna buy the merch, 123 00:07:55,920 --> 00:07:59,400 Speaker 1: they're gonna buy the shoes, because they can somehow maybe 124 00:07:59,480 --> 00:08:03,080 Speaker 1: see themselves in these men more than they could see 125 00:08:03,120 --> 00:08:07,400 Speaker 1: themselves in like a startup by a woman who's launching 126 00:08:07,440 --> 00:08:11,200 Speaker 1: her business online. I think there's something for them to 127 00:08:11,320 --> 00:08:15,680 Speaker 1: cheer for with these male athletes, but then with the women, 128 00:08:15,920 --> 00:08:19,239 Speaker 1: it's something for them to look down on and to hate, 129 00:08:19,280 --> 00:08:20,920 Speaker 1: and in a way it makes them feel better. 130 00:08:21,720 --> 00:08:21,960 Speaker 2: Yeah. 131 00:08:22,000 --> 00:08:25,880 Speaker 4: I mean, even just, you can do a quick scroll 132 00:08:26,000 --> 00:08:30,520 Speaker 4: or search, there will be times where a video will 133 00:08:30,520 --> 00:08:34,080 Speaker 4: go viral and it's a young woman getting her bachelor's degree, 134 00:08:34,640 --> 00:08:36,400 Speaker 4: and the video will go viral and you'll look at 135 00:08:36,440 --> 00:08:39,439 Speaker 4: the comments and it'll be a bunch of men 136 00:08:39,800 --> 00:08:44,640 Speaker 4: saying nasty, horrible things about her degree, about her, about what 137 00:08:44,720 --> 00:08:49,680 Speaker 4: she looks like, questioning her being there because of affirmative action, 138 00:08:50,320 --> 00:08:52,160 Speaker 4: which doesn't even exist anymore. 139 00:08:52,640 --> 00:08:53,240 Speaker 1: There will be.
140 00:08:53,160 --> 00:08:57,480 Speaker 4: things like that whenever a woman posts her success online. 141 00:08:57,880 --> 00:09:00,680 Speaker 1: I've seen those specifically. I've seen some really nice comments 142 00:09:00,760 --> 00:09:06,719 Speaker 1: under Latinas posting their graduation TikToks, their Reels, their photos, 143 00:09:06,960 --> 00:09:11,280 Speaker 1: and there's always this attempt to like bring this woman 144 00:09:11,360 --> 00:09:14,800 Speaker 1: down to earth or to give her a reality check, 145 00:09:15,360 --> 00:09:20,920 Speaker 1: and things like, oh, she's a tóxica with a Nissan Altima. 146 00:09:21,440 --> 00:09:24,080 Speaker 1: I bet she dates an Edgar, like she probably has 147 00:09:24,120 --> 00:09:27,680 Speaker 1: a baby daddy. Like, there's always a weird dig to 148 00:09:27,800 --> 00:09:31,439 Speaker 1: bring her down a notch when she's clearly accomplishing something 149 00:09:31,559 --> 00:09:35,160 Speaker 1: that's very important to her. Yes, and it's really awful. 150 00:09:35,559 --> 00:09:41,439 Speaker 4: Yeah, yes, there's definitely this one-sided hatred. And I 151 00:09:42,600 --> 00:09:44,960 Speaker 4: agree with what you said that it's not just men 152 00:09:45,040 --> 00:09:48,400 Speaker 4: that hate, there's also women who hate on the influencers. 153 00:09:48,880 --> 00:09:53,719 Speaker 4: I think all people can find themselves on the bench, right, 154 00:09:53,800 --> 00:09:56,800 Speaker 4: like you said, because it takes a lot of nerve, 155 00:09:56,920 --> 00:10:00,440 Speaker 4: a lot of guts, a lot of letting go of 156 00:10:00,480 --> 00:10:05,640 Speaker 4: being perceived, of being embarrassed, to film, to put yourself out there. 157 00:10:06,480 --> 00:10:09,480 Speaker 4: There is this influencer, and I would say a friend. 158 00:10:09,480 --> 00:10:10,720 Speaker 2: Now. Her name is Anissa.
159 00:10:10,760 --> 00:10:13,320 Speaker 4: She has a really big platform, and I went on 160 00:10:13,400 --> 00:10:16,920 Speaker 4: a trip to Guatemala with her two years ago now. 161 00:10:16,960 --> 00:10:19,440 Speaker 4: I went on a trip called Self-Care for Latinas 162 00:10:19,440 --> 00:10:20,319 Speaker 4: to Guatemala. 163 00:10:20,720 --> 00:10:21,960 Speaker 2: I met her on the trip. 164 00:10:22,080 --> 00:10:26,560 Speaker 4: She's a lovely person, and when we were out traveling, 165 00:10:27,240 --> 00:10:30,240 Speaker 4: the way that she would just pull her camera out, 166 00:10:30,679 --> 00:10:34,600 Speaker 4: talk to camera, film what she was doing sin pena. 167 00:10:35,040 --> 00:10:37,320 Speaker 4: I could never. I wasn't judging her or anything, but 168 00:10:37,440 --> 00:10:39,520 Speaker 4: seeing the way that she was doing it, I was like, 169 00:10:39,640 --> 00:10:43,680 Speaker 4: that is challenging, and you're doing it and you're just, 170 00:10:44,320 --> 00:10:46,240 Speaker 4: you don't care, because it's part of your work, because 171 00:10:46,280 --> 00:10:49,439 Speaker 4: it is your work, and you have to document your 172 00:10:49,480 --> 00:10:52,000 Speaker 4: trip and you're gonna make content out of it, and 173 00:10:52,400 --> 00:10:55,520 Speaker 4: that's amazing, good for you. And so I wonder if 174 00:10:55,559 --> 00:11:01,200 Speaker 4: there's also this maybe subconscious or conscious level of envy, of, 175 00:11:01,320 --> 00:11:03,640 Speaker 4: I cannot put myself out there the way this 176 00:11:03,679 --> 00:11:07,080 Speaker 4: person is doing. I don't have the audacity to be 177 00:11:07,160 --> 00:11:09,839 Speaker 4: out in the street and film myself because I care 178 00:11:09,920 --> 00:11:12,960 Speaker 4: what other people think, which is very real, right. And 179 00:11:13,040 --> 00:11:14,960 Speaker 4: so I wonder if there's also a part. 180 00:11:14,760 --> 00:11:18,960 Speaker 2: Of it too.
Don't go anywhere, Locamotives, we'll be 181 00:11:19,120 --> 00:11:20,000 Speaker 2: right back. 182 00:11:22,280 --> 00:11:25,640 Speaker 3: Again, it reminds me of the J.Lo hate. Yes, 183 00:11:25,679 --> 00:11:26,920 Speaker 3: there's so much J.Lo hate. 184 00:11:27,000 --> 00:11:29,040 Speaker 1: And it's, okay, you do it then, let me see 185 00:11:29,080 --> 00:11:29,400 Speaker 1: you do it. 186 00:11:29,520 --> 00:11:30,520 Speaker 3: Yeah, let me see you do this. 187 00:11:30,640 --> 00:11:33,240 Speaker 2: Let me see you, at fifty-four, do what she's doing. 188 00:11:33,400 --> 00:11:39,079 Speaker 1: Come on. Yeah, nobody can, right? And yeah, it's like everybody's 189 00:11:39,120 --> 00:11:39,800 Speaker 1: an influencer. 190 00:11:39,960 --> 00:11:41,480 Speaker 3: No, but not everybody can do it. 191 00:11:41,640 --> 00:11:44,240 Speaker 4: Not everyone can make money off of it. Not everyone 192 00:11:44,280 --> 00:11:48,080 Speaker 4: can sustain themselves and have it be a full-time career. 193 00:11:48,440 --> 00:11:51,800 Speaker 1: I mean, like, I couldn't do it. No, honestly, 194 00:11:51,840 --> 00:11:52,440 Speaker 1: it's too hard. 195 00:11:52,520 --> 00:11:55,120 Speaker 2: It's very hard. It's very hard. It's a lot of work. 196 00:11:55,280 --> 00:11:57,640 Speaker 2: It's many, many, many hours. 197 00:11:57,880 --> 00:11:59,120 Speaker 3: It's not a forty-hour week. 198 00:12:00,160 --> 00:12:01,640 Speaker 2: It's more. It's more.
199 00:12:01,720 --> 00:12:04,720 Speaker 4: And I think that there's, of course, the conversation that 200 00:12:04,720 --> 00:12:09,000 Speaker 4: we're having of how anytime something is democratized, when 201 00:12:09,000 --> 00:12:11,079 Speaker 4: you can do it yourself, when you can make money 202 00:12:11,120 --> 00:12:15,920 Speaker 4: off of it, like podcasting, like influencing, where there's no 203 00:12:16,000 --> 00:12:18,360 Speaker 4: middleman in some way, unlike how we think 204 00:12:18,400 --> 00:12:23,240 Speaker 4: of other careers, you have a direct connection to your audience, 205 00:12:23,600 --> 00:12:27,360 Speaker 4: and your audience can sometimes also be the consumer. So 206 00:12:27,480 --> 00:12:31,559 Speaker 4: whenever there's this democratized space, I think there's a lot 207 00:12:31,559 --> 00:12:35,559 Speaker 4: of hatred and there's a lot of classism. There's a 208 00:12:35,600 --> 00:12:39,720 Speaker 4: lot of elitism that stems from it. But there's also 209 00:12:39,760 --> 00:12:44,960 Speaker 4: this reality that people are exhausted from being online. There's 210 00:12:45,000 --> 00:12:48,000 Speaker 4: a lot of, I think, social media fatigue. There's also 211 00:12:48,040 --> 00:12:51,280 Speaker 4: a fatigue of scrolling and every single thing you see 212 00:12:51,280 --> 00:12:54,840 Speaker 4: is an ad, someone trying to sell you something. And 213 00:12:55,200 --> 00:13:00,360 Speaker 4: I definitely find myself, when I scroll, getting an ad 214 00:13:00,440 --> 00:13:02,959 Speaker 4: from an influencer that I don't follow. That's the way 215 00:13:02,960 --> 00:13:08,000 Speaker 4: the algorithms work now, and so I'm more likely to 216 00:13:08,040 --> 00:13:11,200 Speaker 4: be influenced by someone I do follow.
But now, because 217 00:13:11,200 --> 00:13:15,160 Speaker 4: we are inundated on the social media apps very intentionally, 218 00:13:15,360 --> 00:13:21,000 Speaker 4: I think that there's this fatigue and also a questioning of, 219 00:13:22,880 --> 00:13:27,240 Speaker 4: is this an ethical ad? Wondering, who is this person 220 00:13:27,360 --> 00:13:31,360 Speaker 4: even selling me this? And I mean, the big elephant 221 00:13:31,400 --> 00:13:34,240 Speaker 4: in the room is that there are a lot of 222 00:13:34,280 --> 00:13:37,520 Speaker 4: people economically struggling, and so to see a 223 00:13:37,600 --> 00:13:42,679 Speaker 4: lavish lifestyle, to see someone maybe living in excess, can 224 00:13:42,720 --> 00:13:47,160 Speaker 4: also, I think, hit a nerve and really push people's buttons. 225 00:13:49,240 --> 00:13:53,920 Speaker 1: And it's, I think, easier to blame the woman on 226 00:13:53,960 --> 00:13:58,000 Speaker 1: the screen selling you the product and posting her content 227 00:13:58,080 --> 00:14:01,520 Speaker 1: from her cute house. It's much easier to hate and 228 00:14:01,559 --> 00:14:06,520 Speaker 1: blame that woman for participating than it is to take 229 00:14:06,559 --> 00:14:10,400 Speaker 1: a step back and say, oh, right, we live in 230 00:14:10,400 --> 00:14:15,960 Speaker 1: a capitalist society. So any space that gets created for 231 00:14:16,160 --> 00:14:19,840 Speaker 1: like the purest of intentions, not that social media was 232 00:14:19,840 --> 00:14:23,120 Speaker 1: created with pure intentions necessarily, but I think it was 233 00:14:23,200 --> 00:14:26,880 Speaker 1: created to connect with friends, like first and foremost, at 234 00:14:26,880 --> 00:14:30,080 Speaker 1: the very beginning. I think it was about like following 235 00:14:30,080 --> 00:14:33,640 Speaker 1: your friends and posting your life.
But any space that 236 00:14:33,680 --> 00:14:38,360 Speaker 1: gets created and becomes popular is going to be monetized eventually. 237 00:14:38,640 --> 00:14:42,320 Speaker 4: Anything that's free is not free, naturally, and social media 238 00:14:42,320 --> 00:14:46,160 Speaker 4: platforms are technically free. But we are the product, yes, 239 00:14:46,440 --> 00:14:47,880 Speaker 4: and things are being sold to us. 240 00:14:48,040 --> 00:14:51,240 Speaker 1: Yes, we must post in order for the platforms to 241 00:14:51,280 --> 00:14:54,600 Speaker 1: make their money, and they're making tons of money off 242 00:14:54,600 --> 00:14:57,000 Speaker 1: of all of us. If you have an account and 243 00:14:57,320 --> 00:14:59,960 Speaker 1: you're on there for any amount of time, even if 244 00:15:00,080 --> 00:15:02,720 Speaker 1: you're not posting, even if you're not an influencer, even 245 00:15:02,720 --> 00:15:05,520 Speaker 1: if you're not commenting, if you're scrolling, if you're watching, 246 00:15:05,600 --> 00:15:07,120 Speaker 1: you're making money for somebody. 247 00:15:07,640 --> 00:15:08,360 Speaker 3: You just are. 248 00:15:08,880 --> 00:15:11,720 Speaker 1: And if you're a little grimy, grungy man in your 249 00:15:11,760 --> 00:15:16,920 Speaker 1: cave hating on the ladies who are growing their platforms, 250 00:15:17,160 --> 00:15:20,360 Speaker 1: you are also making money for somebody else. So I 251 00:15:20,400 --> 00:15:26,000 Speaker 1: think some of the anger here is like, oh, I 252 00:15:26,080 --> 00:15:30,160 Speaker 1: have to witness ads, but also you are subjecting yourself 253 00:15:30,400 --> 00:15:34,640 Speaker 1: to the ads, and you're participating in the consumerism even 254 00:15:34,680 --> 00:15:37,480 Speaker 1: if you're not a content creator yourself. You know, just 255 00:15:37,520 --> 00:15:40,200 Speaker 1: by being on the app at all, you are making 256 00:15:40,240 --> 00:15:43,840 Speaker 1: money for somebody.
I think that there's just, of course, 257 00:15:44,200 --> 00:15:48,240 Speaker 1: in all situations like this, a big-picture issue. 258 00:15:48,600 --> 00:15:52,080 Speaker 4: Yeah. You know, there's also like the dark side of 259 00:15:52,520 --> 00:15:58,800 Speaker 4: influencing. It's not just women running their businesses, 260 00:15:59,200 --> 00:16:04,280 Speaker 4: empowering their community or their audience or sharing really 261 00:16:04,360 --> 00:16:08,640 Speaker 4: important information. There's also the darker side, which I consider 262 00:16:08,720 --> 00:16:13,840 Speaker 4: to be the red-pill influencers, right, 263 00:16:14,000 --> 00:16:19,600 Speaker 4: the manosphere, the Joe Rogan crowd, the Andrew Tate followers. 264 00:16:19,200 --> 00:16:21,040 Speaker 3: Right, the white supremacists, who are. 265 00:16:21,000 --> 00:16:26,520 Speaker 4: Very much being radicalized online. And that is a big 266 00:16:26,560 --> 00:16:30,160 Speaker 4: part, I think, of the darker side of influencing and 267 00:16:30,200 --> 00:16:33,160 Speaker 4: being on these social media platforms: that there is. 268 00:16:33,120 --> 00:16:35,720 Speaker 2: A space for all of this to live. There used 269 00:16:35,760 --> 00:16:37,320 Speaker 2: to be a time when it was just 4chan. 270 00:16:37,640 --> 00:16:41,040 Speaker 4: You had to go onto the dark web to hear 271 00:16:41,680 --> 00:16:46,240 Speaker 4: and read this type of information, and now it's on TikTok. 272 00:16:47,400 --> 00:16:52,440 Speaker 4: There's even anti-feminist content coming from women.
I 273 00:16:53,080 --> 00:16:56,480 Speaker 4: recently fell into this really dark, like, rage-bait cycle, 274 00:16:56,760 --> 00:16:59,280 Speaker 4: where this woman who I used to be friends with 275 00:17:00,160 --> 00:17:02,640 Speaker 4: had not been on my timeline for a long time, and 276 00:17:02,680 --> 00:17:04,199 Speaker 4: then all of a sudden, I saw her on my 277 00:17:04,240 --> 00:17:11,680 Speaker 4: timeline and she kept referring to feminists as crazy feminists, 278 00:17:12,200 --> 00:17:13,840 Speaker 4: and I did a deep dive. I was like, wait, 279 00:17:13,880 --> 00:17:18,560 Speaker 4: when did this happen? And she has started to create 280 00:17:18,600 --> 00:17:22,399 Speaker 4: content where she'll write things like, this is 281 00:17:22,440 --> 00:17:25,400 Speaker 4: what the crazy feminists want you to believe, and it's 282 00:17:25,480 --> 00:17:28,560 Speaker 4: like her husband lifting something heavy, as if feminism says you have to do 283 00:17:28,600 --> 00:17:31,040 Speaker 4: it all by yourself. Like, actually, no, that's not what 284 00:17:31,080 --> 00:17:36,640 Speaker 4: feminism is. But okay, so she's very much reduced what 285 00:17:36,720 --> 00:17:40,200 Speaker 4: feminism is to: women can do it all by themselves. 286 00:17:40,760 --> 00:17:41,080 Speaker 2: Women. 287 00:17:41,280 --> 00:17:43,359 Speaker 4: If you're a wife, you're not a real feminist. If 288 00:17:43,400 --> 00:17:47,840 Speaker 4: you're a mother, you're not a feminist. Crazy feminists actually 289 00:17:47,880 --> 00:17:52,359 Speaker 4: promote anti-motherhood. And so I fell into this rage 290 00:17:52,400 --> 00:17:55,480 Speaker 4: cycle where I was being rage-baited, and I didn't 291 00:17:55,520 --> 00:17:58,480 Speaker 4: realize I was being rage-baited at first, because I 292 00:17:58,520 --> 00:18:00,960 Speaker 4: know this person. This is not a random person I've 293 00:18:01,000 --> 00:18:04,439 Speaker 4: stumbled upon.
And I did comment on one video, and 294 00:18:04,480 --> 00:18:08,040 Speaker 4: I was like, would love to know what feminist theory, 295 00:18:08,640 --> 00:18:10,760 Speaker 4: writing, scholarship you're referencing. 296 00:18:10,920 --> 00:18:13,240 Speaker 2: Please cite, please cite, please let me know. I 297 00:18:13,240 --> 00:18:14,200 Speaker 2: would love to read it. 298 00:18:15,480 --> 00:18:18,639 Speaker 4: And she's like, well, you know, there's feminists like 299 00:18:18,680 --> 00:18:22,399 Speaker 4: you who understand. I was like, girl. I was like, 300 00:18:22,640 --> 00:18:27,399 Speaker 4: feminists actually support motherhood and believe mothers should have more 301 00:18:27,840 --> 00:18:28,960 Speaker 4: of a social safety net. 302 00:18:29,600 --> 00:18:31,679 Speaker 2: So what are you saying? 303 00:18:32,000 --> 00:18:34,480 Speaker 4: So that's when I realized, whether you 304 00:18:34,480 --> 00:18:37,040 Speaker 4: believe this or not, I don't know, but you're very 305 00:18:37,080 --> 00:18:41,960 Speaker 4: much creating content for, like, the tradwives, the anti-feminists, 306 00:18:42,960 --> 00:18:46,439 Speaker 4: and definitely, like, the red-pill overlap of, like, 307 00:18:46,560 --> 00:18:52,200 Speaker 4: men are the prize. And it's this other sphere 308 00:18:52,280 --> 00:18:54,560 Speaker 4: of the Internet that I have not been exposed to 309 00:18:55,280 --> 00:18:56,280 Speaker 4: but very much exists. 310 00:18:56,440 --> 00:18:59,000 Speaker 1: I think that at the top of the episode, you 311 00:18:59,040 --> 00:19:01,919 Speaker 1: know, when I think of influencers, I'm thinking of, like, 312 00:19:01,960 --> 00:19:06,000 Speaker 1: people who are generative and, like, running businesses, and it's skincare, 313 00:19:06,400 --> 00:19:10,360 Speaker 1: and it's food, and it's working out, and it's lifestyle.
314 00:19:10,760 --> 00:19:14,800 Speaker 1: But it's very true that there is an entire right-wing, 315 00:19:15,160 --> 00:19:22,840 Speaker 1: white supremacist influencer space, and they are very, very, very influential. 316 00:19:23,600 --> 00:19:27,760 Speaker 1: And I think that we do ourselves a disservice to 317 00:19:27,960 --> 00:19:34,240 Speaker 1: downplay the impact of influencers, because I think that we 318 00:19:34,320 --> 00:19:40,439 Speaker 1: can literally draw a chart and relate the power of 319 00:19:40,640 --> 00:19:44,000 Speaker 1: online influencing to the things that we've seen in the 320 00:19:44,000 --> 00:19:50,320 Speaker 1: Trump administration: how quickly ICE has mobilized and has recruited 321 00:19:51,200 --> 00:19:57,000 Speaker 1: new employees to, like, terrorize our communities and our immigrant families. 322 00:19:57,840 --> 00:20:02,560 Speaker 1: And I don't think that these forces would have moved 323 00:20:02,680 --> 00:20:06,240 Speaker 1: as quickly as they have had it not been for 324 00:20:06,680 --> 00:20:11,399 Speaker 1: influencers in the digital space who have very nefarious purposes, 325 00:20:11,480 --> 00:20:16,080 Speaker 1: who are white supremacists, who are xenophobes. So when we 326 00:20:17,000 --> 00:20:21,280 Speaker 1: downplay women in the space, we're sort of giving cover 327 00:20:22,440 --> 00:20:27,919 Speaker 1: and we're shielding ourselves from being very realistic about the 328 00:20:28,000 --> 00:20:31,560 Speaker 1: bad actors who are influencing, and influencing millions. 329 00:20:31,880 --> 00:20:33,800 Speaker 5: Don't go anywhere, Locamotives, we'll. 330 00:20:33,680 --> 00:20:34,880 Speaker 2: Be right back. 331 00:20:38,320 --> 00:20:40,480 Speaker 3: And we're back with more of our episode. 332 00:20:41,800 --> 00:20:46,480 Speaker 4: It is. Downplaying the influence of the influencers, good or bad, right, 333 00:20:46,520 --> 00:20:49,639 Speaker 4: however you want to define that.
To not take 334 00:20:49,960 --> 00:20:54,840 Speaker 4: them seriously is to deny the reality that we're living 335 00:20:54,880 --> 00:20:58,560 Speaker 4: in and our future. Influencers are not going away, good 336 00:20:58,720 --> 00:21:03,320 Speaker 4: or bad. Whether it's generative, like you said, women 337 00:21:03,400 --> 00:21:08,600 Speaker 4: running their businesses, teaching you something, entrepreneurs who are selling something. 338 00:21:09,200 --> 00:21:14,480 Speaker 4: There's also this information cycle that could be rooted 339 00:21:14,480 --> 00:21:18,119 Speaker 4: in misinformation, can be rooted in hatred, and to deny 340 00:21:18,160 --> 00:21:21,200 Speaker 4: one is to deny the other exists, is what I think. 341 00:21:21,520 --> 00:21:22,119 Speaker 3: Absolutely. 342 00:21:22,800 --> 00:21:28,399 Speaker 1: I think any time we see people really downplaying and 343 00:21:28,600 --> 00:21:35,480 Speaker 1: underestimating the power of women, the arts, kind of skating 344 00:21:35,600 --> 00:21:41,600 Speaker 1: over the impact and the influence, it's almost like opening 345 00:21:41,720 --> 00:21:51,160 Speaker 1: up floodgates for bigger, badder problems. For example: AI replacing 346 00:21:51,760 --> 00:21:55,880 Speaker 1: jobs in the arts? Okay, whatever, blah blah blah, they're 347 00:21:55,920 --> 00:22:00,600 Speaker 1: just taking away jobs from designers and illustrators. No, 348 00:22:01,160 --> 00:22:03,200 Speaker 1: if they can hit the arts, they can hit every 349 00:22:03,200 --> 00:22:07,520 Speaker 1: other sector of society. Influencing? Oh well, it's not real work. 350 00:22:07,600 --> 00:22:11,760 Speaker 1: It's just women messing around on the internet and putting 351 00:22:11,800 --> 00:22:16,040 Speaker 1: themselves on display, not a big deal.
We underestimate and 352 00:22:16,200 --> 00:22:23,480 Speaker 1: undervalue the impact, the real human influence, and then that 353 00:22:23,600 --> 00:22:28,880 Speaker 1: means that, like you said, we're devaluing the influencer and 354 00:22:29,200 --> 00:22:33,719 Speaker 1: we're not really being realistic about how much power, like, 355 00:22:33,960 --> 00:22:39,959 Speaker 1: right wing influencers have, how much reach misogynist influencers have, 356 00:22:40,240 --> 00:22:43,159 Speaker 1: and the very real damage that they can do to 357 00:22:43,280 --> 00:22:48,080 Speaker 1: society because they do have so much influence on their audiences, 358 00:22:48,920 --> 00:22:51,160 Speaker 1: and those audiences can be in the millions. I mean, 359 00:22:51,240 --> 00:22:56,960 Speaker 1: they can like organize using their digital online platforms. But 360 00:22:57,680 --> 00:23:00,760 Speaker 1: our image of the influencer is like a woman in 361 00:23:00,800 --> 00:23:04,320 Speaker 1: like a cute plush apartment selling workout gear. And so 362 00:23:04,440 --> 00:23:08,879 Speaker 1: of course we're going to downplay, downgrade, and in that way, 363 00:23:09,640 --> 00:23:13,560 Speaker 1: we're sort of like giving these right wing influencers this 364 00:23:13,680 --> 00:23:18,720 Speaker 1: kind of quiet space where they can take off. 365 00:23:19,320 --> 00:23:25,720 Speaker 4: Yes, yes, and an influencer is not a journalist. A 366 00:23:25,760 --> 00:23:30,320 Speaker 4: journalist can be an influencer, yes, right. An artist is 367 00:23:30,359 --> 00:23:34,120 Speaker 4: not always an influencer, but they can be.
But when 368 00:23:34,200 --> 00:23:41,639 Speaker 4: we downplay the role that influencers now play in our lives, 369 00:23:42,359 --> 00:23:51,200 Speaker 4: the marketplace, commerce, elections, we're really missing the power and 370 00:23:51,760 --> 00:23:57,880 Speaker 4: also some discernment that we should have not only for ourselves, 371 00:23:57,960 --> 00:24:03,000 Speaker 4: but for our community, for our parents, young people, boys, 372 00:24:03,600 --> 00:24:06,480 Speaker 4: you know. And so I think it is really 373 00:24:07,359 --> 00:24:10,880 Speaker 4: understanding the role that an influencer now has in our 374 00:24:10,920 --> 00:24:14,440 Speaker 4: society and seeing it for what it is, that there's 375 00:24:14,640 --> 00:24:18,520 Speaker 4: real impact, and the work should not 376 00:24:18,560 --> 00:24:21,119 Speaker 4: be devalued, because it is work and it is influencing 377 00:24:21,720 --> 00:24:23,960 Speaker 4: a lot of aspects of our lives that I 378 00:24:23,960 --> 00:24:25,359 Speaker 4: don't think we even realize yet. 379 00:24:25,800 --> 00:24:26,399 Speaker 3: It's true. 380 00:24:26,480 --> 00:24:28,360 Speaker 1: I think that maybe there was a period of time 381 00:24:28,359 --> 00:24:32,400 Speaker 1: in this country where you could identify like public role models, 382 00:24:32,520 --> 00:24:33,080 Speaker 1: and maybe 383 00:24:32,880 --> 00:24:33,720 Speaker 3: there was a handful. 384 00:24:34,520 --> 00:24:38,919 Speaker 1: Now there are hundreds and thousands of role models with 385 00:24:39,200 --> 00:24:43,320 Speaker 1: very, very big platforms. They're in our ears, 386 00:24:43,320 --> 00:24:45,320 Speaker 1: they're on our phones, they're on our screens. 387 00:24:45,680 --> 00:24:47,680 Speaker 3: You mentioned the youth. I mean the
388 00:24:47,680 --> 00:24:51,760 Speaker 1: kids are on YouTube, they're on TikTok, they are online, 389 00:24:51,960 --> 00:24:55,560 Speaker 1: and they are hanging on every word, every word of 390 00:24:55,600 --> 00:24:59,040 Speaker 1: the content creators, the influencers that they're following and that 391 00:24:59,119 --> 00:25:03,399 Speaker 1: they like most. They really value what these content creators 392 00:25:03,480 --> 00:25:06,800 Speaker 1: are putting out into the world, good or bad. And 393 00:25:07,160 --> 00:25:11,480 Speaker 1: I think that we've come to a place in human 394 00:25:11,520 --> 00:25:15,960 Speaker 1: existence where we've got to be realistic about the real 395 00:25:16,119 --> 00:25:22,159 Speaker 1: power of the influencer economy, of content creators, and we 396 00:25:22,640 --> 00:25:26,760 Speaker 1: do ourselves a disservice when we downplay the impact. So 397 00:25:26,840 --> 00:25:30,159 Speaker 1: the next time you're thinking about talking shit about an 398 00:25:30,240 --> 00:25:35,040 Speaker 1: influencer who is growing their platform and posting online, think 399 00:25:35,080 --> 00:25:39,560 Speaker 1: about what you're contributing to and maybe what you're ignoring. 400 00:25:40,960 --> 00:25:47,160 Speaker 4: Absolutely well, this conversation was very nourishing in a way 401 00:25:47,240 --> 00:25:51,480 Speaker 4: that I wasn't expecting. I think we originally had plans for 402 00:25:51,800 --> 00:25:53,840 Speaker 4: what this conversation was going to be, and it went in 403 00:25:53,880 --> 00:25:57,800 Speaker 4: an entirely different direction, not that different, but we went there. 404 00:25:58,000 --> 00:25:59,560 Speaker 2: I think when it comes to 405 00:26:01,280 --> 00:26:06,560 Speaker 4: examining the impact of influencing in right wing media and 406 00:26:06,600 --> 00:26:10,399 Speaker 4: the right wing digital space, and that's not where I 407 00:26:10,440 --> 00:26:12,440 Speaker 4: saw this going.
But I'm really glad that we did, 408 00:26:13,200 --> 00:26:16,159 Speaker 4: because there's so much there, and there's so much we 409 00:26:16,280 --> 00:26:17,080 Speaker 4: have to be aware of. 410 00:26:17,480 --> 00:26:18,240 Speaker 3: It's very real. 411 00:26:18,440 --> 00:26:22,680 Speaker 1: I mean, as the Trump administration continues to do its worst, 412 00:26:23,359 --> 00:26:26,879 Speaker 1: watch the digital space cultivate foot 413 00:26:26,640 --> 00:26:31,880 Speaker 3: soldiers for that cause, and watch 414 00:26:32,080 --> 00:26:39,320 Speaker 1: the very damaging negative initiatives being carried out online and 415 00:26:39,840 --> 00:26:46,200 Speaker 1: given support online, and influencers are our modern day organizers 416 00:26:46,320 --> 00:26:50,119 Speaker 1: in a lot of ways. I don't think people are 417 00:26:50,280 --> 00:26:57,320 Speaker 1: necessarily seeing how people are being moved and organized through 418 00:26:57,400 --> 00:26:58,920 Speaker 1: digital influencing spaces. 419 00:27:00,080 --> 00:27:01,840 Speaker 4: We have not talked about this on the podcast and 420 00:27:02,200 --> 00:27:05,639 Speaker 4: we still won't, but there was a very public death 421 00:27:05,800 --> 00:27:08,480 Speaker 4: that happened, yes, last year. There are a lot of 422 00:27:08,480 --> 00:27:10,680 Speaker 4: reasons we are not touching that topic and I won't 423 00:27:10,720 --> 00:27:15,360 Speaker 4: even say this person's name, but I think that, 424 00:27:16,440 --> 00:27:21,440 Speaker 4: because of that death, there is more to come 425 00:27:21,920 --> 00:27:27,960 Speaker 4: with that social circle, if you will, with the xenophobia, 426 00:27:28,640 --> 00:27:32,720 Speaker 4: with the right wing channel that it is, and they 427 00:27:32,760 --> 00:27:38,560 Speaker 4: are organizing young people right now, successfully, across campuses.
428 00:27:39,119 --> 00:27:43,800 Speaker 4: There are chapters everywhere, and so it is definitely something 429 00:27:43,840 --> 00:27:46,080 Speaker 4: to pay attention to because it is not going anywhere, 430 00:27:46,080 --> 00:27:49,840 Speaker 4: and if anything, they are more emboldened now because of 431 00:27:50,400 --> 00:27:51,600 Speaker 4: the loss of their leader. 432 00:27:52,280 --> 00:27:54,080 Speaker 3: A martyr has been made and that's all we'll say 433 00:27:54,119 --> 00:27:56,600 Speaker 3: about that. But who was an influencer? 434 00:27:56,960 --> 00:27:57,320 Speaker 2: Yes? 435 00:27:57,440 --> 00:27:59,600 Speaker 3: Wow. Well, thank y'all for listening. 436 00:28:00,640 --> 00:28:03,080 Speaker 2: Drink some water, go on a walk while listening to this. 437 00:28:03,280 --> 00:28:04,080 Speaker 3: We're gonna go outside. 438 00:28:04,119 --> 00:28:06,640 Speaker 2: Yes, we're gonna touch grass now, all right, love y'all. 439 00:28:06,720 --> 00:28:07,760 Speaker 2: We'll talk to you later. 440 00:28:08,040 --> 00:28:09,080 Speaker 3: Please eat those. 441 00:28:11,960 --> 00:28:12,440 Speaker 2: Look at 442 00:28:12,480 --> 00:28:16,119 Speaker 5: Our Radio is executive produced and hosted by me, Fiosa, 443 00:28:16,440 --> 00:28:47,760 Speaker 5: and me, Mala, also edited by me, Fiosa 444 00:28:51,240 --> 00:28:52,120 Speaker 2: Local loon