Speaker 1: Jonesy and Amanda Jam. I saw something the other day. I saw a woman talking about something to do with social media, and it was such an eye-opener. We all know that our social media feeds are being curated. I think we all know that now: we're in an algorithm. If you follow puppy Instagram, you're going to get a whole lot of dog stuff, you're going to get a whole lot of blah blah blah. If you're following politics, you're going to get that. If you're following influencers, you're going to get that. But what about the comments? This woman said that she was watching a reel where a girl was saying, hey, my boyfriend's half an hour late. I'm a little bit annoyed. He's a new boyfriend. I'm a little bit annoyed. And this girl was giving half-hour updates: he's still not here, still not here. Two hours later, I think, he turned up. And so she had updated this thing saying, I've waited two hours, finally he's here. All the comments, all the comments at the top of her feed, this girl's feed, when she watched the reel, were: how rude, that's terrible, how disrespectful, red flag, et cetera. Her boyfriend watched the same reel at the same time, and she looked at the comments that were at the top of his feed, and the comments were: she needs to get a life; hey bro, you're entitled to do what you want, blah blah blah. So she presumably got the female-skewed comments, he got the male-skewed comments, and it's a very interesting thing to discuss, because that's curious.

Speaker 2: So they're sitting next to each other.

Speaker 1: And she looked at it, and she scrolled down right through all the comments. The ones that were at the top of his feed didn't even appear on hers, and vice versa. So this is kind of alarming in how you view the world and how young people view the world, because there's the topic, and yes, we're being fed different subjects by our algorithms, but the context for how we respond to them is also being manipulated.
So when we look at the comments to see, you know, what people's opinions are on this, you're not getting any perspective. You're getting a very narrow, curated perspective that fits what your algorithm thinks you are. I said this before, didn't I, that my nephew, who's in his mid-thirties, a really smart, well-read young guy from Melbourne, decided to go off social media. Because, he said, an algorithm has decided that because he's white, presumably heterosexual, which he is, but they've made those assumptions about him, therefore he's being fed a whole lot of Joe Rogan, a whole lot of Andrew Tate, a whole lot of homophobic, relatively racist, anti-female, anti-trans stuff. And he said, I don't want to do this anymore. And a whole lot of people wouldn't even know they're being fed a certain skew of the world. But then to have your comments skewed as well, as you say, the perspective with which you view, the lens with which you view the information. How is everyone feeling about this? How are we feeling, people? That's all being skewed as well.

Speaker 2: Also, it could be bots that make the comments. That's another thing. Reggie Bird, who I follow on Instagram, she called out this guy that had trash-talked her on her Instagram feed, and I looked into it. The guy has no profile picture, no followers. I'm thinking that he's a bot.

Speaker 1: Why would a bot target her?

Speaker 2: I don't know. I see a lot of women get targeted by men, Erin Molan for example. And I tell you what, none of these guys target me, because I'll just call them out. But this guy that was targeting Reggie Bird, his name is David Elliott. He said some terrible things about Reggie, and I wrote back. I said, well, I think this guy's a bot. I don't think he even exists. And if he exists, this flog has done less in his life than you've done in fifteen minutes. Because he was carrying on about it: I thought you were dead.
I thought you had your fifteen minutes of fame.

Speaker 1: He would say that stuff. But you know what, those kinds of bots also change and sway lives. I heard a podcast recently that was talking about how Saudi Arabia are trying to infiltrate the world through sport, and also now through arts and music festivals and film festivals. And Johnny Depp, they're saying, has befriended the head guy in Saudi Arabia who was responsible, allegedly, for the chopping up of that journalist, Khashoggi. And so Johnny Depp has befriended this billionaire, and he wants Johnny to be his mate, and then a whole lot of bots targeted Amber Heard, a whole lot of bots, and it's assumed and allegedly thought that the Saudi Arabian billionaire goes, let's save our mate. So this changes the world.

Speaker 2: We should do an experiment. We should get, say, for example, something that we've done on the radio that's a man-woman divide, something that divides men and women. Okay, what about that time I said I was putting carpet around the garage wall because...

Speaker 1: We got a new car. Because you didn't trust your wife to park.

Speaker 2: See, as soon as you say words like trust... What I'm doing is protecting the car.

Speaker 1: So you're suggesting we shouldn't have said it out loud. Don't tell anyone the experiment. We'll just do it and we'll see what happens.

Speaker 2: We'll get digital Jenna to post that original discussion and see what happens.

Speaker 1: Yeah, that he said, she said stuff, because the perspective is being skewed.

Speaker 2: Yeah, right on.