Speaker 1: Welcome to the Fear and Greed Sunday edition, where we have fallen into the habit of asking each other five questions vaguely related to this week's news events. I'm Sean Aylmer, and welcome Michael Thompson.
Speaker 3: Hello Sean.
Speaker 1: And Adam Lang.
Speaker 2: Hello Sean.
Speaker 1: A story this week which was interesting for many reasons came from the man himself, Donald Trump, you may know him, President of the United States. He was watching social media, as is his wont, and saw that one of the US warships had been hit in the Middle East. He was a bit shocked. It's like, you know, I didn't know what to do. I just rang the general in charge and asked, hey, has your boat been hit? The general said everything was fine, no, the ship hadn't been hit, to which Donald Trump then said, so it must have been fake news, it was AI. Which brings me to today's five questions, all about news on social media. Are you two ready?
Speaker 3: Oh yeah, and I had no idea where you were going with any of that, so this is going to be good.
Speaker 1: Let's do this. Okay, starting with you, Adam. Yes, easy one. Have you seen a fake video or audio clip online that looked real, and if so, what gave it away? How did you know?
Speaker 2: It was Twiggy Forrest endorsing this crypto coin, and it was on Facebook, and I just looked at it and thought, no way. Actually the imagery looked pretty good, but it just was, this doesn't feel...
Speaker 3: Right at all.
Speaker 1: So it was too far-fetched.
Speaker 2: Yes, exactly, the topic was just implausible to me.
Speaker 1: Michael?
Speaker 3: Yeah, it was nothing spectacular, nothing hugely exciting, but it was on social media and it was just people walking, and there was just something wrong, something odd about it, like, that just doesn't look right.
And in the end I was looking at it and their faces were just too kind of clean, and the movement was too perfect, and maybe it was something to do with the shadows or something. And so, as you do, you go to the comments and that kind of thing, and as soon as you get down there it's AI, AI, AI, AI, everybody hammering it. But it was almost like, if you'd asked me to define exactly what it was that gave it away, there was just something indescribable that just didn't feel right. But that was probably six months ago, and I've gotten a whole lot better even in that time.
Speaker 1: Okay, second question. Have you seen something on social media and then relayed it to your partners, your kids, your friends, whatever, and halfway through the story thought, oh, that can't be true? And maybe it's because... so, I have had an instance. I was walking with Jackie during the week and telling her, relaying this theory, and she just made a single comment about why it was totally wrong, and I thought, oh yeah, good point, right. Yeah, no, that must have...
Speaker 2: Been AI. I didn't think of that.
Speaker 1: So, have you actually been taken in by it?
Speaker 3: I... this may not surprise you. I'm very cynical when it comes to what I see and read, and I've become so dull that I've actually started cross-checking all of it.
Speaker 2: Okay, ah. So I've seen a number of, you know, it follows what you want, the algorithms are so clever. So I've seen people on Instagram doing little guitar tutorials, and some of them are really good, right? And then I've seen someone who took apart this guitar tutorial and said, if you actually watch those fingers on the fretboard, they do not play those sounds at all. So it was completely AI-generated, and the generation was just someone more attractive playing the guitar. So I was, oh, I'm disappointed on two...
Speaker 1: Fronts. Okay, question number three.
Should AI videos be obliged to be labeled as AI videos, or is that just too hard?
Speaker 3: Absolutely they should.
Speaker 2: Oh my god, oh yeah, that's a hard yes for me.
Speaker 1: Okay, good, that was easy then. Yeah, well, you know, that was the dull question. Question number four. If someone wanted to fake-news one of you two, what would be the subject matter?
Speaker 3: Adam?
Speaker 1: Well, you can make it professional if you'd like.
Speaker 2: I would like to think it would be something worthy, business related and probably a bit long-winded, you know.
Speaker 1: How to put together a spreadsheet, Adam Lang style.
Speaker 3: And all of a sudden there's this really succinct kind of thirty-second guide, and you're like, nah, that's got to be AI, that sure as heck ain't real. But look, for me, I think it would probably end up being something for which we are already in the public domain, right, and so it would be something Fear and Greed related. You would have to imagine, because we are out there talking about it, we have a lot of social media videos out there where we are talking about it, that there is probably a risk that that can be done.
Speaker 1: Like, if it was me, if someone was faking me, it would be that I'd lost the Saturday edition of Fear and Greed, that the man of the people had won. I'm stirring, Michael.
Speaker 2: I was wondering, if it was Michael, would it be the single, the double or the triple "but" that was used more often?
Speaker 3: See, also, the way you guys are talking, it makes it sound like we would be passive victims of the AI fake news. Maybe I need to get a little bit proactive here and put out my own series of AI-generated fake news videos that feature me in a glorious winning position on the weekend edition, on the Saturday edition. That is actually a good point.
It's probably not the best professional move for us to be putting out fake videos.
Speaker 1: Well, this brings me to question number five. What do you think AI slash fake news means for the mainstream media? I have quite a definite response on this. I think it's happening already.
Speaker 3: Okay, shall I go?
Speaker 1: Yes, please.
Speaker 3: Look, for me, and again I suspect I'm in the minority in that I do double-check, I cross-check things, and I go to reputable sources to see if what I've seen online is actually the case. So if I see a story about something that has happened, say, in the Middle East or somewhere, or anything about a politician, because they tend to be the subject of a lot of these, I will go and Google it, and I'll go to, whether it's The New York Times or anyone that is a traditional, reputable media source, and see if the story lines up. And if it doesn't even come up on a Google News search, which kind of references all of those sites, then you can pretty safely assume it's not real. So I think that, if anything, it is putting more pressure and more importance on the content being produced by reputable news sources. I just worry that a lot of people, though, just take it at face value and don't go any further.
Speaker 1: My kids will often say, I don't know whether it's true. Like, they'll see something and say, hey Dad, is that true? And like, I think that that generation is questioning this. I think this is really good for quality news sites, is my take on that. And I mean, I'm exactly the same as you. When I see something, I go to a reputable news site and check it out. I also find, because the social media feed tells you what you want, I see a lot of stuff on Donald Trump, right, but I just don't know whether it's true or not anymore.
So anything on Donald Trump, actually, I will go to The New York Times, then I'll go to CNN, and then I'll go to Fox, and I'll try and see what they're all saying. But I don't believe my social feeds on Donald Trump anymore, because it's just too hard to know what's true. There's an echo chamber, and because of my view on Donald Trump, suddenly I'm getting all this stuff, and it's hard to get a positive story on Donald Trump. And maybe there aren't positive stories on Donald Trump, but of course there are. So yeah.
Speaker 3: I was looking the other day, it was actually at my mother-in-law's, I was helping her with her iPad and went into YouTube, and the homepage came up, and it has obviously targeted her with all of this. It was a lot of stuff based on the algorithm, and it was a lot of Donald Trump. And I actually scrolled down through it, and out of probably sixteen videos, because it was kind of two across and about eight rows of them, I don't think there was a single legitimate video. They were all claiming to be factual content, but I could see every single one of them was AI-generated, and they were all kind of claims about extraordinary illnesses that the President had, things that the President had said to other people that I know didn't happen, because we talk about it so much on the show. I was like, this is absolute nonsense. And it was frightening to see it given so much space and importance on a platform like YouTube, which is obviously where it's going to happen. Sorry, Adam, you go.
Speaker 2: No, I think it really does reaffirm that it should be labeled. I think that disclosures should be present. I know that's going to be very difficult territory to...
Speaker 1: And on the topic of the legacy media, not legacy media, but quality media?
Speaker 2: Oh, what does it mean for it?
It's in greater and greater peril.
Speaker 1: Yeah, so interesting, I think the opposite.
Speaker 2: I was just going to say, the UK, right, and this is really interesting. I know it's a Labour government in England, or Britain, and they are looking at enshrining what is a ten-year application process for funding the BBC to be permanent, as in it does not have to go through that process ever again, because of this very thing, that it is government funded and preserving independent journalism. And so, you know, I think were it not for mechanisms like that, or the ABC here, there is real risk, and so my hope is, of course, that great journalism survives.
Speaker 1: Yeah, I think this gives it a whole new chance to survive and thrive like never before. Anyway, thank you very much for your comments. Adam, thank you.
Speaker 2: Thank you, Sean.
Speaker 1: Thank you, Michael.
Speaker 3: Thank you very much.
Speaker 1: This is the Fear and Greed Sunday edition. Don't forget to follow us on LinkedIn or on Instagram.
Speaker 2: I'm sure they all do.
Speaker 1: Enjoy your day.