Speaker 1: And Amanda, Gem Nation, I saw something the other day that I put on my Instagram, and my children and Digital Jenna saw it and thought, oh, Mum's in on the joke, this is funny. I wasn't. I'm going to out myself as an idiot. I saw a picture. Remember the Turkish guy at the Olympics who just casually turned up, had his hand in his pocket, pretty much put down a smoke, bang, got gold? Remember that?
Speaker 2: So he just showed up.
Speaker 1: He just showed up, whatever, I'll get gold. I saw that, and above it it says, "This was five years ago. Feel old yet?" I posted that, and I put underneath it, "The news cycle moves fast." Because there's a trope at the moment where it's like, "Will Smith hits Chris Rock at the Oscars, twenty-seven years ago today." It's just a joke about how our timelines are all weird. But I fell for that. It's even got Paris 2024 behind him in the picture. Yeah. And I said to you, I thought you were joking.
Speaker 1: No, I showed you the pictures and I said, can you believe this is five years ago? So it looks like I'm in on this joke, this Instagram joke.
Speaker 2: Everyone thinking you're a genius.
Speaker 1: I'm a complete idiot. But it does show you how quickly the news cycle moves, and how it warps your perception of time. And I think COVID had a lot to do with that. There are years that we've just lost, that have sped up and slowed down at the same time.
Speaker 2: Something I completely forgot about during my holidays. I've had a bit of a social media break; I just didn't scroll through my phone as much. But one of the things that popped up: Ozzy Osbourne dead at the age of seventy-six. This came up on my Instagram a week ago and I looked at it.
Speaker 1: So you saw that?
Speaker 2: I saw it. I remember at the time googling, is Ozzy still with us? That's terrible. And sure enough, he was. And it was only when Kelly Osbourne, his daughter, put out a post saying, who does this? Who does this?
Speaker 2: I'd say that was a bot that put that out, because now we've got a situation where bots are fighting with bots. And kids, the young kids, Gen Z, they put out these shitposts. They make up stuff and they just put it up. There's a classic example of this: the Scott Morrison in Engadine Maccas thing that never happened. The guy has come out and said, I was just taking the piss, and everyone ran with it. But now people firmly believe that it actually happened. It's like the Steve Irwin effect.
Speaker 1: What's that?
Speaker 2: People believe they've seen the video of Steve Irwin meeting his demise with the stingray. That doesn't exist.
Speaker 1: Well, you know my friend Anita McGregor, who's...
Speaker 2: Well, it exists, but it hasn't been seen.
Speaker 1: We've never seen it. Anita McGregor, my friend, who is a forensic psychologist. One of her colleagues does a lot of work on memory, for when you're in the witness box. A lot of research goes into whether memories are real or not. And they did an experiment years ago where they said to somebody, do you remember when you went up in a hot air balloon?
Speaker 1: And they said, no, I never did that. Then they show them a photo from their childhood where they've doctored in a shot of them in a hot air balloon, and later on they ask, do you remember the thing? And they go, yeah, I remember that holiday with the hot air balloon. We're so open to suggestion from the visual, but these days it is so hard, because AI has made those visuals so confusing. We see stuff every day that isn't real and we don't know it. I was watching the movie JFK, the Oliver Stone film. How many years ago was that? Thirty years, something like that. And they showed some footage of an autopsy situation with him on the table. At the time, Harley and I were talking about this and saying, you can't tell if it's documentary footage or if it's movie footage, and how confusing that is, because once you see it, you think you've seen the actuality.
Speaker 1: But these days we're seeing pictures where, as we said yesterday, Donald Trump is singing at the Oscars. We believe this stuff. The imagery he's put out of Barack Obama being arrested: we see the images and we think it's real. And some of us go, how can you fall for that? But as human beings, how can we not?
Speaker 2: This is the other thing with bots arguing with bots on the internet. So you put up a post, particularly the Trump stuff, and I've noticed there are bots arguing with bots, and then humans get involved. They've just got the two different opinions of these bots, but people join in on the...
Speaker 1: They do, because it's all about leaning in. You're going to lean in when you're anxious, when you're frightened, when you're scared. So the bots are generating that fear and we lean in.
Speaker 2: That's exactly what you do.
Speaker 1: Oh, it's exhausting, isn't it? Exhausting. Meanwhile, I look like an idiot on Instagram, thanks very much.
Speaker 2: And there was no way I was involved.
Speaker 1: That's why I can't blame anyone.