1
00:00:00,080 --> 00:00:03,320
Speaker 1: AI deep fakes apparently getting so real that they're even

2
00:00:03,400 --> 00:00:05,720
Speaker 1: starting to trick the people they're supposed to be imitating.

3
00:00:05,800 --> 00:00:08,440
Speaker 1: Gareth Morgan is the latest target of one of these scams.

4
00:00:09,160 --> 00:00:11,400
Speaker 1: In this video, he's asking Kiwis to invest in a

5
00:00:11,440 --> 00:00:13,080
Speaker 1: dodgy US investment scheme.

6
00:00:13,280 --> 00:00:15,680
Speaker 2: Banks aren't in the business of growing your wealth. They're

7
00:00:15,680 --> 00:00:18,440
Speaker 2: in the business of growing their own. Stop playing their game.

8
00:00:18,720 --> 00:00:21,400
Speaker 2: I've teamed up with a US investment group to focus

9
00:00:21,400 --> 00:00:23,400
Speaker 2: on real opportunities in the US market.

10
00:00:23,520 --> 00:00:27,040
Speaker 1: Gareth is with us now. Hey, Gareth, how the hell are you?

11
00:00:27,200 --> 00:00:28,520
Speaker 1: It's pretty good. It's pretty good.

12
00:00:29,480 --> 00:00:32,919
Speaker 3: It's awesome. Like, I'm in the video, it's just unbelievable.

13
00:00:33,280 --> 00:00:35,680
Speaker 3: I just... I've watched it two or three times now

14
00:00:36,360 --> 00:00:38,120
Speaker 3: and I can't believe it's me.

15
00:00:38,640 --> 00:00:41,559
Speaker 1: You can't believe... well, is there... No, I haven't watched it.

16
00:00:41,600 --> 00:00:43,640
Speaker 1: Is there no giveaway? Like, there's no kind of like

17
00:00:43,760 --> 00:00:44,839
Speaker 1: clipping or anything like that.

18
00:00:44,880 --> 00:00:45,800
Speaker 2: It's just legitimately.

19
00:00:45,880 --> 00:00:49,159
Speaker 3: The only giveaway is the backdrop. I don't recognize

20
00:00:49,200 --> 00:00:54,320
Speaker 3: the house behind me.
But everything else like that,

21
00:00:54,560 --> 00:00:58,480
Speaker 3: you know, obviously, the face, the lip movement, the voice, obviously,

22
00:00:59,200 --> 00:01:00,200
Speaker 3: I mean, I can't tell.

23
00:01:00,520 --> 00:01:02,880
Speaker 1: Yeah, interesting. Have you run it past Joe or

24
00:01:02,920 --> 00:01:03,640
Speaker 1: any of the kids?

25
00:01:05,440 --> 00:01:08,160
Speaker 3: I haven't actually asked Jane what she thinks of it,

26
00:01:08,240 --> 00:01:09,320
Speaker 3: but yeah, that's a good

27
00:01:09,120 --> 00:01:12,840
Speaker 1: idea, because she's unfailingly blunt, and she'll tell you what she

28
00:01:12,920 --> 00:01:13,440
Speaker 1: really thinks.

29
00:01:13,520 --> 00:01:16,160
Speaker 3: Oh yeah, no, I know. It'd be embarrassing.

30
00:01:16,080 --> 00:01:17,440
Speaker 1: Because it was one of... was it your daughter

31
00:01:17,680 --> 00:01:18,720
Speaker 1: who spotted it first?

32
00:01:19,160 --> 00:01:21,640
Speaker 3: Yeah, one of the daughters rang up. We've got five

33
00:01:21,720 --> 00:01:24,039
Speaker 3: of them, I know. She texted me, hey, Dad,

34
00:01:24,040 --> 00:01:26,120
Speaker 3: have you seen this? But what she saw was the

35
00:01:26,200 --> 00:01:29,080
Speaker 3: written words. She didn't see the video; that came later.

36
00:01:29,720 --> 00:01:32,200
Speaker 3: So she just saw the Facebook post, and it was

37
00:01:32,240 --> 00:01:35,600
Speaker 3: an ad actually, and it's come out of the UK

38
00:01:36,520 --> 00:01:39,320
Speaker 3: when you trace it back. But you know, I've been

39
00:01:39,440 --> 00:01:42,000
Speaker 3: to Facebook a few times and asked them to take

40
00:01:42,040 --> 00:01:44,440
Speaker 3: it down, blah blah. But of course you don't get

41
00:01:44,440 --> 00:01:47,360
Speaker 3: a human being. You just get a dialogue box to

42
00:01:47,440 --> 00:01:49,360
Speaker 3: fill out and off it goes.
You don't even get

43
00:01:49,360 --> 00:01:54,960
Speaker 3: an acknowledgment that you've sent it, so you're

44
00:01:55,000 --> 00:01:56,880
Speaker 3: completely powerless. You can't do anything.

45
00:01:56,880 --> 00:01:59,280
Speaker 1: It just goes into this giant hole where every complaint

46
00:01:59,320 --> 00:02:01,480
Speaker 1: into Facebook goes, and they never do anything about it.

47
00:02:02,440 --> 00:02:02,520
Speaker 2: No.

48
00:02:02,680 --> 00:02:05,080
Speaker 3: Well, I mean, they're more powerful than countries, so an

49
00:02:05,120 --> 00:02:07,640
Speaker 3: individual's not going to be able to stop anything, Heather.

50
00:02:07,960 --> 00:02:10,440
Speaker 1: How do we get around this, Gareth? If we've got,

51
00:02:10,480 --> 00:02:12,680
Speaker 1: if we've got deep fake videos that are even convincing

52
00:02:12,720 --> 00:02:15,000
Speaker 1: the people that they are of, how do the rest

53
00:02:15,000 --> 00:02:17,000
Speaker 1: of us know what we can't trust?

54
00:02:18,600 --> 00:02:21,920
Speaker 3: Well, you know, I would have thought by now that,

55
00:02:22,000 --> 00:02:25,840
Speaker 3: with these sorts of social media, Twitter and Facebook and

56
00:02:25,880 --> 00:02:30,080
Speaker 3: that, people would realize that, you know, there's so much

57
00:02:30,960 --> 00:02:34,480
Speaker 3: false stuff on them that they're not worth bothering with.

58
00:02:34,840 --> 00:02:36,840
Speaker 3: And you know, that's what we normally do, don't we?

59
00:02:37,000 --> 00:02:39,600
Speaker 3: Self censor. We say, well, that's just crap, so we

60
00:02:39,720 --> 00:02:42,840
Speaker 3: go somewhere else.
But I think that, on the other hand,

61
00:02:42,880 --> 00:02:45,079
Speaker 3: the drawcard of having all your friends there

62
00:02:45,080 --> 00:02:47,560
Speaker 3: and being able to chat back and forth is so powerful

63
00:02:48,160 --> 00:02:50,679
Speaker 3: that people just turn a blind eye to it, so it's

64
00:02:50,720 --> 00:02:55,359
Speaker 3: not self censoring, you know. I think that's the issue.

65
00:02:55,360 --> 00:02:57,400
Speaker 3: I mean, that reminds me actually of the early days of

66
00:02:57,440 --> 00:03:01,200
Speaker 3: Trade Me, when we did Trade Me way back then,

67
00:03:01,600 --> 00:03:04,800
Speaker 3: and people started complaining, hey, this guy's taking my money

68
00:03:04,840 --> 00:03:06,720
Speaker 3: and, you know, the goods are fake and all the rest

69
00:03:06,760 --> 00:03:09,160
Speaker 3: of it. And how Trade Me got round it in

70
00:03:09,200 --> 00:03:11,520
Speaker 3: the end was having that star system, you know, where

71
00:03:11,560 --> 00:03:14,240
Speaker 3: you rank each other, the buyers and sellers, and that

72
00:03:14,360 --> 00:03:19,600
Speaker 3: is self regulating. But Facebook doesn't have that, you see,

73
00:03:19,840 --> 00:03:22,360
Speaker 3: and has nothing like that, so it just lets it work.

74
00:03:22,800 --> 00:03:24,560
Speaker 3: And you know, if you read that book by that

75
00:03:24,600 --> 00:03:28,680
Speaker 3: New Zealand lady, Wynn-Williams, that's an awesome

76
00:03:28,680 --> 00:03:31,360
Speaker 3: book, by the way. I mean, she really gave you

77
00:03:31,440 --> 00:03:34,600
Speaker 3: a window into the culture of the higher echelons, which

78
00:03:34,639 --> 00:03:38,000
Speaker 3: is, they don't care because, you know, we're so powerful,

79
00:03:38,040 --> 00:03:40,520
Speaker 3: no one can stop us sort of thing. So it's

80
00:03:40,560 --> 00:03:43,960
Speaker 3: pretty scary. But boy, I'm impressed by the power of AI.
81
00:03:44,040 --> 00:03:44,360
Speaker 1: Isn't it.

82
00:03:44,400 --> 00:03:44,720
Speaker 3: Boran.

83
00:03:45,080 --> 00:03:47,520
Speaker 1: Yeah, Gareth, thank you very much. You look after yourself.

84
00:03:47,680 --> 00:03:50,400
Speaker 1: Thanks for the reading recommendation. It's Gareth Morgan, investment manager,

85
00:03:50,760 --> 00:03:54,040
Speaker 1: former economist and obviously dad to the Trade Me founder. For

86
00:03:54,160 --> 00:03:57,720
Speaker 1: more from Heather du Plessis-Allan Drive, listen live to Newstalk

87
00:03:57,760 --> 00:04:02,040
Speaker 1: ZB from 4pm weekdays. Follow the podcast on iHeartRadio.