1 00:00:00,120 --> 00:00:02,360 Speaker 1: Now the rest of the second hour of the show 2 00:00:02,400 --> 00:00:05,360 Speaker 1: today is given entirely over to a conversation about online 3 00:00:05,480 --> 00:00:08,520 Speaker 1: dating and particularly the issue of how to avoid falling 4 00:00:08,600 --> 00:00:11,159 Speaker 1: victim to the scam artists who are out there and 5 00:00:11,240 --> 00:00:15,480 Speaker 1: unfortunately are getting ever more polished in how they manipulate 6 00:00:15,560 --> 00:00:20,000 Speaker 1: this space. I've never used online dating platforms before. They 7 00:00:20,000 --> 00:00:22,240 Speaker 1: didn't exist in the days when I was dating, and 8 00:00:22,280 --> 00:00:25,159 Speaker 1: from what I've heard, I'm quite relieved about that. But I 9 00:00:25,200 --> 00:00:27,880 Speaker 1: can tell you this, that my inbox and Wendy Knowler 10 00:00:27,960 --> 00:00:31,200 Speaker 1: our consumer journalist's inbox have seen more than their 11 00:00:31,280 --> 00:00:35,760 Speaker 1: fair share of the aftermath, shall we put it that way, 12 00:00:36,080 --> 00:00:39,360 Speaker 1: emails from victims saying can you help get money back 13 00:00:40,040 --> 00:00:42,879 Speaker 1: that has been taken by scammers. We've also seen our 14 00:00:42,920 --> 00:00:46,400 Speaker 1: fair share of emails from desperate friends and family members saying, 15 00:00:46,760 --> 00:00:49,560 Speaker 1: I am watching it happen. I can see my loved 16 00:00:49,560 --> 00:00:52,320 Speaker 1: one being drawn into a trap, and for love or money, 17 00:00:52,320 --> 00:00:54,560 Speaker 1: I cannot get them to recognize that that is what 18 00:00:54,720 --> 00:00:55,200 Speaker 1: is happening. 19 00:00:55,240 --> 00:00:55,880 Speaker 2: What do I do? 20 00:00:56,920 --> 00:00:59,040 Speaker 1: So today, we want to share a couple of perspectives 21 00:00:59,040 --> 00:01:01,840 Speaker 1: with you. One is quite humorous.
The other one is going 22 00:01:01,880 --> 00:01:04,360 Speaker 1: to take a more serious, a businesslike approach, 23 00:01:04,360 --> 00:01:06,160 Speaker 1: if we can put it that way, because we are going 24 00:01:06,200 --> 00:01:08,679 Speaker 1: to hear from someone who has had her own quite recent 25 00:01:08,720 --> 00:01:12,080 Speaker 1: experience of looking for love online and had such a 26 00:01:12,160 --> 00:01:16,080 Speaker 1: riot navigating the space that she's actually written her experiences 27 00:01:16,120 --> 00:01:18,960 Speaker 1: down and published a book called Red Flags and Roses, 28 00:01:19,640 --> 00:01:21,240 Speaker 1: and she is going to chat to us about some 29 00:01:21,280 --> 00:01:24,240 Speaker 1: of those experiences, particularly the ones that took her into 30 00:01:24,680 --> 00:01:27,399 Speaker 1: the area of the scam artists. And then a little 31 00:01:27,400 --> 00:01:29,200 Speaker 1: bit later on in this conversation, we'll hear from a 32 00:01:29,200 --> 00:01:31,440 Speaker 1: fraud prevention expert who is going to share some cold 33 00:01:31,480 --> 00:01:34,160 Speaker 1: hard facts with us about how these scams are implemented 34 00:01:34,480 --> 00:01:36,880 Speaker 1: and how to avoid being a victim. So to both 35 00:01:36,920 --> 00:01:39,440 Speaker 1: of our guests, really looking forward to that conversation, and 36 00:01:39,440 --> 00:01:43,320 Speaker 1: to those of you listening today who have had direct experience, 37 00:01:43,360 --> 00:01:46,840 Speaker 1: whether you have been a victim, a near victim, maybe 38 00:01:46,840 --> 00:01:48,800 Speaker 1: you saw them a mile away and you were able 39 00:01:48,840 --> 00:01:51,320 Speaker 1: to spot them and react. Maybe right now you are 40 00:01:51,360 --> 00:01:53,880 Speaker 1: watching a family member who you think is being sucked in. 41 00:01:54,160 --> 00:01:55,920 Speaker 1: If you want to share with us, you are welcome.
42 00:01:56,120 --> 00:01:58,120 Speaker 1: You are welcome to send a WhatsApp to seven 43 00:01:58,200 --> 00:02:02,120 Speaker 1: to five six seven five six seven. Let's start with 44 00:02:02,120 --> 00:02:04,800 Speaker 1: the writer Susan Harrison with me in studio today, the 45 00:02:04,840 --> 00:02:07,880 Speaker 1: author of Red Flags and Roses, which is her account 46 00:02:08,160 --> 00:02:12,000 Speaker 1: of her experiences with online dating and some of the 47 00:02:12,040 --> 00:02:14,720 Speaker 1: really funny anecdotes, some of them quite serious, some of 48 00:02:14,760 --> 00:02:17,360 Speaker 1: them into the realm of the scams and would-be 49 00:02:17,440 --> 00:02:19,720 Speaker 1: con artists. And Susan, thanks for coming into studio to 50 00:02:19,720 --> 00:02:21,400 Speaker 1: share this with us. It's great to have you with us. 51 00:02:21,600 --> 00:02:23,720 Speaker 3: Thank you so much for having me today, Pippa. A 52 00:02:23,800 --> 00:02:25,359 Speaker 3: huge pleasure, wonderful to be here. 53 00:02:25,480 --> 00:02:27,760 Speaker 1: So you dipped your toes back into the dating pool 54 00:02:27,919 --> 00:02:30,239 Speaker 1: after your eleven year marriage came to an end. Had 55 00:02:30,360 --> 00:02:32,079 Speaker 1: you ever tried online dating before then? 56 00:02:32,400 --> 00:02:35,840 Speaker 3: Never. And in fact, it was relatively new in those days, 57 00:02:36,480 --> 00:02:38,960 Speaker 3: because this book spans fifteen years. 58 00:02:39,480 --> 00:02:42,000 Speaker 1: Gosh, okay. So I didn't realize the time span was 59 00:02:42,080 --> 00:02:45,320 Speaker 1: quite that extensive. Yes. So you really gave it a 60 00:02:45,320 --> 00:02:47,120 Speaker 1: full and thorough vetting, didn't you? 61 00:02:48,160 --> 00:02:51,040 Speaker 3: I did. It wasn't always online a lot of the time. 62 00:02:51,240 --> 00:02:53,880 Speaker 3: I was.
You know, initially I 63 00:02:53,919 --> 00:02:57,520 Speaker 3: was very, very busy and had a very demanding career, 64 00:02:58,200 --> 00:03:01,880 Speaker 3: and I really wasn't looking for the one. Actually, I was 65 00:03:01,960 --> 00:03:05,520 Speaker 3: more interested in meeting someone, in 66 00:03:05,680 --> 00:03:10,119 Speaker 3: just having a boyfriend, but preferably at a distance, someone 67 00:03:10,600 --> 00:03:14,360 Speaker 3: who didn't live in Cape Town. And 68 00:03:14,440 --> 00:03:17,560 Speaker 3: then it evolved into different needs and wants over the years, 69 00:03:17,639 --> 00:03:20,000 Speaker 3: and sometimes there was a four year gap where I 70 00:03:20,040 --> 00:03:23,600 Speaker 3: didn't go online at all. But it has 71 00:03:23,760 --> 00:03:27,040 Speaker 3: been an amazing journey for me, and I know that 72 00:03:27,120 --> 00:03:31,320 Speaker 3: there's a lot of talk around scammers. My advice to 73 00:03:31,440 --> 00:03:35,440 Speaker 3: women in particular is, if you have never gone online, 74 00:03:36,040 --> 00:03:42,000 Speaker 3: treat this like you would any business venture or something 75 00:03:42,080 --> 00:03:45,160 Speaker 3: that you're going to do that needs a lot of thought. 76 00:03:45,640 --> 00:03:47,640 Speaker 3: So what is it that you want to achieve from 77 00:03:47,720 --> 00:03:51,160 Speaker 3: the online dating? That's your first objective. The second one 78 00:03:51,280 --> 00:03:57,200 Speaker 3: is to establish the dos and don'ts. What wouldn't work 79 00:03:57,240 --> 00:03:59,120 Speaker 3: for you, you know? So there are two lots of 80 00:03:59,160 --> 00:04:02,560 Speaker 3: red flags. The one is what doesn't work for you: 81 00:04:03,000 --> 00:04:05,440 Speaker 3: if you don't like dogs, or you don't like cats, 82 00:04:05,600 --> 00:04:09,080 Speaker 3: or you're not a cyclist, you're not athletic.
Those are 83 00:04:09,160 --> 00:04:14,440 Speaker 3: very important things, because we get very caught up in 84 00:04:14,520 --> 00:04:18,120 Speaker 3: the conversation with a new person and we tend to 85 00:04:18,240 --> 00:04:23,240 Speaker 3: ignore those very factors that we initially have said are 86 00:04:23,320 --> 00:04:24,160 Speaker 1: Important to us. 87 00:04:24,320 --> 00:04:28,719 Speaker 3: Absolutely, and you find yourself there sort of engaging with someone, 88 00:04:28,800 --> 00:04:31,200 Speaker 3: spending time with someone who ultimately is going to be 89 00:04:31,240 --> 00:04:33,920 Speaker 3: a waste of time, because 90 00:04:34,240 --> 00:04:35,680 Speaker 3: they're not going to meet your need and you're not 91 00:04:35,720 --> 00:04:38,760 Speaker 3: going to meet their need. And the other very very 92 00:04:38,760 --> 00:04:41,680 Speaker 3: obvious red flag, and I'm not sure that one can 93 00:04:41,760 --> 00:04:45,960 Speaker 3: say this, but I'm very conscious of the fact that 94 00:04:46,000 --> 00:04:49,960 Speaker 3: a lot of the dating sites are owned by one organization, 95 00:04:51,200 --> 00:04:55,200 Speaker 3: so usually there's a head organization, and then 96 00:04:55,240 --> 00:05:03,520 Speaker 3: they have different apps that serve different objectives. And the owners 97 00:05:04,400 --> 00:05:07,320 Speaker 3: of the platforms, as far as I'm concerned, are aware that these fraudsters 98 00:05:07,360 --> 00:05:11,000 Speaker 3: are online, so it's 99 00:05:11,000 --> 00:05:15,400 Speaker 3: income generating. And the big red flag is one picture 100 00:05:15,480 --> 00:05:21,440 Speaker 3: only, a short bio, or three pictures that are very 101 00:05:21,480 --> 00:05:25,159 Speaker 3: photoshopped, and you kind of get the feeling, is this 102 00:05:25,240 --> 00:05:29,240 Speaker 3: the same guy?
And the other famous one is they're 103 00:05:29,240 --> 00:05:33,440 Speaker 3: a project manager or they're an engineer and they're working 104 00:05:33,440 --> 00:05:36,960 Speaker 3: abroad but they're wanting to come back to South Africa, 105 00:05:37,040 --> 00:05:40,400 Speaker 3: or they're coming down to do something and would like 106 00:05:40,440 --> 00:05:44,760 Speaker 3: to engage with you, and they immediately start to profess 107 00:05:44,920 --> 00:05:49,719 Speaker 3: love at the very onset. You're so beautiful. Oh, I've 108 00:05:49,800 --> 00:05:52,760 Speaker 3: never seen such a beautiful woman. I know that you're 109 00:05:52,760 --> 00:05:56,560 Speaker 3: the love of my life. And you've barely said two words. 110 00:05:56,960 --> 00:05:57,479 Speaker 2: Yeah. 111 00:05:57,520 --> 00:06:00,360 Speaker 3: And here's the kicker, this is the one that I love. 112 00:06:01,080 --> 00:06:05,719 Speaker 3: I always spend very little time engaging online. I like 113 00:06:05,800 --> 00:06:10,599 Speaker 3: to get a WhatsApp number. I don't necessarily give my number, 114 00:06:11,240 --> 00:06:14,800 Speaker 3: but I get a WhatsApp number and then I phone 115 00:06:14,839 --> 00:06:21,440 Speaker 3: that number randomly to see who answers. And invariably it's 116 00:06:21,480 --> 00:06:24,760 Speaker 3: an accent or a voice that makes you go, hang on. You 117 00:06:24,760 --> 00:06:27,200 Speaker 3: think to yourself, hang on a minute, that doesn't match 118 00:06:27,240 --> 00:06:31,520 Speaker 3: the profile at all, this is something that's completely off. Yeah, yeah, 119 00:06:31,520 --> 00:06:36,080 Speaker 3: and you pick that up very very quickly. I'm trying 120 00:06:36,080 --> 00:06:36,480 Speaker 3: to think. 121 00:06:36,360 --> 00:06:40,520 Speaker 1: You've said something so important there around the offline conversation.
122 00:06:40,600 --> 00:06:42,680 Speaker 1: And I know when we speak to our fraud prevention 123 00:06:42,760 --> 00:06:44,560 Speaker 1: guest, they're going to flag this as well. It's one 124 00:06:44,600 --> 00:06:47,080 Speaker 1: of the big big red flags if the person is 125 00:06:47,160 --> 00:06:51,200 Speaker 1: resisting having an actual conversation with you voice to voice, 126 00:06:51,680 --> 00:06:53,800 Speaker 1: if not face to face, at least voice to voice. 127 00:06:53,839 --> 00:06:55,840 Speaker 1: If it's always a WhatsApp or I can't talk now, 128 00:06:55,880 --> 00:06:57,640 Speaker 1: I'm in a meeting, I'm going to phone you back, 129 00:06:57,920 --> 00:07:00,960 Speaker 1: and it's always a text or a WhatsApp. There's generally 130 00:07:01,000 --> 00:07:03,040 Speaker 1: a reason for that, and that is because the voice 131 00:07:03,160 --> 00:07:06,479 Speaker 1: attached to this person does not match the profile you 132 00:07:06,480 --> 00:07:07,279 Speaker 1: think you're talking to. 133 00:07:07,320 --> 00:07:11,680 Speaker 3: Absolutely. The other thing also is that you need to 134 00:07:11,760 --> 00:07:19,360 Speaker 3: familiarize yourself with ChatGPT, because that is an incredibly informative platform. 135 00:07:19,480 --> 00:07:24,280 Speaker 3: If you're new to online dating, do a course. You 136 00:07:24,360 --> 00:07:27,320 Speaker 3: can do a basic course on ChatGPT. 137 00:07:27,520 --> 00:07:29,720 Speaker 3: It'll tell you exactly how to pose the questions. You 138 00:07:29,760 --> 00:07:32,880 Speaker 3: pose the questions exactly as you need them. It'll tell 139 00:07:32,920 --> 00:07:36,080 Speaker 3: you which online sites to aim for, it'll tell you 140 00:07:36,120 --> 00:07:39,840 Speaker 3: all the dos and don'ts. It is really really informative, 141 00:07:40,320 --> 00:07:44,880 Speaker 3: so get yourself informed.
You know, you're not going to 142 00:07:45,000 --> 00:07:48,880 Speaker 3: jump out of an aeroplane without doing some research as 143 00:07:48,920 --> 00:07:52,800 Speaker 3: to what your reality is, how safe it might be. 144 00:07:53,480 --> 00:07:56,400 Speaker 3: So it really is about taking a good, hard look 145 00:07:56,440 --> 00:08:01,560 Speaker 3: at yourself and taking responsibility really for the reality that 146 00:08:01,640 --> 00:08:05,600 Speaker 3: you're going online and the fact that there are already 147 00:08:05,680 --> 00:08:08,920 Speaker 3: so many red flags around online dating. You know, it's 148 00:08:08,920 --> 00:08:12,960 Speaker 3: a serious business. It's not just a flash in the pan. 149 00:08:13,080 --> 00:08:14,680 Speaker 3: You've got to really know what you're doing. 150 00:08:15,280 --> 00:08:18,680 Speaker 1: And I really want to thank you for coming in 151 00:08:18,680 --> 00:08:20,520 Speaker 1: in person to share some of these tips with us. 152 00:08:20,520 --> 00:08:22,480 Speaker 1: And we're going to pick up after the news headlines 153 00:08:22,520 --> 00:08:25,400 Speaker 1: on what you've just said about using technology to help 154 00:08:25,440 --> 00:08:28,320 Speaker 1: you use technology more safely, because I think there are 155 00:08:28,360 --> 00:08:31,720 Speaker 1: some very practical, easy to use tools that can help 156 00:08:31,760 --> 00:08:34,840 Speaker 1: you very very quickly discern whether you're talking to a 157 00:08:34,840 --> 00:08:37,559 Speaker 1: genuine human being or a fake profile. Yeah, I mean 158 00:08:37,640 --> 00:08:39,800 Speaker 1: somebody's messaged me, Keith, saying I've heard of a woman 159 00:08:39,800 --> 00:08:42,840 Speaker 1: who thinks she's communicating with Jannik Sinner. Brad 160 00:08:42,920 --> 00:08:45,079 Speaker 1: Pitt was the famous one for many years that people 161 00:08:45,080 --> 00:08:47,320 Speaker 1: were falling for.
I mean, you know, you read that 162 00:08:47,440 --> 00:08:50,440 Speaker 1: out and you think, how on earth could anybody believe 163 00:08:51,080 --> 00:08:54,480 Speaker 1: that they are genuinely in a relationship with a famous megastar, 164 00:08:54,960 --> 00:08:57,439 Speaker 1: and yet it is happening. And we're going to talk 165 00:08:57,480 --> 00:08:59,719 Speaker 1: to our second guest about how it's happening and the 166 00:08:59,800 --> 00:09:03,360 Speaker 1: kind of social engineering that is making that kind of thing possible. 167 00:09:04,800 --> 00:09:09,520 Speaker 3: You're with Cape Talk listeners, Pippa Hudson on Lunch. 168 00:09:09,960 --> 00:09:13,559 Speaker 1: We're back continuing our conversation about online dating scams, and 169 00:09:14,200 --> 00:09:16,120 Speaker 1: just to recap before the break, you were listening to 170 00:09:16,160 --> 00:09:18,600 Speaker 1: Susan Harrison, who's still with me in studio, the author 171 00:09:18,640 --> 00:09:21,600 Speaker 1: of a book called Red Flags and Roses, which details 172 00:09:21,640 --> 00:09:25,080 Speaker 1: her experiences, warts and all I should say, of Harry 173 00:09:25,080 --> 00:09:28,719 Speaker 1: Baxon, or Susan more accurately, of online dating. We'll leave 174 00:09:28,760 --> 00:09:31,360 Speaker 1: that story for another day, but of course, I mean 175 00:09:31,400 --> 00:09:33,640 Speaker 1: you were raising a couple of really important points before 176 00:09:33,679 --> 00:09:35,000 Speaker 1: the break, and I just want to circle back to 177 00:09:35,040 --> 00:09:36,920 Speaker 1: one or two of your experiences, and then we are 178 00:09:36,920 --> 00:09:39,760 Speaker 1: going to bring in a fraud prevention expert to take 179 00:09:40,040 --> 00:09:42,080 Speaker 1: a sort of a cold hard look at some of 180 00:09:42,080 --> 00:09:44,840 Speaker 1: the forms this is taking. You have several anecdotes in 181 00:09:44,880 --> 00:09:48,679 Speaker 1: your book, Susan.
The first approach was from Miguel. Miguel, working 182 00:09:48,720 --> 00:09:51,640 Speaker 1: on a project in Mozambique, doesn't speak great English, but 183 00:09:51,800 --> 00:09:54,280 Speaker 1: is really keen to meet somebody in South Africa and 184 00:09:54,400 --> 00:09:56,160 Speaker 1: is planning to be spending six months of the year 185 00:09:56,200 --> 00:09:59,520 Speaker 1: in France and six months in SA. And what I 186 00:09:59,600 --> 00:10:02,120 Speaker 1: laughed about in the story is how you responded to him 187 00:10:02,160 --> 00:10:05,040 Speaker 1: once you'd spotted the con, because you didn't just block 188 00:10:05,120 --> 00:10:07,560 Speaker 1: him and move on. You actually decided to dig 189 00:10:07,679 --> 00:10:09,120 Speaker 1: right in there. Tell us what happened. 190 00:10:10,160 --> 00:10:13,120 Speaker 3: Well, after a while, when you've been doing a bit 191 00:10:13,120 --> 00:10:15,480 Speaker 3: of online dating, you get quite savvy, and I just 192 00:10:15,480 --> 00:10:17,480 Speaker 3: thought to myself, you know, I really have to play 193 00:10:17,600 --> 00:10:22,960 Speaker 3: along with this one. And what I did was I really engaged 194 00:10:23,000 --> 00:10:24,840 Speaker 3: with him, and I was, Oh, my goodness, I can't 195 00:10:24,880 --> 00:10:29,360 Speaker 3: wait to meet you. I'm so excited. And then he said, 196 00:10:29,520 --> 00:10:34,920 Speaker 3: you know, unfortunately my girlfriend has emptied my bank account 197 00:10:34,960 --> 00:10:39,479 Speaker 3: and I'm going to need some assistance.
And I then responded 198 00:10:39,520 --> 00:10:42,559 Speaker 3: after a fair amount of communication with him. 199 00:10:42,600 --> 00:10:45,080 Speaker 3: I said to him, you know, I have to be 200 00:10:45,160 --> 00:10:49,000 Speaker 3: honest with you, but I actually don't have any money, 201 00:10:49,880 --> 00:10:54,680 Speaker 3: and I also don't look like my pictures, and I've 202 00:10:54,720 --> 00:10:59,440 Speaker 3: put on quite a lot of weight, and my car's 203 00:10:59,520 --> 00:11:02,040 Speaker 3: broken and my son's going to fix it for me. 204 00:11:02,840 --> 00:11:06,560 Speaker 3: But I'll try and see what I can do about 205 00:11:06,760 --> 00:11:09,280 Speaker 3: at least getting a friend to take me to the 206 00:11:09,360 --> 00:11:13,000 Speaker 3: airport to meet you. And that was the end of Everard. 207 00:11:12,960 --> 00:11:17,400 Speaker 1: Never heard from again. He just disappeared, and all the declarations of 208 00:11:17,440 --> 00:11:20,640 Speaker 1: love went with him. I mean, there are a couple of similar anecdotes, 209 00:11:20,679 --> 00:11:24,280 Speaker 1: and okay, I mean, you are a 210 00:11:24,440 --> 00:11:28,720 Speaker 1: very independent, strong-minded, strong personality. You're not going to 211 00:11:28,920 --> 00:11:31,280 Speaker 1: suffer fools. I mean that's very clear reading this book 212 00:11:31,320 --> 00:11:34,240 Speaker 1: and meeting you in person, Susan, that you had 213 00:11:34,280 --> 00:11:37,640 Speaker 1: eyes wide open for this kind of thing. But as 214 00:11:37,679 --> 00:11:40,800 Speaker 1: we heard before the break, there are unfortunately a huge 215 00:11:40,920 --> 00:11:43,800 Speaker 1: number of people who are falling for it.
They either 216 00:11:43,920 --> 00:11:46,880 Speaker 1: don't know to look out for it or are 217 00:11:47,040 --> 00:11:51,280 Speaker 1: dragged into it because they are handled with such skill 218 00:11:51,320 --> 00:11:52,959 Speaker 1: by the fraudsters. And this is where I want to 219 00:11:52,960 --> 00:11:55,480 Speaker 1: bring in this concept of social engineering and bring in 220 00:11:55,520 --> 00:11:58,920 Speaker 1: our second guest this afternoon, because listening in to what happened before was 221 00:12:00,160 --> 00:12:02,600 Speaker 1: Elmi Camp, who is communications lead for the South African Fraud 222 00:12:02,679 --> 00:12:06,040 Speaker 1: Prevention Service and is joining us via Zoom this afternoon. 223 00:12:06,040 --> 00:12:07,960 Speaker 1: And Elmi, thank you so much. We really value your 224 00:12:08,000 --> 00:12:11,480 Speaker 1: time and input this afternoon. Thanks for listening in to that 225 00:12:11,520 --> 00:12:14,000 Speaker 1: first part and for joining us. Now welcome to 226 00:12:14,040 --> 00:12:14,360 Speaker 1: the show. 227 00:12:15,400 --> 00:12:16,640 Speaker 2: Thanks, it's good to be here. 228 00:12:17,520 --> 00:12:20,000 Speaker 1: And can we just raise the volume so that Susan's 229 00:12:20,040 --> 00:12:23,480 Speaker 1: able to hear Elmi. Thanks, Alessandro. I mean, Susan 230 00:12:23,559 --> 00:12:25,880 Speaker 1: was just relating before the break. She was able to 231 00:12:25,920 --> 00:12:28,280 Speaker 1: spot the fraudsters a mile away. She could see the 232 00:12:28,320 --> 00:12:31,320 Speaker 1: sort of the progression of the love bombing that came first, 233 00:12:31,360 --> 00:12:34,400 Speaker 1: and then the subtle request for some money that became 234 00:12:34,480 --> 00:12:36,880 Speaker 1: less and less subtle the further it went on. That's 235 00:12:37,120 --> 00:12:42,200 Speaker 1: a fairly sort of traditional progression of the scam.
But 236 00:12:42,679 --> 00:12:44,840 Speaker 1: the question is how often this is happening. What are 237 00:12:44,880 --> 00:12:47,560 Speaker 1: you seeing at the Fraud Prevention Service in terms 238 00:12:47,559 --> 00:12:50,360 Speaker 1: of how often people are reporting that they actually fell 239 00:12:50,400 --> 00:12:53,360 Speaker 1: for it, that they were fraudulently handled in an online 240 00:12:53,440 --> 00:12:56,320 Speaker 1: dating scam, that they handed over money and never saw 241 00:12:56,360 --> 00:12:58,480 Speaker 1: it again. Is it happening more and more frequently in 242 00:12:58,520 --> 00:12:59,160 Speaker 1: South Africa? 243 00:13:01,000 --> 00:13:04,560 Speaker 2: I think the short answer is that it is happening 244 00:13:04,640 --> 00:13:07,760 Speaker 2: consistently and continuously, and we can see that it is. 245 00:13:08,559 --> 00:13:10,560 Speaker 2: It's evolving. Let me put it that way. And you 246 00:13:10,720 --> 00:13:16,120 Speaker 2: very carefully mentioned social engineering. This is 247 00:13:16,160 --> 00:13:20,120 Speaker 2: a very skilled tactic where they essentially use your behavior 248 00:13:20,160 --> 00:13:22,160 Speaker 2: and what you share with the world against you to 249 00:13:22,280 --> 00:13:24,839 Speaker 2: influence or deceive you. You know, they study you, they 250 00:13:24,920 --> 00:13:27,800 Speaker 2: profile you, and they document key details that they can 251 00:13:27,960 --> 00:13:31,960 Speaker 2: use in their scams. So really, the more you put 252 00:13:32,000 --> 00:13:34,320 Speaker 2: online and the more we act online, which we are 253 00:13:34,360 --> 00:13:37,240 Speaker 2: doing every single day, the more prevalent it's going to be. 254 00:13:38,640 --> 00:13:40,439 Speaker 1: It's so true what you've said, and I 255 00:13:40,520 --> 00:13:43,160 Speaker 1: mean I've experienced it.
I'm sure every one of my 256 00:13:43,240 --> 00:13:47,000 Speaker 1: colleagues has experienced it. You know, we're on public platforms 257 00:13:47,040 --> 00:13:49,040 Speaker 1: and we do share a certain amount of ourselves on 258 00:13:49,080 --> 00:13:52,360 Speaker 1: public platforms, and there are people out there who will 259 00:13:52,640 --> 00:13:55,079 Speaker 1: try to use that against you and try to insinuate 260 00:13:55,120 --> 00:13:57,600 Speaker 1: themselves using a little nugget of detail, that oh, 261 00:13:57,640 --> 00:14:00,880 Speaker 1: I met you and your daughter, she's in medicine, for example, 262 00:14:00,880 --> 00:14:03,000 Speaker 1: as if that means they have actually met 263 00:14:03,080 --> 00:14:05,520 Speaker 1: me in person, and you learn to spot it, but 264 00:14:05,600 --> 00:14:07,719 Speaker 1: not everybody knows to look out for it. And Elmi, 265 00:14:08,720 --> 00:14:10,920 Speaker 1: let's just dig into what that looks like. Because you're 266 00:14:10,960 --> 00:14:13,720 Speaker 1: talking there about people using our own information that we 267 00:14:13,920 --> 00:14:16,800 Speaker 1: have put out there against us. Do you want to 268 00:14:16,800 --> 00:14:18,839 Speaker 1: elaborate a little bit on the kinds of things you're 269 00:14:18,880 --> 00:14:21,440 Speaker 1: talking about and just raise a few red flags about 270 00:14:21,720 --> 00:14:24,960 Speaker 1: the things people should think twice about sharing on public 271 00:14:24,960 --> 00:14:26,640 Speaker 1: platforms, because this might happen? 272 00:14:27,720 --> 00:14:29,880 Speaker 2: Yeah, sure. So I think what we need to keep 273 00:14:29,920 --> 00:14:33,640 Speaker 2: in mind is that these scammers are exceptionally skilled at 274 00:14:33,680 --> 00:14:38,920 Speaker 2: crafting a persona that you would be attracted to or 275 00:14:39,000 --> 00:14:42,800 Speaker 2: drawn to.
So they really look at who you are following, 276 00:14:42,880 --> 00:14:45,120 Speaker 2: they look at your posts, they look at your comments, they 277 00:14:45,120 --> 00:14:47,840 Speaker 2: look at your hobbies, what you're sharing. Those are the 278 00:14:47,920 --> 00:14:49,880 Speaker 2: kind of things that they use to create a 279 00:14:50,000 --> 00:14:53,760 Speaker 2: profile that is your ideal partner, because that is what 280 00:14:53,800 --> 00:14:58,040 Speaker 2: you are looking for. So they have a pretty predictable 281 00:14:58,160 --> 00:15:00,840 Speaker 2: behavioral cycle. So I'm just going to give you a 282 00:15:00,920 --> 00:15:03,760 Speaker 2: very high-level idea of what that is. So they 283 00:15:03,840 --> 00:15:08,320 Speaker 2: identify emotional vulnerability. So they look at how people are acting. 284 00:15:08,400 --> 00:15:12,480 Speaker 2: Sometimes people may appear slightly lonely, or they've recently been 285 00:15:12,520 --> 00:15:16,400 Speaker 2: divorced or widowed, or they seem isolated on their public profile, 286 00:15:17,200 --> 00:15:20,120 Speaker 2: so that gives them an in. They believe this person 287 00:15:20,400 --> 00:15:25,120 Speaker 2: could be looking for connection. Then they create emotional dependency. 288 00:15:26,040 --> 00:15:28,400 Speaker 2: Like you said, they send those daily messages and they 289 00:15:28,400 --> 00:15:31,840 Speaker 2: really compliment you, like Susan was saying, they support you. 290 00:15:31,840 --> 00:15:34,600 Speaker 2: You feel so understood and seen and chosen. And not 291 00:15:34,720 --> 00:15:39,280 Speaker 2: everyone has their armor up like Susan did, so they 292 00:15:39,320 --> 00:15:43,200 Speaker 2: feel so valued in that moment. Then they also, 293 00:15:43,360 --> 00:15:46,480 Speaker 2: very cleverly, will try and isolate victims.
They do 294 00:15:46,520 --> 00:15:49,840 Speaker 2: this quite slickly, so they'll discourage them from telling friends 295 00:15:49,880 --> 00:15:52,760 Speaker 2: or family about this relationship, saying things like no one 296 00:15:52,760 --> 00:15:56,760 Speaker 2: will understand and that their connection is so unique and special. 297 00:15:57,600 --> 00:16:00,520 Speaker 2: And then after that, once they've gained your trust and they've 298 00:16:00,520 --> 00:16:04,760 Speaker 2: built this connection, there comes the financial manipulation angle. This can 299 00:16:04,800 --> 00:16:08,080 Speaker 2: take many forms, so you know, this can be an 300 00:16:08,120 --> 00:16:12,000 Speaker 2: emergency or their child is very sick, or their bank 301 00:16:12,080 --> 00:16:14,160 Speaker 2: has blocked their card and they're stuck in a foreign 302 00:16:14,200 --> 00:16:16,920 Speaker 2: country and they don't have money, so they're quite desperate, 303 00:16:17,040 --> 00:16:19,960 Speaker 2: and they make you believe that they're actually too embarrassed 304 00:16:19,960 --> 00:16:23,520 Speaker 2: to ask, or they're not really asking. So often victims 305 00:16:23,760 --> 00:16:27,040 Speaker 2: will offer to help them, and then they'll be forever 306 00:16:27,120 --> 00:16:29,560 Speaker 2: in their debt, kind of thing, and then you know, 307 00:16:29,640 --> 00:16:31,840 Speaker 2: once the scam is complete and they've taken it as 308 00:16:31,920 --> 00:16:33,760 Speaker 2: far as they can, they disappear. 309 00:16:35,240 --> 00:16:37,520 Speaker 1: I mean, I'm listening to you and I'm thinking, I mean, 310 00:16:37,560 --> 00:16:40,440 Speaker 1: this is a process of grooming. We talk about 311 00:16:40,480 --> 00:16:44,320 Speaker 1: grooming in the context of predators online grooming children to 312 00:16:44,360 --> 00:16:47,240 Speaker 1: be their victims, but you're talking about exactly the same behavior.
313 00:16:47,280 --> 00:16:52,480 Speaker 1: These patterns very subtly insinuate themselves into your lived reality 314 00:16:53,240 --> 00:16:55,040 Speaker 1: so that you learn to trust them. You let your 315 00:16:55,040 --> 00:16:58,640 Speaker 1: guard down, and you start to consider something that you 316 00:16:58,680 --> 00:17:02,640 Speaker 1: know you wouldn't ordinarily, in your wildest dreams, consider doing. 317 00:17:03,000 --> 00:17:04,959 Speaker 1: And I want to highlight what you just said. It 318 00:17:04,960 --> 00:17:08,000 Speaker 1: can be, you know, stuff that is publicly available about you. 319 00:17:08,080 --> 00:17:10,080 Speaker 1: I'm thinking now of the description on all of my 320 00:17:10,240 --> 00:17:13,680 Speaker 1: social media profiles, which say I am a bulldog lover, 321 00:17:13,840 --> 00:17:16,760 Speaker 1: I am a surfer. So of course there's information 322 00:17:16,800 --> 00:17:20,000 Speaker 1: that somebody could potentially use against me. If their profile 323 00:17:20,000 --> 00:17:22,880 Speaker 1: photograph is going to be a person with a bulldog 324 00:17:22,920 --> 00:17:25,399 Speaker 1: in the photograph, I'm instinctively going to be going, oh, well, 325 00:17:25,440 --> 00:17:28,120 Speaker 1: this person must be okay. Now on that note, 326 00:17:28,200 --> 00:17:31,240 Speaker 1: Elmi, the photographs are the one way we can 327 00:17:31,840 --> 00:17:33,840 Speaker 1: do a little bit of amateur sleuthing to check 328 00:17:33,880 --> 00:17:36,719 Speaker 1: that people are who they say they are.
In the 329 00:17:36,720 --> 00:17:40,400 Speaker 1: age of AI, I know it's harder than ever 330 00:17:40,440 --> 00:17:44,840 Speaker 1: to spot a fake photograph, but tips like reverse 331 00:17:44,920 --> 00:17:49,840 Speaker 1: searching photo images, et cetera, online can help us spot 332 00:17:50,240 --> 00:17:52,680 Speaker 1: a photograph that's just been lifted off somebody else's site, 333 00:17:52,680 --> 00:17:55,280 Speaker 1: for example. Would you advocate for that kind of behavior, 334 00:17:55,320 --> 00:17:58,080 Speaker 1: that little bit of sort of PI detective work? 335 00:17:59,040 --> 00:18:01,560 Speaker 2: Well, I mean, I would definitely advocate for you to 336 00:18:01,680 --> 00:18:05,200 Speaker 2: do your research. But you know, not everyone has access 337 00:18:05,200 --> 00:18:08,280 Speaker 2: to this kind of technology, so you know, our advice 338 00:18:08,359 --> 00:18:10,920 Speaker 2: is really going back to basics. I mean, I'll talk 339 00:18:10,920 --> 00:18:13,399 Speaker 2: about red flags now, but one of those things that 340 00:18:13,480 --> 00:18:16,480 Speaker 2: we tell you to do to protect yourself is to 341 00:18:16,560 --> 00:18:19,400 Speaker 2: do a bit of googling, do a bit of research, 342 00:18:19,520 --> 00:18:22,439 Speaker 2: look at the various profiles and the photos, because you 343 00:18:22,560 --> 00:18:26,119 Speaker 2: often will find that there are inconsistencies, and I mean, you 344 00:18:26,240 --> 00:18:28,640 Speaker 2: just have to trust your gut with that kind of thing. So, yes, 345 00:18:28,720 --> 00:18:31,720 Speaker 2: there are technologies that you can use, but we're also 346 00:18:31,800 --> 00:18:34,680 Speaker 2: saying trust your gut, do a bit of research, verify, 347 00:18:35,200 --> 00:18:37,760 Speaker 2: talk to people, talk to people around you.
Does this 348 00:18:37,880 --> 00:18:40,000 Speaker 2: sound funny? Because I think the problem is we 349 00:18:40,040 --> 00:18:45,040 Speaker 2: don't speak to people. We need to be skeptical as well, 350 00:18:45,119 --> 00:18:47,960 Speaker 2: and careful not to share information with people. 351 00:18:49,560 --> 00:18:52,600 Speaker 1: Susan's nodding in agreement in the background here with a 352 00:18:52,640 --> 00:18:54,480 Speaker 1: lot of what you're saying, and Susan, I'm going to circle 353 00:18:54,520 --> 00:18:56,240 Speaker 1: back to you in a moment, but just picking up 354 00:18:56,280 --> 00:18:58,000 Speaker 1: on what you said about the red flags, I mean, 355 00:18:58,200 --> 00:19:00,119 Speaker 1: what in your opinion are the big red flags to 356 00:19:00,119 --> 00:19:01,960 Speaker 1: look out for? 357 00:19:02,040 --> 00:19:04,600 Speaker 2: Well, Susan actually did also touch on some of that. That 358 00:19:04,800 --> 00:19:09,159 Speaker 2: is that evasive behavior. You know, their availability is limited. 359 00:19:09,200 --> 00:19:11,640 Speaker 2: They'll only speak to you on their terms. They're 360 00:19:11,680 --> 00:19:14,199 Speaker 2: avoiding conversations when you can talk to them, or in-person 361 00:19:14,280 --> 00:19:17,280 Speaker 2: meetings. You know, they show an intense interest in 362 00:19:17,320 --> 00:19:20,520 Speaker 2: your life, but they also tend to avoid answering personal 363 00:19:20,600 --> 00:19:24,320 Speaker 2: questions about themselves or meeting face to face. Or their 364 00:19:24,359 --> 00:19:27,280 Speaker 2: stories seem a little bit inconsistent and maybe something doesn't 365 00:19:27,280 --> 00:19:30,600 Speaker 2: add up; that is an immediate red flag.
Then also 366 00:19:30,720 --> 00:19:33,760 Speaker 2: if they have an unusual interest in personal details, so 367 00:19:33,840 --> 00:19:36,760 Speaker 2: they ask you a lot of questions about yourself, you know, 368 00:19:36,800 --> 00:19:39,480 Speaker 2: and that way they create that profile about you. That 369 00:19:39,560 --> 00:19:42,639 Speaker 2: could be pet names, birthdays and anniversaries, things like that, 370 00:19:43,280 --> 00:19:47,120 Speaker 2: or even other sensitive information, because with that personal detail 371 00:19:47,560 --> 00:19:50,959 Speaker 2: they can potentially try and commit impersonation fraud later on, 372 00:19:51,119 --> 00:19:55,280 Speaker 2: if they gain enough information about you. Then also, if 373 00:19:55,280 --> 00:19:58,080 Speaker 2: they start asking for money, you know, that is also 374 00:19:58,119 --> 00:20:00,640 Speaker 2: an immediate red flag. You know, they'll be coming 375 00:20:00,680 --> 00:20:03,240 Speaker 2: to you for money, especially if it's on an online 376 00:20:03,520 --> 00:20:06,679 Speaker 2: profile and you guys are not serious; that is an 377 00:20:06,720 --> 00:20:09,199 Speaker 2: immediate red flag. And if they start talking to you 378 00:20:09,200 --> 00:20:13,640 Speaker 2: about investment opportunities, or other great things that you can 379 00:20:13,680 --> 00:20:17,320 Speaker 2: invest in that maybe worked for them or something like that. 380 00:20:18,000 --> 00:20:22,280 Speaker 2: And yeah, I think those are the big, big red flags, Elmi. 381 00:20:22,400 --> 00:20:26,920 Speaker 1: Thanks, and they're so important to highlight and to raise 382 00:20:26,920 --> 00:20:29,520 Speaker 1: awareness of because, as I said a moment ago, 383 00:20:29,720 --> 00:20:33,320 Speaker 1: with AI enabling us to do so much more in 384 00:20:33,400 --> 00:20:37,440 Speaker 1: terms of creating and photoshopping photos, et cetera.
The deepfake 385 00:20:37,520 --> 00:20:39,800 Speaker 1: phenomenon is what I'm talking about. I mean, the 386 00:20:39,840 --> 00:20:43,239 Speaker 1: potential to use that, and the scammers are very, very 387 00:20:43,320 --> 00:20:46,080 Speaker 1: good at using it and making it seem so authentic 388 00:20:46,119 --> 00:20:49,720 Speaker 1: and so believable. This is how they're getting at us. 389 00:20:50,080 --> 00:20:52,240 Speaker 1: Before we let you go, anything else you'd like to mention, 390 00:20:52,760 --> 00:20:56,200 Speaker 1: particularly around if you suspect somebody is a scammer, or 391 00:20:56,240 --> 00:20:59,240 Speaker 1: you want to report somebody who's tried to scam you online? 392 00:20:59,320 --> 00:21:01,159 Speaker 1: Is that something you'd encourage? Because I know a lot of 393 00:21:01,160 --> 00:21:04,400 Speaker 1: people are too embarrassed to admit that it happened to them, 394 00:21:04,520 --> 00:21:07,080 Speaker 1: or to admit that they almost fell for it, which 395 00:21:07,119 --> 00:21:09,639 Speaker 1: is the better scenario. Those that did fall victim, I 396 00:21:09,680 --> 00:21:12,119 Speaker 1: know, are particularly hesitant to come forward because they're so 397 00:21:12,160 --> 00:21:14,840 Speaker 1: embarrassed that they fell for it. But it is really 398 00:21:14,880 --> 00:21:18,400 Speaker 1: important to spread the word if it happens, is it not? 399 00:21:19,880 --> 00:21:22,600 Speaker 2: Definitely, and there is really no shame in being caught 400 00:21:22,640 --> 00:21:25,160 Speaker 2: out in a scam like this, because of how advanced 401 00:21:25,200 --> 00:21:27,960 Speaker 2: they are. There's really no shame, and your story can 402 00:21:28,000 --> 00:21:30,480 Speaker 2: help someone else. So we do say please report it, 403 00:21:30,520 --> 00:21:33,760 Speaker 2: report it to the relevant organization, to the platform, to 404 00:21:33,800 --> 00:21:36,119 Speaker 2: the authorities, if you want to.
You can even report 405 00:21:36,160 --> 00:21:39,119 Speaker 2: it to Yima, which is a product of the SAFPS. 406 00:21:39,960 --> 00:21:42,640 Speaker 2: So there's more on that on our website. And 407 00:21:42,680 --> 00:21:46,359 Speaker 2: the intelligence gathered from the reporting that you do helps 408 00:21:46,400 --> 00:21:50,760 Speaker 2: prevent future fraud and also enables those organizations or platforms 409 00:21:50,840 --> 00:21:55,200 Speaker 2: to potentially add additional layers of protection within their processes 410 00:21:55,560 --> 00:21:58,600 Speaker 2: to prevent this from happening to other people. And 411 00:21:58,640 --> 00:22:00,679 Speaker 2: then the last thing we ask people to do is 412 00:22:00,720 --> 00:22:04,320 Speaker 2: to talk about it, spread the news. Telling others 413 00:22:04,320 --> 00:22:07,000 Speaker 2: what happened to you, or what you've heard about, is 414 00:22:07,040 --> 00:22:10,160 Speaker 2: what will prevent other people from falling victim. 415 00:22:11,440 --> 00:22:13,440 Speaker 1: Elmi Camp, thank you so much for your time 416 00:22:13,480 --> 00:22:16,840 Speaker 1: this afternoon. Some very important practical advice there from the 417 00:22:16,840 --> 00:22:20,840 Speaker 1: communications lead for the SA Fraud Prevention Service, and please 418 00:22:20,920 --> 00:22:25,280 Speaker 1: do Google SAFPS or Yima, Y-I-M-A, which is 419 00:22:25,280 --> 00:22:29,120 Speaker 1: their product for reporting this kind of thing. Susan, back 420 00:22:29,160 --> 00:22:33,320 Speaker 1: to you. Did you do any amount of sleuthing and 421 00:22:33,960 --> 00:22:36,360 Speaker 1: following up on people? Because I've heard that some very 422 00:22:36,359 --> 00:22:40,080 Speaker 1: savvy users of online dating platforms dig deep.
They go 423 00:22:40,119 --> 00:22:43,080 Speaker 1: and look at LinkedIn profiles, they phone companies to check 424 00:22:43,119 --> 00:22:45,320 Speaker 1: that the person works where they said they worked, etc. 425 00:22:46,000 --> 00:22:46,880 Speaker 1: Did you go that far? 426 00:22:47,040 --> 00:22:49,840 Speaker 3: Absolutely. One of the first things I would do is 427 00:22:50,080 --> 00:22:53,680 Speaker 3: ask if they were on Facebook and friend request them 428 00:22:53,720 --> 00:22:56,760 Speaker 3: on Facebook. And the big red flag is when you 429 00:22:56,840 --> 00:22:59,480 Speaker 3: see a Facebook page and there's virtually nothing on it 430 00:23:00,040 --> 00:23:04,200 Speaker 3: and it's just been created. And Facebook tells 431 00:23:04,240 --> 00:23:09,359 Speaker 3: you a lot about people, because that history stretches way back. 432 00:23:09,760 --> 00:23:16,919 Speaker 3: It'll tell you about family, friends, pastimes, interests. It really, 433 00:23:17,000 --> 00:23:21,520 Speaker 3: really is very informative. And the very point that was 434 00:23:21,640 --> 00:23:26,520 Speaker 3: raised a little earlier, about the questions that the fraudsters are asking: 435 00:23:27,320 --> 00:23:30,600 Speaker 3: you can do a bit of reverse psychology. You can ask the 436 00:23:30,680 --> 00:23:36,280 Speaker 3: questions and avoid giving answers. So when you get posed questions, 437 00:23:36,920 --> 00:23:41,639 Speaker 3: you simply throw the question back without answering, so that 438 00:23:41,760 --> 00:23:46,560 Speaker 3: you then become the interrogator and you are creating an 439 00:23:46,640 --> 00:23:50,160 Speaker 3: idea of the profile that you're dealing with, and invariably 440 00:23:50,600 --> 00:23:51,560 Speaker 3: they don't respond. 441 00:23:51,640 --> 00:23:52,840 Speaker 1: Do you want to give us an example? 442 00:23:54,680 --> 00:23:58,680 Speaker 3: An example would be: I noticed that you have children.
443 00:23:59,040 --> 00:24:02,879 Speaker 3: How old are they? Do you have custody? What is 444 00:24:02,920 --> 00:24:06,240 Speaker 3: your relationship like with your children? Are you very close 445 00:24:06,280 --> 00:24:09,360 Speaker 3: to them? Do you have a large circle of friends, 446 00:24:09,800 --> 00:24:13,960 Speaker 3: or would you call yourself a loner? If he's athletic: do 447 00:24:14,040 --> 00:24:17,040 Speaker 3: you belong to a cycling club? How often do you 448 00:24:17,080 --> 00:24:20,840 Speaker 3: go to the gym? I noticed that you say you're 449 00:24:20,880 --> 00:24:24,320 Speaker 3: in finance; what kind of finance are you in? So 450 00:24:24,840 --> 00:24:28,320 Speaker 3: there's a host of questions you can ask. What are 451 00:24:28,359 --> 00:24:31,200 Speaker 3: your interests? I love art, how do you feel about it? 452 00:24:31,560 --> 00:24:34,040 Speaker 3: What's the genre of music that you like to listen to? 453 00:24:34,160 --> 00:24:36,200 Speaker 3: I didn't even know what a genre meant, to be 454 00:24:36,280 --> 00:24:39,680 Speaker 3: honest, until I heard it from someone who said, what genre 455 00:24:39,720 --> 00:24:43,040 Speaker 3: do you like? I was like, what's a genre? I had to ask 456 00:24:43,119 --> 00:24:44,240 Speaker 3: him, because he was a muso. 457 00:24:46,840 --> 00:24:50,200 Speaker 1: Just for anybody who's coming late to this conversation: previously 458 00:24:50,240 --> 00:24:53,800 Speaker 1: you were hearing from the South African Fraud Prevention Service, 459 00:24:53,840 --> 00:24:57,000 Speaker 1: and that was Elmi Camp, their communications lead, talking about 460 00:24:57,040 --> 00:24:59,960 Speaker 1: the red flags and the importance of reporting. The voice 461 00:25:00,119 --> 00:25:02,679 Speaker 1: in studio with me today.
Susan Harrison, the author of 462 00:25:02,800 --> 00:25:05,920 Speaker 1: Red Flags and Roses, and Everything in Between, I should 463 00:25:05,920 --> 00:25:07,920 Speaker 1: add, which is just the subtitle. And it is a very 464 00:25:07,960 --> 00:25:11,399 Speaker 1: candid account of her own more than a decade of 465 00:25:11,440 --> 00:25:15,119 Speaker 1: exploring the world of online dating and its highs and lows, 466 00:25:15,119 --> 00:25:18,840 Speaker 1: and included in those anecdotes is the story of a couple 467 00:25:18,840 --> 00:25:21,159 Speaker 1: of would-be scammers who didn't manage to get the 468 00:25:21,160 --> 00:25:23,200 Speaker 1: better of Susan. Or if they did, she hasn't revealed 469 00:25:23,240 --> 00:25:25,440 Speaker 1: it in the book, we should say, but I suspect 470 00:25:25,440 --> 00:25:27,720 Speaker 1: that they probably bit off more than they could chew 471 00:25:27,760 --> 00:25:31,520 Speaker 1: if they tried. I mean, we've spoken a lot about 472 00:25:31,800 --> 00:25:34,240 Speaker 1: the things to be alert to, and that evasive behavior, 473 00:25:34,359 --> 00:25:36,960 Speaker 1: them never wanting to talk in person. Your tip of 474 00:25:37,680 --> 00:25:39,320 Speaker 1: finding a phone number and then using it at a 475 00:25:39,440 --> 00:25:41,800 Speaker 1: random time when they're not expecting to hear from you 476 00:25:41,920 --> 00:25:44,119 Speaker 1: is a very quick way to catch somebody off guard. 477 00:25:45,480 --> 00:25:47,719 Speaker 1: Things like, you go and look at Truecaller, 478 00:25:47,800 --> 00:25:50,520 Speaker 1: check if the person's number is registered on Truecaller, 479 00:25:50,560 --> 00:25:53,919 Speaker 1: because maybe they are red-flagged by somebody else who 480 00:25:54,000 --> 00:25:56,400 Speaker 1: said this is a scam artist looking for money, for example.
481 00:25:57,119 --> 00:25:59,359 Speaker 1: Just do those few checks and, you know, a little 482 00:25:59,359 --> 00:26:04,000 Speaker 1: bit of digging and investigation yourself, to very quickly weed 483 00:26:04,080 --> 00:26:07,159 Speaker 1: out the obvious fakers. Of course, the problem is the 484 00:26:07,160 --> 00:26:09,399 Speaker 1: ones who are really, really good at it, Susan, who are 485 00:26:09,480 --> 00:26:14,000 Speaker 1: using the modern technology, like the AI deepfake capacity, etc., 486 00:26:14,760 --> 00:26:18,120 Speaker 1: to build a very believable profile, and who are committed 487 00:26:18,200 --> 00:26:19,840 Speaker 1: and prepared to put the time in to 488 00:26:19,960 --> 00:26:24,040 Speaker 1: build a long-standing profile that is believable. You really 489 00:26:24,080 --> 00:26:25,960 Speaker 1: have got to have your wits about you. I don't 490 00:26:26,000 --> 00:26:28,040 Speaker 1: want to finish on a negative, though. I do want 491 00:26:28,040 --> 00:26:30,479 Speaker 1: to end by saying that, I mean, there are more 492 00:26:30,600 --> 00:26:32,680 Speaker 1: roses than red flags in your book. There are some very 493 00:26:32,680 --> 00:26:36,639 Speaker 1: funny anecdotes of the dates that didn't go well, but 494 00:26:36,680 --> 00:26:39,520 Speaker 1: there are also some lovely examples of really deep connections 495 00:26:39,520 --> 00:26:42,400 Speaker 1: that you made with people. Looking back on it now, 496 00:26:42,880 --> 00:26:45,719 Speaker 1: do you regret the time that you spent on online 497 00:26:45,760 --> 00:26:48,639 Speaker 1: dating, or do you feel that it was valuable and worthwhile? 498 00:26:49,160 --> 00:26:54,400 Speaker 3: Absolutely valuable and worthwhile. It is something I highly recommend. 499 00:26:54,800 --> 00:26:58,320 Speaker 3: You will make some really good friends. You don't necessarily 500 00:26:58,359 --> 00:27:03,560 Speaker 3: meet the love of your life.
But I think what 501 00:27:03,680 --> 00:27:06,639 Speaker 3: it does is it teaches us to grow, because every 502 00:27:06,680 --> 00:27:09,639 Speaker 3: time you meet with someone, a potential date, or you 503 00:27:09,720 --> 00:27:13,439 Speaker 3: engage with someone, you learn a lot about yourself that 504 00:27:13,520 --> 00:27:19,280 Speaker 3: you weren't necessarily aware of. And sometimes you actually meet 505 00:27:19,359 --> 00:27:22,240 Speaker 3: up with somebody who literally has an office around 506 00:27:22,240 --> 00:27:24,800 Speaker 3: the corner from where your office is, and that is 507 00:27:24,920 --> 00:27:28,120 Speaker 3: astounding, because you are literally metres away from each other 508 00:27:28,119 --> 00:27:31,480 Speaker 3: and you don't know that either one of you exists. 509 00:27:32,040 --> 00:27:38,800 Speaker 3: And I've had an incredible experience. I've met some amazing people. 510 00:27:39,240 --> 00:27:42,080 Speaker 3: My current partner took up the last seven chapters of 511 00:27:42,119 --> 00:27:45,280 Speaker 3: my book, and we're still together. 512 00:27:46,520 --> 00:27:49,800 Speaker 1: So it's not to say don't do it, and it's 513 00:27:49,840 --> 00:27:52,960 Speaker 1: not to say that it can't work. It's just: go 514 00:27:53,080 --> 00:27:54,320 Speaker 1: in with your wits about you. 515 00:27:54,480 --> 00:27:57,080 Speaker 3: Yes, absolutely, absolutely. 516 00:27:56,720 --> 00:27:58,320 Speaker 1: Well, I want to thank you for being so frank 517 00:27:58,400 --> 00:28:00,520 Speaker 1: in what you've shared with us, not just in the interview, but 518 00:28:00,560 --> 00:28:02,920 Speaker 1: in the book as well. Final question to you, Susan: 519 00:28:02,960 --> 00:28:04,440 Speaker 1: if our listeners would like to get hold of 520 00:28:04,440 --> 00:28:07,119 Speaker 1: a copy of your book, it's called Red Flags and Roses, 521 00:28:07,119 --> 00:28:07,919 Speaker 1: where can they buy it?
522 00:28:08,520 --> 00:28:11,560 Speaker 3: They would be able to buy it online at either 523 00:28:11,640 --> 00:28:16,960 Speaker 3: Takealot or on Amazon. Alternatively, if they go 524 00:28:17,040 --> 00:28:20,320 Speaker 3: onto the Facebook page that I've just created, Red Flags 525 00:28:20,359 --> 00:28:23,480 Speaker 3: and Roses and Everything in Between, I will take it 526 00:28:23,560 --> 00:28:27,000 Speaker 3: upon myself to get a copy of the book for you, 527 00:28:27,080 --> 00:28:28,919 Speaker 3: sign it and post it off to you. So just 528 00:28:29,560 --> 00:28:31,359 Speaker 3: liaise directly with me on Facebook. 529 00:28:31,480 --> 00:28:33,560 Speaker 1: And she is real. I'm looking at her in the flesh, 530 00:28:33,600 --> 00:28:36,080 Speaker 1: having a face-to-face, voice-to-voice conversation. I 531 00:28:36,080 --> 00:28:37,880 Speaker 1: can vouch for the fact that even though the Facebook 532 00:28:37,920 --> 00:28:40,640 Speaker 1: page is new, it doesn't mean she's a scam artist. 533 00:28:40,960 --> 00:28:42,720 Speaker 1: Super lovely to meet you, and thanks so much for 534 00:28:42,800 --> 00:28:43,400 Speaker 1: joining us today. 535 00:28:43,400 --> 00:28:46,600 Speaker 3: Well, thank you so much for the opportunity. It's absolutely wonderful. 536 00:28:46,640 --> 00:28:47,120 Speaker 1: That's fun