Speaker 1: Let me read you the last text I got: Great opportunity up ahead. A stock is about to go up sixty percent. Would you like a trading signal? Here's another one that I got earlier: Hi, Jennifer, what's the news? That's all it says. I get these all the time. You probably do too, maybe they even call you Jennifer. And it seems no matter what I do, these things keep coming in. So scams themselves aren't new, but technology is definitely adding to them.

Speaker 2: What's changed is that now people are using computers to do the communications, to create the communications, where previously they would have literally had to press buttons on the phone. It opens up the floodgates in terms of the volume of messages that you can create, and the numbers of people that you can hurt around the world.

Speaker 1: Eric Priezkalns has been in the telecommunications business for a long time. He's worked at T-Mobile, at Sky, at WorldCom, and that's just some of the places. In two thousand and six, he founded a site called Commsrisk, and he left the corporate world.
Speaker 1: Commsrisk reports on the telecom industry and specifically the crimes that are associated with it, including the influx of scam texts. He says scams via text have risen over the last decade as technology has made them easier to send en masse.

Speaker 2: Then you had the pandemic, and criminals couldn't commit some of the crimes that they'd previously committed, so that became an accelerant for those changes.

Speaker 1: If it feels like you're getting more scam texts these days, it's not just your imagination. According to Consumer Reports, scam texts have increased by fifty percent in the last year alone. And they're not just increasing in number, they're getting a lot more convincing.

Speaker 2: The sophistication of the criminal patterns we're seeing now is a lot more elaborate than in the past. This is a very rapid evolution we're seeing, where we've gone from machines that make very, very simple, repetitive scam communications to now far more subtle patterns that may not be so easy to identify.
Speaker 1: Talking to Eric was an opportunity to get into the world of text scams: how they're being sent, who's behind them, and if there's anything you can do to stop them. Just as a short answer to that last one: yes, we probably could stop scam texts, but it's a lot more complicated than it sounds.

Speaker 2: Sometimes we fixate on the triviality of scams and organized crime. Oh, you've got an annoying message. Oh, I got a phone call, I didn't like the phone call. We're not seeing the bigger picture of how this is having an impact on people's lives that's become almost unimaginable.

Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas.

Speaker 3: I'm sharing. I'm sharing. Goodbye.

Speaker 1: When I say scam text, you probably know exactly what I'm talking about. These are things like fake recruiter messages, fake delivery messages, wrong number texts. Some of these are obviously scams because they come from weird email addresses or use weird characters in place of letters.
Speaker 1: These are all workarounds that scammers have found to bypass some of the security measures that telephone companies have put in place. But some of the newer methods that scammers are using are getting harder and harder to detect, by the telephone networks and by us. One that's getting really popular around the world is called a SIM box.

Speaker 2: Think of them as phones, but with one hundred SIM cards or two hundred SIM cards in them. So instead of, like, a phone making one call or sending out one message at a time, it can do two hundred things at once.

Speaker 1: There's nothing nefarious about the basic technology behind these SIM boxes. I mean, think of a phone with a dual SIM feature. If you're in the middle of nowhere and the first phone network dies, you can just switch over to the other network. It's not that easy to do normally, but if you have two SIM cards, one from your primary network and one for your backup network, it's really easy to do. So just imagine that with, say, dozens of SIM cards, or hundreds, and now you have the basic idea for a SIM box.
Speaker 1: From here, you can combine multiple SIM boxes to make what people call a SIM farm, and that can send out millions of messages per minute.

Speaker 2: In New York, three hundred of these things were found by the US Secret Service recently.

Speaker 1: This was back in September, and we still don't know a whole lot about this operation. What we do know is what the Secret Service has told us, and they reported that they uncovered a massive SIM box operation in New York: three hundred SIM boxes containing a total of one hundred thousand SIM cards, spread across multiple abandoned apartment buildings within a thirty-five-mile radius.

Speaker 2: They could have sent an SMS message to every American within twelve minutes. That's the level of capacity we're talking about here.

Speaker 1: That's not to say that they did that. We actually don't know what these scammers were doing. One of the agents on the investigation said that this operation could have, quote, "disabled cell phone towers and essentially shut down the cell phone network in New York City." So it goes beyond just sending some annoying texts.
Speaker 1: And again, the Secret Service isn't really telling us what these scammers were doing. But if we look at some other cases, we might find some hints.

Speaker 2: In Latvia, two hundred of these things were found, and it's almost certainly the case that what happened in New York was the same as what happened in Latvia. What were they used for? Distributing child pornography, extortion. But one of the big things that they were being used for was to receive one SMS message. Why would you want to receive one SMS message? Well, if you're setting up a WhatsApp account, it gets tied to a phone number. So if you're creating all of these spam and scam accounts on WhatsApp, you need somebody to provide the phone numbers to create the accounts. They're using the SIM maybe only once, then throwing it away, because they have the phone number. The WhatsApp account gets validated because it received the one-time password by SMS to create the account, right? And then it doesn't matter even if the telco blocks that SIM card, because the WhatsApp account exists.
Speaker 2: So you can create hundreds of thousands of online accounts, and in the Latvian case, they estimated that forty-nine million online accounts had been created by that particular criminal outfit.

Speaker 1: Eric points out that scam messages are coming to us in all different ways, not just through SMS messages, but through WhatsApp, through Signal, and all kinds of different messaging apps. Telecom companies can try to detect suspicious activity and block certain numbers, but scammers are usually a step ahead, and the latest technique bypasses the telephone companies completely. It's called an SMS blaster.

Speaker 2: That's a device that essentially does the same thing that the radio antenna on a mobile phone network does when you connect your mobile phone to the network. The difference is it's not a legitimate part of the network, so that's why they're sometimes called fake base stations, false base stations, rogue base stations.
Speaker 2: When you're normally walking around a city, walking around a town, or driving around, you're going to be transferred from one base station to another base station on your same provider's network, whichever has the strongest signal. These rogue base stations will have the strongest signal for people in their specific location. It's like being connected to a network, except, of course, now the base station is a rogue one. It can fire lots of messages at you, and that's typically how they're used.

Speaker 1: These SMS blasters can connect to your phone without you noticing, and once one is connected, it can send you messages. And because it's directly connected to you, the telecom company has no ability to block it. An SMS blaster can also spoof the sender ID to look like, say, for example, a legitimate text from your bank. So what's to stop an SMS blaster from just spamming everyone in the country? Well, they don't have very good range, but that doesn't really matter, because they're portable.
Speaker 2: Just the other day, somebody had been using one of these SMS blasters on the London Underground, the capital's subway system, to fire out lots and lots of scam SMS messages to people as they're standing on the platform or traveling on the trains through the underground. In Auckland, for example, there was a student. He was paid four hundred dollars a day to drive around one of these devices, sending scam SMS messages to whoever was in range. And the point of the message is it includes a hyperlink that points towards a phishing website that impersonates something that looks like a legitimate business or government function.

Speaker 1: The technology behind SMS blasters, again, was not developed specifically for scamming. It's just been co-opted by the scammers. It's actually similar to the technology behind what's known as stingrays. You might have heard about stingrays recently. Those are the devices that police sometimes use to capture data from people at protests. But instead of taking data off your phone for surveillance, the SMS blasters push data onto your phone, in the form of very convincing scam texts.
Speaker 2: We're not talking about a completely new, separate technology that was only created for criminals. We're talking about radio technology for communications networks, and it's just being adapted, slightly tweaked, for a different kind of use.

Speaker 1: So in a way, these almost sound kind of primitive. I mean, it's really just brute-forcing its way into your phone. But you've got to drive up pretty close. What is the range of these things? Is it like a couple of miles or something like that?

Speaker 2: It varies. Some of them are less, so you could have a handheld device that has only a few hundred yards. The advantage of that, of course: much more discreet, much easier to carry around. There have been cases where people have carried them around in backpacks through shopping malls, for example. Or you might have a larger device with a larger range but heavier, so you'll have to put it in the back of the car or somewhere like that.
Speaker 2: In one case in Bangkok, there was said to be a particularly powerful device that had a range of three kilometers and was able to send something in the order of one hundred thousand SMS messages per hour. Wow. But that's not necessarily a good thing in terms of being a criminal, because you don't want to be caught. And that's why, I would argue, the London Underground example I talked about is almost the perfect example, because it's about how many people you have passing by. You don't necessarily need to have...

Speaker 1: ...a huge range. So that's some background on how you're getting these texts. It's kind of wild to think that when you get one of these, it could just be some random person with a backpack across the street. So who are these people, and what do they actually want? That's where it starts to get heavy, and that's after the break.

Speaker 1: I want to walk through a sample scam text, and if you could, tell me what's going on here. What are they trying to do? So I'm just going to read you one here. Hi.
Speaker 1: I'm Caroline from the We Work Remotely human resources team. We recently saw your excellent background and would like to introduce you to a remote, high-paying job opportunity. The work is so easy you can do it from home. You'll work about sixty minutes a day, and we'll provide you with free training. Daily pay ranges from three hundred to seven hundred. If you're interested, please contact us via WhatsApp or Telegram. What's the endgame here?

Speaker 2: There's a couple of things going on. Always bear in mind that the criminal is just looking for potential victims. Some people are more suggestible, some people are more open to suggestion, and so just the very fact that you respond means that you're now a better target for all sorts of scams. So there doesn't have to necessarily be a linear relationship between the content of the message and the scam. But in that case, the linear scam would typically be: they've got you on the hook.
Speaker 2: They're gonna convince you that you're gonna make a lot of money down the line, and maybe you have to pay a fee in order to have your application processed or something like that. So they're going to hook you in for spending some money in order to then get the money that's going to be promised to you later on.

Speaker 1: And maybe if...

Speaker 2: You pay once, then they're gonna make you pay a second time, because once you've paid once, you really don't want to have that money wasted by not paying a second fee. So then you can see how a more susceptible person gets dragged along until eventually they realize they've got to stop. So that would be a very typical way in which that scam would evolve.

Speaker 1: It's interesting, because these scam texts are so diverse. Another one that I think a lot of people have seen is just a completely random text out of the blue saying, how are you, what's going on? Or, hey Jennifer, how are you? And I'm not Jennifer, so right now I can say: this isn't Jennifer, wrong number.
Speaker 2: We tend to call them now, in the industry, the "hi dad," the "hey mom" scam, because it's quite often a term like that, right? So it seems like it's somebody familiar. You associate it in your mind with somebody that you know, and that's the start of the process. But again, it's the same sequence there, where they're just looking to see who's going to reply. They just want to get you into a conversation. And then maybe you do reply back and you go, I'm not your mom, I'm a man, and they go, well, I'm very sorry, right, I have the wrong number. But you see, the point is you've already replied back, you've already engaged with them on some level. And so can they turn the conversation around to selling cryptocurrency or something like that? It's all about that initial engagement. It's also about trying to get you off the channel of communication that was used in the beginning, which could be an SMS message, towards something encrypted, because if it's encrypted, it's much harder for any technology to be used to monitor the behavior, to intervene, to protect you.
Speaker 2: So that's another thing that they're looking to do: just to try and get you somewhere else. So some scams are very simple, they stop very quickly, and the goal is very clear very early on. Some are a long game. They just want to hook you in, and they will take you as far as they possibly can. They're running a lot of scams, these organized criminals. They'll just find the one that you are most susceptible to.

Speaker 1: So who exactly are these criminals with the time and patience to play out a long game of this scale? The world behind this is a lot bigger than whoever sent you that annoying text.

Speaker 2: When we talk about some of the cyber crimes that go on in the world today, there's this pattern where we talk about kids, maybe teenagers, maybe early twenties. They're learning from each other on Discord. They're going to certain forums and they're sharing advice on how to commit crime, and that's definitely a way in which crime gets propagated. But to some extent, those people are amateurs. Not everybody's an amateur.
Speaker 2: So what I think you're seeing is a degree to which the professionals are teaching each other how to commit crime. And also, they're franchising crime. So they may provide a criminal service, and the point of the criminal service is that somebody else will come along and use the service to commit a more specific crime on top of it, and that then has a multiplier effect. Not every criminal needs to understand how a phone network works. It's capitalism, but in the criminal world. Why try to do everything yourself when you can get very, very good at a layer of crime and then sell your services to somebody at a different layer?

Speaker 1: This is when you know crime is getting sophisticated: when people specialize in one area to reduce their liability. So that text about a great crypto deal, or even the random "hi Jennifer" text, likely has a lot of people working on different layers behind it. And it gets pretty dark pretty quick.
Speaker 2: In Myanmar, in Cambodia, and in Laos, which are the three countries that have taken a lot of the scam compounds that used to be based in China and have become the new homes for scam compounds, it's estimated that there's about one hundred thousand people working in scam compounds. And there's a lot of tragic stories about people being murdered, tortured, kidnapped. It's effectively modern-day slavery. We're talking about huge numbers of people being persuaded to move to a foreign country to work in these zones with offers of jobs, well-paid jobs, well-paid IT jobs. Some people move just because they know they're going to work for a criminal enterprise and they don't care. Other people are kidnapped. I mean, there was a terrible case that I was reading the other day. A nineteen-year-old Chinese boy was sold into slavery by a seventeen-year-old Chinese girl. You know, she had befriended him, she had become his girlfriend. She had told him that she had family running businesses in Myanmar. They flew to Thailand.
Speaker 2: They traveled overland to the border, and when they arrived at the border, there was a bunch of goons to grab hold of him. She went and had a ten-day holiday in Thailand, and she had pocketed, I think it was something in the order of fifteen thousand dollars, that she'd got paid for selling him, selling him into slavery. And to be fair to the Chinese government, who are alive to this problem, they are now prosecuting her. She returned to China and they're going to do something about her. But that poor boy was working sixteen-to-twenty-hour shifts in the scam compound in Myanmar for four months, until his friends and family were able to pull together a ransom of five thousand dollars to get him out. In the meantime, the poor lad was beaten, beaten because he wasn't meeting his targets. That poor lad has lost some of his hearing as a result of the beating. And that's not even by far the worst case that I could recount to you. I mean, there are some absolutely appalling stories.
The only good thing 337 00:19:20,480 --> 00:19:22,880 Speaker 2: you can say about AI is that if AI gets 338 00:19:22,960 --> 00:19:24,680 Speaker 2: so good that you don't need to make a human 339 00:19:24,720 --> 00:19:26,359 Speaker 2: being a slave to do it anymore, at least we 340 00:19:26,400 --> 00:19:28,880 Speaker 2: won't have the problem of human trafficking. In a way, 341 00:19:28,880 --> 00:19:31,760 Speaker 2: I look forward to that, because the level of human 342 00:19:31,800 --> 00:19:36,080 Speaker 2: misery that's occurring in some places is quite upsetting. 343 00:19:36,800 --> 00:19:37,919 Speaker 2: It's quite upsetting when you 344 00:19:37,920 --> 00:19:43,680 Speaker 2: read the stories. Speaker 1: A lot of the scam texts that 345 00:19:43,760 --> 00:19:46,639 Speaker 1: you get about stocks or selling crypto, or job recruiting, 346 00:19:46,760 --> 00:19:49,960 Speaker 1: or the random "hi Mom" or "oops, wrong number," those kinds 347 00:19:50,000 --> 00:19:52,240 Speaker 1: of texts, the ones that are playing the long game, 348 00:19:52,680 --> 00:19:57,119 Speaker 1: these likely originate from these scam compounds. This kind of 349 00:19:57,119 --> 00:20:01,080 Speaker 1: scam has exploded since twenty twenty. Why? Well, there is 350 00:20:01,119 --> 00:20:04,280 Speaker 1: a pretty simple theory behind it: Chinese organized criminals had 351 00:20:04,320 --> 00:20:07,040 Speaker 1: built casinos that were sitting empty during the pandemic, and 352 00:20:07,080 --> 00:20:09,960 Speaker 1: they needed something to do with the empty buildings, so 353 00:20:10,160 --> 00:20:13,880 Speaker 1: they converted them into scam compounds.
So if you engage 354 00:20:13,880 --> 00:20:16,240 Speaker 1: with one of those scam texts, there's a good chance 355 00:20:16,240 --> 00:20:18,240 Speaker 1: that the person you'll end up talking to is in 356 00:20:18,280 --> 00:20:21,440 Speaker 1: one of those buildings, being watched by guards in case 357 00:20:21,480 --> 00:20:22,359 Speaker 1: they try to escape. 358 00:20:25,480 --> 00:20:28,879 Speaker 2: I'm the criminal. I will hook you into talking to 359 00:20:28,960 --> 00:20:31,680 Speaker 2: somebody who's gonna appear like a beautiful woman, or into 360 00:20:31,680 --> 00:20:34,040 Speaker 2: talking to somebody who's gonna try and sell you cryptocurrency. 361 00:20:34,200 --> 00:20:36,240 Speaker 2: So what do I do? I send you a message, 362 00:20:36,640 --> 00:20:40,360 Speaker 2: and then, oh, whoops, it was the wrong number. I'm sorry, 363 00:20:40,480 --> 00:20:43,800 Speaker 2: I didn't mean to message you, right? That message was 364 00:20:43,840 --> 00:20:47,680 Speaker 2: an automated message. The follow-up message was an automated message. 365 00:20:47,800 --> 00:20:50,840 Speaker 2: They're sat in the scam compound waiting to see if 366 00:20:50,880 --> 00:20:54,480 Speaker 2: you're the kind of person who's gonna respond, who's gonna 367 00:20:54,480 --> 00:20:57,200 Speaker 2: get into a conversation. And that's why they employ all 368 00:20:57,240 --> 00:21:01,040 Speaker 2: these people in these scam compounds, just to have conversations with people, 369 00:21:01,560 --> 00:21:05,560 Speaker 2: weeding through the population to find who are the ones 370 00:21:05,920 --> 00:21:10,240 Speaker 2: who can be exploited, who will be open to this idea, 371 00:21:10,320 --> 00:21:14,280 Speaker 2: that idea. Maybe your weakness is a pretty girl.
You 372 00:21:14,359 --> 00:21:18,639 Speaker 2: don't know that you're messaging some guy from Manila, but 373 00:21:18,920 --> 00:21:21,160 Speaker 2: you're going to be seeing a picture of a beautiful woman. 374 00:21:21,320 --> 00:21:24,480 Speaker 2: They'll have a beautiful woman in the compound who will 375 00:21:24,520 --> 00:21:28,280 Speaker 2: be brought out to do the live video conversation when 376 00:21:28,320 --> 00:21:31,000 Speaker 2: it's needed. The rest of the time, they're just working, 377 00:21:31,040 --> 00:21:32,119 Speaker 2: working, working. 378 00:21:32,720 --> 00:21:35,240 Speaker 1: This is the background of all this stuff. On the 379 00:21:35,280 --> 00:21:38,080 Speaker 1: receiving end, we just see an annoying text, but the 380 00:21:38,119 --> 00:21:42,119 Speaker 1: operation behind each message can be massive. There's the technology 381 00:21:42,160 --> 00:21:44,199 Speaker 1: on the ground, and then there's the buildings full of 382 00:21:44,240 --> 00:21:48,399 Speaker 1: forced laborers. This hurts a lot of people, but it 383 00:21:48,400 --> 00:21:52,439 Speaker 1: continues because it's a viable business. How many people have 384 00:21:52,520 --> 00:21:55,080 Speaker 1: to actually answer and get hooked into this thing for 385 00:21:55,119 --> 00:21:56,159 Speaker 1: it to make money for you? 386 00:21:56,520 --> 00:22:00,040 Speaker 2: It will depend on the scam. If I'm running a 387 00:22:00,080 --> 00:22:03,560 Speaker 2: scam compound and I have somebody who's my prisoner, who 388 00:22:03,560 --> 00:22:06,879 Speaker 2: has to sit at a screen and cheat 389 00:22:06,960 --> 00:22:09,159 Speaker 2: you over the course of six months, well, I'm going 390 00:22:09,200 --> 00:22:11,119 Speaker 2: to need a much bigger amount of money from you 391 00:22:11,400 --> 00:22:13,600 Speaker 2: to make it worthwhile to do it. So it does 392 00:22:13,720 --> 00:22:16,080 Speaker 2: vary a lot depending upon the scam.
But what I 393 00:22:16,080 --> 00:22:21,320 Speaker 2: would say is, all the evidence shows that these scams 394 00:22:21,440 --> 00:22:24,680 Speaker 2: are so incredibly lucrative, whether at the low end or 395 00:22:24,720 --> 00:22:27,320 Speaker 2: at the high end. This thing is just ramping up 396 00:22:27,359 --> 00:22:30,880 Speaker 2: and up and up. They're just taking the accumulated profits 397 00:22:30,920 --> 00:22:34,399 Speaker 2: that they're generating from crime and reinvesting it in more crime. 398 00:22:34,600 --> 00:22:36,680 Speaker 2: And the things that we're doing to chip away at it, 399 00:22:37,040 --> 00:22:40,479 Speaker 2: we're not keeping pace. We're falling further and further behind. 400 00:22:40,680 --> 00:22:43,480 Speaker 2: So their margins, their profits, are getting higher and higher 401 00:22:43,560 --> 00:22:46,680 Speaker 2: and higher, and we are not at all sophisticated in 402 00:22:46,800 --> 00:22:48,399 Speaker 2: doing anything about tackling it. 403 00:22:50,160 --> 00:22:52,800 Speaker 1: So far, authorities haven't been able to stop these scams 404 00:22:52,800 --> 00:22:55,800 Speaker 1: from continuing to pop up. But is there anything that 405 00:22:55,920 --> 00:22:59,440 Speaker 1: we can do to stop these scam texts? That's after 406 00:22:59,480 --> 00:23:10,800 Speaker 1: the break. When it comes to stopping scam texts, the 407 00:23:10,800 --> 00:23:13,360 Speaker 1: advice you usually hear is learning how to spot them: 408 00:23:13,720 --> 00:23:16,240 Speaker 1: watch out for misspellings or weird formatting in the text, 409 00:23:16,440 --> 00:23:18,800 Speaker 1: don't click links if you don't know the sender, maybe 410 00:23:18,920 --> 00:23:21,959 Speaker 1: don't reply. We also hear that certain demographics are more 411 00:23:22,000 --> 00:23:24,000 Speaker 1: vulnerable and that we need to teach them to be 412 00:23:24,080 --> 00:23:27,399 Speaker 1: more savvy.
But all of this is kind of missing 413 00:23:27,400 --> 00:23:28,199 Speaker 1: the bigger picture. 414 00:23:28,840 --> 00:23:32,280 Speaker 2: It's been argued that older people are more vulnerable 415 00:23:32,320 --> 00:23:35,520 Speaker 2: to scams, but then there's also some conflicting data that 416 00:23:35,560 --> 00:23:38,359 Speaker 2: says actually, young people spend a lot more time on 417 00:23:38,400 --> 00:23:42,160 Speaker 2: the phone, they're a lot more casual, shall we say, 418 00:23:42,280 --> 00:23:45,399 Speaker 2: about what they do. They think of themselves as savvy, 419 00:23:45,600 --> 00:23:48,440 Speaker 2: so they fall for scams. And I think, really, what 420 00:23:48,920 --> 00:23:53,040 Speaker 2: we should be learning from the criminals is, however much 421 00:23:53,080 --> 00:23:56,320 Speaker 2: you train people, however much you educate people to be 422 00:23:56,480 --> 00:23:59,080 Speaker 2: scam aware so that they'll be less 423 00:23:59,160 --> 00:24:03,080 Speaker 2: likely to fall for it, those criminals are gaining a lot of 424 00:24:03,160 --> 00:24:08,920 Speaker 2: data about what works in terms of mental tricks, psychological ploys, manipulations. 425 00:24:09,400 --> 00:24:12,399 Speaker 2: They're masters at it, and they have an enormous amount 426 00:24:12,440 --> 00:24:17,040 Speaker 2: of data about what actually works in practice, so they're 427 00:24:17,080 --> 00:24:20,440 Speaker 2: always getting better and better and better. I'm not a psychologist, 428 00:24:20,800 --> 00:24:25,080 Speaker 2: but from what I understand about the topic, suggestibility is 429 00:24:25,160 --> 00:24:29,480 Speaker 2: not well correlated to intelligence, and that's what the criminals 430 00:24:29,560 --> 00:24:32,840 Speaker 2: are looking for. Are you suggestible in some way? Can 431 00:24:32,920 --> 00:24:36,120 Speaker 2: they plant an idea that you will act upon?
Can 432 00:24:36,160 --> 00:24:39,600 Speaker 2: they hook you with something that tempts you and motivates 433 00:24:39,600 --> 00:24:41,520 Speaker 2: you in a certain way? They're looking for your 434 00:24:41,560 --> 00:24:44,359 Speaker 2: weakness, and different people will have different weaknesses. 435 00:24:44,560 --> 00:24:46,080 Speaker 1: I mean, I think the really difficult thing 436 00:24:46,119 --> 00:24:48,000 Speaker 1: here is that there are probably some people 437 00:24:48,000 --> 00:24:50,560 Speaker 1: who started listening to this who were thinking, this is 438 00:24:50,600 --> 00:24:52,720 Speaker 1: going to be great, Dexter's got an expert here who's 439 00:24:52,760 --> 00:24:54,760 Speaker 1: going to give me the top three tips so that I won't 440 00:24:54,920 --> 00:24:59,120 Speaker 1: fall victim to a scam. And it doesn't sound 441 00:24:59,200 --> 00:25:00,320 Speaker 1: like you want to give that to me. 442 00:25:03,240 --> 00:25:05,480 Speaker 2: I mean, look, I could do that, and I could 443 00:25:05,520 --> 00:25:08,760 Speaker 2: say things like, don't let your phone get downgraded to 444 00:25:08,840 --> 00:25:11,760 Speaker 2: a two G network, because that's the typical method 445 00:25:11,840 --> 00:25:15,080 Speaker 2: for SMS blasters. And I could say, look for spelling 446 00:25:15,240 --> 00:25:17,439 Speaker 2: errors in the message, and you know, if there's a 447 00:25:17,520 --> 00:25:20,440 Speaker 2: hyperlink, if it's a link that's been shortened, 448 00:25:20,520 --> 00:25:23,639 Speaker 2: be suspicious, stuff like that. You know, if someone 449 00:25:23,640 --> 00:25:26,760 Speaker 2: makes a call, don't receive the call, call your bank 450 00:25:26,800 --> 00:25:29,639 Speaker 2: back instead. That's all good stuff, 451 00:25:29,680 --> 00:25:31,520 Speaker 2: that's all common sense stuff.
I'm not going to argue 452 00:25:31,560 --> 00:25:34,560 Speaker 2: that you shouldn't do that, but it's not enough. It's 453 00:25:34,600 --> 00:25:38,679 Speaker 2: never going to be enough, given the rate of acceleration in 454 00:25:38,800 --> 00:25:42,600 Speaker 2: criminal sophistication. To try to solve the problem like that 455 00:25:42,880 --> 00:25:44,679 Speaker 2: is fundamentally the wrong method. 456 00:25:46,920 --> 00:25:48,720 Speaker 1: So there is one small thing that you may be 457 00:25:48,760 --> 00:25:51,520 Speaker 1: able to do. If you have an Android phone, assuming 458 00:25:51,560 --> 00:25:55,040 Speaker 1: it's relatively recent, you can block those SMS blasters from 459 00:25:55,119 --> 00:25:56,880 Speaker 1: even being able to connect to your phone at all. 460 00:25:57,160 --> 00:26:00,440 Speaker 1: It's pretty simple. You just disable the two G switch in 461 00:26:00,480 --> 00:26:02,439 Speaker 1: your settings to do it. I put a link in 462 00:26:02,440 --> 00:26:04,720 Speaker 1: the show notes to a guide that walks you through it, 463 00:26:04,760 --> 00:26:07,720 Speaker 1: and trust me, it's very simple. It takes like thirty seconds. 464 00:26:08,160 --> 00:26:12,520 Speaker 1: For iPhones, unfortunately, you're kind of stuck. You can't turn 465 00:26:12,560 --> 00:26:15,640 Speaker 1: off two G unless you use Lockdown Mode, which would 466 00:26:15,680 --> 00:26:18,119 Speaker 1: also limit other functions in your phone that you probably 467 00:26:18,160 --> 00:26:21,760 Speaker 1: want to keep on. Other than that, again, base rule: 468 00:26:21,880 --> 00:26:23,199 Speaker 1: if you get a text with a link in it, 469 00:26:23,480 --> 00:26:26,000 Speaker 1: don't click it, even if it looks legit, even if 470 00:26:26,040 --> 00:26:28,880 Speaker 1: it's a text from your bank or your doctor. Call 471 00:26:28,960 --> 00:26:32,119 Speaker 1: them and confirm that it is actually from them.
I 472 00:26:32,119 --> 00:26:35,760 Speaker 1: know it's annoying, but it's better safe than sorry. But again, 473 00:26:36,000 --> 00:26:38,800 Speaker 1: all these things, even the features that let you report scams, 474 00:26:39,240 --> 00:26:42,200 Speaker 1: that's treating the symptom. If we really want to stop 475 00:26:42,240 --> 00:26:45,960 Speaker 1: these scams, the solution probably isn't in your phone settings. 476 00:26:46,680 --> 00:26:47,840 Speaker 1: It's in the phone companies. 477 00:26:49,200 --> 00:26:52,359 Speaker 2: The correct method is for the businesses to look at 478 00:26:52,440 --> 00:26:55,920 Speaker 2: who they do business with, who's buying the SIM cards, 479 00:26:56,080 --> 00:27:00,359 Speaker 2: who's making the bulk communications. These criminals had one 480 00:27:00,480 --> 00:27:05,959 Speaker 2: hundred thousand SIM cards in New York. Now that's a 481 00:27:06,000 --> 00:27:09,080 Speaker 2: lot of SIM cards. A phone network should be saying, 482 00:27:09,200 --> 00:27:11,480 Speaker 2: where have all our SIM cards gone? Why are all these 483 00:27:11,520 --> 00:27:13,960 Speaker 2: strange customers on our network? Why are they not behaving 484 00:27:14,040 --> 00:27:15,760 Speaker 2: like a normal person? Let me give you an example 485 00:27:15,800 --> 00:27:18,920 Speaker 2: about normal and abnormal behavior that a network should pick 486 00:27:18,960 --> 00:27:21,399 Speaker 2: up on. When you carry your phone around, when you 487 00:27:21,440 --> 00:27:24,919 Speaker 2: go around, you move, you move from place to place. 488 00:27:25,440 --> 00:27:28,320 Speaker 2: A SIM box doesn't move. When it's in a New York apartment, 489 00:27:28,880 --> 00:27:32,480 Speaker 2: it's sat there, not moving at all. That should be 490 00:27:32,880 --> 00:27:36,920 Speaker 2: a red flashing warning, like, why have I got such 491 00:27:36,960 --> 00:27:41,080 Speaker 2: a peculiar customer?
Why have I got so many peculiar 492 00:27:41,119 --> 00:27:46,080 Speaker 2: customers, all in the same place, never moving? Perhaps they 493 00:27:46,160 --> 00:27:49,720 Speaker 2: only make outbound communication. That would be weird, because a 494 00:27:49,840 --> 00:27:54,040 Speaker 2: normal person sends and receives. There should be lots of 495 00:27:54,200 --> 00:27:57,200 Speaker 2: flashing warning lights going off in those telcos. 496 00:27:57,800 --> 00:28:00,520 Speaker 1: I mean, this really reminds me also of the 497 00:28:00,520 --> 00:28:04,480 Speaker 1: conversation about detecting AI, right, which is to say, there 498 00:28:04,520 --> 00:28:07,160 Speaker 1: was a point at which you could give people tips 499 00:28:07,280 --> 00:28:10,240 Speaker 1: and you'd tell them, hey, look for too many fingers 500 00:28:10,240 --> 00:28:12,200 Speaker 1: on one hand. And I remember I was saying 501 00:28:12,200 --> 00:28:14,800 Speaker 1: this here on this same show back then: you 502 00:28:14,840 --> 00:28:18,520 Speaker 1: were fooling yourself if you thought that you were gonna 503 00:28:18,520 --> 00:28:21,640 Speaker 1: be able to use those same tricks six months from now. 504 00:28:22,280 --> 00:28:25,639 Speaker 1: And sure enough, you look at AI generated pictures, AI 505 00:28:25,680 --> 00:28:29,159 Speaker 1: generated videos, the hands look perfect. The idea that we 506 00:28:29,280 --> 00:28:32,720 Speaker 1: can just educate the public and that will solve everything, 507 00:28:32,920 --> 00:28:35,080 Speaker 1: it sounds like bringing a knife to a nuke fight, 508 00:28:35,160 --> 00:28:38,560 Speaker 1: you know, pardon the American violent imagery here, but it 509 00:28:39,040 --> 00:28:41,360 Speaker 1: just seems like it's not functional. 510 00:28:41,720 --> 00:28:46,120 Speaker 2: So true.
I mean, from my own personal experience: a few 511 00:28:46,120 --> 00:28:51,440 Speaker 2: months ago, I received my first ever spam call from 512 00:28:51,440 --> 00:28:55,240 Speaker 2: an AI, and I think I'm a pretty savvy guy 513 00:28:55,280 --> 00:28:58,240 Speaker 2: about this kind of stuff. And I listened to the 514 00:28:58,320 --> 00:29:01,560 Speaker 2: spam AI talk to me, and I had to ask 515 00:29:01,640 --> 00:29:08,800 Speaker 2: the question, are you an AI? Because it was not obvious. Now, thankfully, 516 00:29:09,200 --> 00:29:14,280 Speaker 2: that particular spam AI did honestly answer the question: yes, 517 00:29:14,720 --> 00:29:18,320 Speaker 2: I am an artificial intelligence, because it had been programmed 518 00:29:18,320 --> 00:29:20,480 Speaker 2: that way. But I had to think about it. I 519 00:29:20,520 --> 00:29:22,840 Speaker 2: had to ask the question. But what really upset me 520 00:29:22,880 --> 00:29:26,680 Speaker 2: about that interaction was, I continued my conversation with the AI, 521 00:29:27,120 --> 00:29:29,280 Speaker 2: not trying to buy the thing it wanted to sell me, 522 00:29:29,440 --> 00:29:33,400 Speaker 2: but actually trying to investigate and learn what was happening. It 523 00:29:33,520 --> 00:29:37,320 Speaker 2: claimed it received my phone number from one of the 524 00:29:37,320 --> 00:29:43,200 Speaker 2: world's biggest telecoms industry associations, a conference that I had 525 00:29:43,240 --> 00:29:47,320 Speaker 2: attended shortly beforehand, where I had to hand over my 526 00:29:47,480 --> 00:29:51,000 Speaker 2: phone number in order to register and participate. What does 527 00:29:51,040 --> 00:29:56,560 Speaker 2: that say about our attitude towards data? That call was illegal. 528 00:29:57,080 --> 00:29:59,960 Speaker 2: It was illegal because my number is on the list 529 00:30:00,120 --> 00:30:03,000 Speaker 2: in the UK.
Essentially, you're not allowed to call 530 00:30:03,040 --> 00:30:05,920 Speaker 2: me for marketing purposes. And yet one of the world's 531 00:30:06,000 --> 00:30:11,800 Speaker 2: biggest telecoms associations has handed over that data to another company, 532 00:30:12,080 --> 00:30:15,000 Speaker 2: which doesn't bother to check the laws, just made the spam 533 00:30:15,080 --> 00:30:18,320 Speaker 2: call because they're going to make some money from me today. 534 00:30:18,520 --> 00:30:23,040 Speaker 2: And that's where the rotten heart of it lies. In 535 00:30:23,120 --> 00:30:28,080 Speaker 2: the end, the problem we're trying to solve here is 536 00:30:28,120 --> 00:30:32,520 Speaker 2: that a phone company makes money for carrying calls, for 537 00:30:32,600 --> 00:30:36,320 Speaker 2: transmitting messages, for doing the communications. They don't make money 538 00:30:36,320 --> 00:30:40,280 Speaker 2: for blocking things. If all those calls got blocked, if 539 00:30:40,280 --> 00:30:43,200 Speaker 2: all those messages got blocked, if all those bad actors 540 00:30:43,200 --> 00:30:47,760 Speaker 2: were taken out of the ecosystem, revenues would go down. Why 541 00:30:47,800 --> 00:30:51,280 Speaker 2: would you go out of your way to find crime 542 00:30:52,320 --> 00:30:55,440 Speaker 2: if your job is to increase the amount of traffic 543 00:30:55,520 --> 00:30:58,200 Speaker 2: on your network, if your job is to increase the 544 00:30:58,240 --> 00:31:02,520 Speaker 2: revenues of that network? That's the conflict of interest, that's 545 00:31:02,600 --> 00:31:08,160 Speaker 2: the fundamental issue here. They may not be actively choosing 546 00:31:08,360 --> 00:31:12,040 Speaker 2: to enable crime, but they also don't have any reason 547 00:31:12,120 --> 00:31:15,239 Speaker 2: to work harder to prevent crime. There's a lot of 548 00:31:15,280 --> 00:31:21,120 Speaker 2: money being made off the back of scams.
Ultimately, the 549 00:31:21,160 --> 00:31:24,400 Speaker 2: result is corrosive for society. You don't trust the calls 550 00:31:24,480 --> 00:31:27,440 Speaker 2: coming into your phone, you don't trust the messages. I 551 00:31:27,480 --> 00:31:32,120 Speaker 2: don't answer phone calls anymore. I use other modes of 552 00:31:32,200 --> 00:31:36,560 Speaker 2: communication now. And that's sad, that when somebody maybe really 553 00:31:36,560 --> 00:31:38,240 Speaker 2: does want to speak to me on the phone, 554 00:31:38,320 --> 00:31:42,280 Speaker 2: I'm rejecting them. But it's also bad for society. The 555 00:31:42,360 --> 00:31:45,320 Speaker 2: phone network was a beautiful thing. It was a thing 556 00:31:45,360 --> 00:31:48,120 Speaker 2: where you could pick up a phone and with ninety 557 00:31:48,200 --> 00:31:52,720 Speaker 2: nine point nine nine percent probability you could dial a 558 00:31:52,840 --> 00:31:55,720 Speaker 2: number to anywhere in the world and it would connect. 559 00:31:56,280 --> 00:31:58,680 Speaker 2: And now people don't want to pick up the phone 560 00:31:58,680 --> 00:32:01,520 Speaker 2: at the other end. What does that say about how 561 00:32:01,520 --> 00:32:03,920 Speaker 2: we've gone backwards in terms of how we communicate with 562 00:32:03,960 --> 00:32:04,400 Speaker 2: each other? 563 00:32:04,720 --> 00:32:07,320 Speaker 1: I mean, you're not wrong. I'm thinking of, actually, there's 564 00:32:07,400 --> 00:32:09,560 Speaker 1: a text thread that I got added to that I 565 00:32:09,680 --> 00:32:14,080 Speaker 1: just assumed was spam, and people are making plans for 566 00:32:14,160 --> 00:32:17,560 Speaker 1: having dinner here in LA, and I'm thinking, man, they're 567 00:32:17,560 --> 00:32:22,080 Speaker 1: getting sophisticated. Come to find out, these were actually people. 568 00:32:22,840 --> 00:32:24,920 Speaker 1: A friend had added me into a group.
I just 569 00:32:25,000 --> 00:32:27,880 Speaker 1: wasn't familiar with everybody. After the event, they sent pictures 570 00:32:27,880 --> 00:32:30,520 Speaker 1: in the thread. It looked like a good time. I 571 00:32:30,560 --> 00:32:33,200 Speaker 1: didn't respond once because I thought it was a scam. 572 00:32:33,360 --> 00:32:35,960 Speaker 1: Now I've got like ten people I've got to apologize 573 00:32:36,000 --> 00:32:38,920 Speaker 1: to, because they think I'm ignoring them, because I'm scared 574 00:32:38,920 --> 00:32:39,640 Speaker 1: to answer texts. 575 00:32:39,720 --> 00:32:42,720 Speaker 2: And here's the thing, right, it's about what becomes 576 00:32:42,760 --> 00:32:46,520 Speaker 2: the normal. There was a time when, with your phone, you 577 00:32:46,560 --> 00:32:50,440 Speaker 2: could trust that people were basically only people, you know, 578 00:32:50,520 --> 00:32:53,440 Speaker 2: people who want to communicate with you. Someone with a genuine reason 579 00:32:53,520 --> 00:32:55,360 Speaker 2: to call you would call you. And that was your 580 00:32:55,400 --> 00:32:58,920 Speaker 2: attitude towards your phone. Right, what's your attitude towards email? 581 00:32:59,000 --> 00:33:01,760 Speaker 2: It's not the same, is it? No, your email is full. 582 00:33:01,840 --> 00:33:04,440 Speaker 2: You've got a junk folder. There's lots of crap in 583 00:33:04,480 --> 00:33:07,680 Speaker 2: your junk folder. Maybe sometimes a good message gets into 584 00:33:07,720 --> 00:33:10,000 Speaker 2: your junk folder. Maybe you send a message and you're 585 00:33:10,000 --> 00:33:12,120 Speaker 2: annoyed it didn't get through to someone else because it 586 00:33:12,240 --> 00:33:13,520 Speaker 2: ended up in their junk folder. 587 00:33:13,680 --> 00:33:14,480 Speaker 1: Mm hm.
588 00:33:14,520 --> 00:33:18,280 Speaker 2: We are transitioning from a scenario where, essentially, your phone 589 00:33:18,560 --> 00:33:23,080 Speaker 2: voice communications, message communications, you generally would trust them, to 590 00:33:23,200 --> 00:33:26,800 Speaker 2: a scenario like email. It's gonna be like that in future. 591 00:33:26,920 --> 00:33:29,240 Speaker 2: And yeah, we're gonna use technology, and yeah, we're gonna 592 00:33:29,280 --> 00:33:30,920 Speaker 2: weed it out and we're gonna get rid of some of 593 00:33:31,000 --> 00:33:33,560 Speaker 2: the bad stuff. But is that the level that we 594 00:33:33,640 --> 00:33:36,520 Speaker 2: now want to be communicating at, that level of trust 595 00:33:36,560 --> 00:33:38,200 Speaker 2: and confidence? That's a step back. 596 00:33:38,480 --> 00:33:39,680 Speaker 1: Yeah. 597 00:33:39,440 --> 00:33:42,120 Speaker 2: And that's sadly the way we're going about solving the problem, because a 598 00:33:42,200 --> 00:33:43,560 Speaker 2: lot of it is that we're trying to solve the 599 00:33:43,560 --> 00:33:46,240 Speaker 2: problem in the way that the problem was solved for email. 600 00:33:46,280 --> 00:33:49,200 Speaker 2: Email is not something where, if I am in an 601 00:33:49,200 --> 00:33:51,880 Speaker 2: emergency and I want to speak to you, I send 602 00:33:51,720 --> 00:33:52,400 Speaker 2: you an email. 603 00:33:52,760 --> 00:33:54,480 Speaker 2: When I'm in an emergency and I want to speak 604 00:33:54,480 --> 00:33:55,520 Speaker 2: to you, I want to call you. 605 00:33:55,800 --> 00:33:57,400 Speaker 1: Yeah. 606 00:33:57,200 --> 00:34:00,840 Speaker 2: So to start turning that kind of communication into the level of email, we're 607 00:34:00,880 --> 00:34:03,720 Speaker 2: losing something in society. That makes me shocked. 608 00:34:04,080 --> 00:34:09,440 Speaker 1: It makes me shocked, you know.
A little after we 609 00:34:09,480 --> 00:34:11,960 Speaker 1: recorded this conversation, a friend of mine sent me a 610 00:34:12,000 --> 00:34:14,040 Speaker 1: song that I had not heard in a long time. 611 00:34:14,480 --> 00:34:17,439 Speaker 1: It's this really simple and beautiful song by this guy 612 00:34:17,520 --> 00:34:20,600 Speaker 1: named Labi Siffre, and apparently it's been going viral recently. 613 00:34:21,120 --> 00:34:23,400 Speaker 1: I can't play it here, you know, copyright, all that 614 00:34:23,440 --> 00:34:25,319 Speaker 1: sort of thing, but I can read you part of 615 00:34:25,320 --> 00:34:29,319 Speaker 1: the lyrics: it's strange how a phone call can change 616 00:34:29,320 --> 00:34:31,960 Speaker 1: your day and take you away from the feeling of 617 00:34:32,040 --> 00:34:36,279 Speaker 1: being alone. Bless the Telephone, and that's the name of 618 00:34:36,320 --> 00:34:39,359 Speaker 1: the song, Bless the Telephone, which is something I think 619 00:34:39,400 --> 00:34:42,640 Speaker 1: you could still say sincerely in nineteen seventy one, when 620 00:34:42,640 --> 00:34:45,200 Speaker 1: it was recorded. And then I think to twenty years later, 621 00:34:45,239 --> 00:34:47,920 Speaker 1: to nineteen ninety one, when A Tribe Called Quest put out 622 00:34:47,920 --> 00:34:51,160 Speaker 1: a song called Skypager, which is seriously just a song 623 00:34:51,160 --> 00:34:52,880 Speaker 1: about how cool it is to have a device that 624 00:34:52,960 --> 00:34:57,000 Speaker 1: lets people contact you when you're outside. At some point, 625 00:34:57,080 --> 00:35:00,480 Speaker 1: the ability to contact someone instantly was this futuristic or 626 00:35:00,520 --> 00:35:04,239 Speaker 1: even magical thing.
I'm not that young, but I guess 627 00:35:04,280 --> 00:35:06,720 Speaker 1: I'm not old enough to really remember that time period, 628 00:35:06,960 --> 00:35:09,120 Speaker 1: because by the time I was really using my own 629 00:35:09,160 --> 00:35:12,640 Speaker 1: phone, and especially using email, it was already something that 630 00:35:12,680 --> 00:35:16,880 Speaker 1: we saw as potentially suspicious. And now? Forget about it. 631 00:35:16,960 --> 00:35:18,759 Speaker 1: When's the last time you heard someone say that they 632 00:35:18,800 --> 00:35:22,319 Speaker 1: even like their phone? Anyway, we'll leave the links to 633 00:35:22,400 --> 00:35:24,239 Speaker 1: those songs in the show notes if you want to 634 00:35:24,320 --> 00:35:30,280 Speaker 1: check that out. Thank you for listening to another episode of 635 00:35:30,480 --> 00:35:32,279 Speaker 1: kill switch. If you want to talk to us, you 636 00:35:32,280 --> 00:35:36,000 Speaker 1: can email us at killswitch at kaleidoscope dot NYC, 637 00:35:36,480 --> 00:35:39,600 Speaker 1: or we're also on Instagram at killswitchpod, and 638 00:35:39,760 --> 00:35:42,040 Speaker 1: you know, please think about leaving us a review. It 639 00:35:42,080 --> 00:35:44,719 Speaker 1: helps other people find the show, which in turn helps 640 00:35:44,800 --> 00:35:47,080 Speaker 1: us keep doing our thing. And once you've done that, 641 00:35:47,239 --> 00:35:49,640 Speaker 1: you know, we're also on YouTube, so if you want 642 00:35:49,680 --> 00:35:51,920 Speaker 1: to watch this interview rather than just listen to it, 643 00:35:52,239 --> 00:35:54,680 Speaker 1: there you go. The link for that and everything else, 644 00:35:54,719 --> 00:35:57,960 Speaker 1: again, is in the show notes. kill switch is hosted 645 00:35:57,960 --> 00:36:01,760 Speaker 1: by me, Dexter Thomas. It's produced by Sheen Ozaki, Darl 646 00:36:01,760 --> 00:36:05,000 Speaker 1: of Potts, and Julian Nutter.
Our theme song is by 647 00:36:05,080 --> 00:36:07,839 Speaker 1: me and Kyle Murdoch, and Kyle also mixed the show. 648 00:36:08,120 --> 00:36:12,480 Speaker 1: From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and 649 00:36:12,760 --> 00:36:16,680 Speaker 1: Kate Osborne. From iHeart, our executive producers are Katrina Norvell 650 00:36:16,840 --> 00:36:27,680 Speaker 1: and Nikki Ettore. Catch you on the next one. Goodbye.