Jonathan: Before that, he worked in law enforcement and intelligence agencies, including Scotland Yard and the National Criminal Intelligence Service. John McGrath is a global solution architect with IBM and works with IBMers to find ways in which the company can turn its expertise and technology toward solving real-world problems. And when it comes to real-world problems, human trafficking is a major one. When you consider the impact of the issue not just on those who are the direct victims, but also their families and communities, as well as the various companies that are profiting off the proliferation of human trafficking, it can quickly become overwhelming. That's why I was excited to speak with Neil and John, as they helped me get a better understanding of the issue and how technology is playing an intrinsic, and at times non-intuitive, part in combating human trafficking.

First, let me thank both of you for being on the show. Before we get into the topic at hand, I thought it would be nice to get to know the two of you, to learn more about your backgrounds and what brought you to your current positions. So John, could we start with you? Could you tell us a little bit about yourself, what it is you do, and how you got there?

John: Sure. My name is John McGrath. I'm a senior solution architect for IBM, based out of the Dublin Lab in Ireland. My background, Jonathan, is fourteen years working in lab services for IBM, which involves dealing with clients on a daily basis. But about two and a half or three years ago I got involved in the Traffik Analysis Hub initiative, and from that I managed to form a team called the Tech for Good team in Dublin, and that's what I do on a daily basis now; I work with the Tech for Good team.

Jonathan: Excellent. And Neil, can you tell us a bit about yourself and your position?

Neil: Surely, Jonathan. My name is Neil Giles.
I'm currently CEO of a newly formed not-for-profit called the Traffik Analysis Hub. My journey here is a tortuous one. I spent thirty-six years in law enforcement in the United Kingdom, concluding that time as deputy director of our national agency. I'm an organized crime intelligence expert, and while I was serving with our national agency, I came across a small not-for-profit called Stop The Traffik, who were specializing in preventing human trafficking. They began their work with the cocoa industry in West Africa, which was using thousands of child slaves to pick cocoa for our chocolate, and I was disappointed to learn that they knew more about trafficking than the intelligence systems in my national agency. So I began a formal relationship with them, to grow that understanding in the agency and to begin to build unusual partnerships with NGOs and other subject matter experts. And when I left law enforcement nine years ago, I began working with Stop The Traffik more routinely, realizing that we needed a richer picture of trafficking if we were going to be effective as societies and begin to make it history. We haven't done that yet, but we've begun to create that richer picture through the work that we've been doing.

Jonathan: And Neil, I think you've hit on something that I really wanted to focus on in the early part of our conversation: the fact that even in your role in intelligence, there was a lack of real knowledge about human trafficking. I think that certainly applies to the general population. I know that for myself, it's something that I am aware happens, and typically I don't really even think about it until I'm going through an airport and I see a poster that's bringing it to my attention directly; otherwise I'm kind of in the dark. Can you give us an outline of how big a problem this is? Give us the scope and the impact of human trafficking.

Neil: Human trafficking is pretty well defined as a global phenomenon now.
The academic estimates, which are reasonable, suggest that something like fourteen million people globally are in circumstances that we would be comfortable to describe as trafficking and exploitation. That's an enormous number of people. Even in the UK, the best estimates suggest that something like a hundred and thirty-five thousand people are in circumstances of exploitation having been trafficked. So you could fill the biggest sports stadium that we've got twice over with those people. And I think the best way of describing it to people is that it's an errant economy in its own right. Trafficking and exploitation splits into two chunks: an estimated thirty-five percent of those people in exploitation tend to be in some aspect of commercial sexual exploitation, and the rest are in labor markets, particularly those labor markets that rely on seasonal workers and contract workers. So agriculture, food processing and manufacturing, construction, big sea-fishing fleets, and logistics are very popular destinations for trafficked labor, where criminal recruitment gangs infiltrate them into the workforce. Most people's journeys into exploitation begin as journeys of hope. They're tricked into taking a journey on the basis that there's a great new future for them and their family, and then when they get to that destination, it turns to dust and becomes a creeping debt bondage situation. And it's worth something like three quarters of a trillion dollars a year, we estimate. There's a new official estimate out this year, or sorry, early next year, that will probably define it slightly differently, but that's our best guess. I hope that gives you a sense of how the thing operates. It needs to recruit a large share of its workforce newly every year, somewhere up to eight million people a year, as a recruitment requirement. It's about money, and most of that money goes through financial institutions. And it's about creating a market, creating demand, and maintaining demand.
And it can't be solved just by the justice process, and it can't be solved just by humanitarian activity, rescuing and rehabilitating.

Jonathan: Neil, that also brings me to a follow-up question. Traditionally, what measures do various agencies and governments take in an effort to prevent human trafficking? You had mentioned that this is beyond the scope of any one organization, but what are the sorts of efforts that have been put forward so far?

Neil: We need traffickers to have a real sense of risk if they do this, that they are likely to be discovered and held to account, and therefore there is a significant role for investigators, for the justice process. But more broadly, we need to think about the problem in an economic sense, and that's the aspect that I think has taken too long to develop. You know, in lots of parts of the world the justice process doesn't work well, and of course trafficking is a global issue. In the more developed societies, the justice process does hold people to account, perhaps not in the numbers that we might like, but it's a sanction that people fear, and therefore it's a very worthy element of the program. And encouraging other parts of the world where that doesn't work so well to get better at it is really important. But we have over-relied, in my view, on that outcome as the resolution to the problem. And of course, while there's money to be made in good quantity and not enough fear of sanction, traffickers will still flourish and demand will still maintain or grow.

Jonathan: Right. So without addressing those root causes, what we're looking at really is dealing with the consequences, and that's just going to be a consistent issue. Obviously, this is an enormous problem that is going to require a lot of work across the globe in order to really tamp down on it. John, I'm curious about how you come into the picture.
We're about to start talking about using technology as a way to detect, and then take measures to prevent, things like human trafficking. How did you get involved with this particular challenge?

John: Okay. So I think I mentioned earlier, Jonathan, that I was working as a services person. I was based in the Middle East, working with some government agencies on behalf of IBM Security, and in my role I had a give-back opportunity: I was invited by IBM Corporate Social Responsibility to come to London to help facilitate a workshop for Stop The Traffik. That was the first exposure I really had to the issue of human trafficking beyond what the casual layperson knows about it. But the thing that was interesting for me when I walked into the room to host the workshop was that the attendees weren't just the people I expected. I expected to see non-government organizations and not-for-profits there, and I expected to see law enforcement agencies and some government agencies. What I didn't expect to see were financial institutions, and there were a lot of financial institutions present. And it was really during that workshop that I came to the realization that this was a cross-sectoral issue and the solution had to come from multisectoral collaboration. So that was really the starting point for me, and from that I worked with Neil and the Stop The Traffik team to learn more about the issue. I spent many evenings and weekends in the hotel in the Middle East building prototypes and sampling what could be done using various technologies, all based on this principle of how do we get to data-sharing collaboration around this issue.

Jonathan: Can you talk a little bit more about those technologies? What form did they take? What was it that you were thinking? What metrics are you looking at, and how are you analyzing them?

John: Sure.
The starting point in the first workshop was that there was kind of a division in the room, depending on the agencies and the sort of core mission of each organization, but two primary requirements came out of it. The first was for this ability to do a global-level analysis of the problem: to see where the areas of intensity were for particular types of trafficking, and to be able to see how this is influenced not just by geography but by time. And there was also a requirement to be able to see the routes being used by the traffickers to move their victims from point A to point B. So that was kind of one half of the room; they were looking for this macro-level view that would give them the global picture and, if you like, validate some of the high-level estimated figures that Neil was talking about earlier. And then the other half of the room were more interested in: okay, now that I know where the issue is, how do I pull that into a secure environment where I can start to investigate it and start to understand the network in more detail? Who are the people involved? How are they moving people? What tools are they using? You know, what addresses, account numbers, all that kind of stuff. So we had this kind of double requirement, and we started to look at what kinds of technologies we had used in the past that could help to satisfy both of these requirements.
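The episode doesn't describe how that secure investigation environment is built, but the second requirement John outlines, linking people, locations, and account numbers into one investigable network, is classically modeled as an entity graph. Here is a minimal sketch of that idea using the open-source networkx library; the reports and entity names below are entirely hypothetical, not Hub data:

```python
# Hypothetical sketch: modeling investigation leads as an entity graph.
# This does not reflect the Traffik Analysis Hub's actual implementation.
import networkx as nx

G = nx.Graph()

# Each record links entities seen together in one report (all invented).
reports = [
    {"person": "suspect_A", "account": "acct_1", "location": "city_X"},
    {"person": "suspect_B", "account": "acct_1", "location": "city_Y"},
    {"person": "suspect_C", "account": "acct_2", "location": "city_Z"},
]

for r in reports:
    # Connect every pair of entities mentioned in the same report.
    entities = list(r.values())
    for i in range(len(entities)):
        for j in range(i + 1, len(entities)):
            G.add_edge(entities[i], entities[j])

# Connected components suggest which leads belong to the same network:
# the shared account acct_1 ties the first two reports together, while
# the third report stays a separate cluster.
for component in nx.connected_components(G):
    print(sorted(component))

# Shared intermediaries (e.g., one account used by several people)
# surface as high-degree nodes worth an analyst's attention.
hubs = [n for n, d in G.degree() if d >= 3]
print("possible hubs:", hubs)  # ['acct_1']
```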
Jonathan: While you were developing this in the early days, what were some of the lessons you learned? Were there pathways you were taking early on that turned out to be less fruitful than you hoped, or things that you discovered that surprised you while you were developing this early approach?

John: Sure. Well, one of the first things that hit us wasn't necessarily a surprise, Jonathan, but the extent of how it impacted us kind of was: the whole data privacy issue and the challenges around sharing data across jurisdictions. This became a reasonably high priority in our requirements, if you like, when we were trying to design the system. A lot of the basis of what we were trying to do was to capture data from all over the world and make it available to partners from all over the world. But we had to be very careful that we took out any sensitive information, any unique identifiers, and then we had to run the proposals through, you know, various legal people to give us advice on whether or not we were following the right path. So not so much a technical issue, although there are technologies that can help with this; it was more of a requirements issue. And then we started to look at things like the fact that the largest amount of the data is contained in the narratives, the stories of the victims, and to work with that we turned to natural language understanding and machine learning. Then we hit the challenges that everybody hits in this domain: making sure it's accurate, making sure it's unbiased, but also dealing with multilingual issues, because a lot of the data is not necessarily in the primary languages. So that was another one of the big challenges that we had to think about.
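The actual pipeline isn't specified in the episode, but the two steps John describes, extracting structure from free-text victim narratives and stripping out names and unique identifiers before sharing, correspond to standard named-entity recognition. A minimal illustrative sketch using the open-source spaCy library and its small English model; the narrative is invented, and a production system would also need the multilingual, accuracy, and bias work John mentions:

```python
# Hypothetical sketch using spaCy NER; not the Hub's actual pipeline.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

narrative = (
    "Maria was promised hotel work in London but was moved "
    "through Madrid and forced to repay a growing debt."
)

doc = nlp(narrative)

# Extract place references to feed a map-based view...
locations = [ent.text for ent in doc.ents if ent.label_ == "GPE"]
print("locations:", locations)  # e.g. ['London', 'Madrid']

# ...and redact person names before the record leaves its jurisdiction.
redacted = narrative
for ent in reversed(doc.ents):  # edit from the end so offsets stay valid
    if ent.label_ == "PERSON":
        redacted = redacted[: ent.start_char] + "[REDACTED]" + redacted[ent.end_char :]
print(redacted)
```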
Jonathan: Yes, this is an enormous challenge, just in machine learning in general: natural language processing, and being able to parse what someone means when they say something a particular way. And I imagine when you are trying to handle or analyze an enormous amount of data, that problem becomes magnified enormously. Was it a particular set of efforts that then led into the Traffik Analysis Hub, or did that come about in a different way?

John: Yeah, the Traffik Analysis Hub came out of a kind of vision that Stop The Traffik had had for a while; it became part of that workshop back in London. It was that macro-level view that everybody could share and everybody got value from, and that became the primary target for the initial prototypes. So when we were looking at that, we were trying to get a geospatial view, you know, a map-based analysis of the data. We were trying to figure out how to capture data, and then we realized that every different source we accessed classified its data uniquely, and that made it very difficult to do comparative analysis across these things. So then we hit the challenge of how do we make it consistent so that it makes sense to everybody. And then we hit challenges with things like locations. In the narratives of the stories there are lots of references to location. We needed to understand not just where a location was referenced, but the context in which it was being referenced. And then when we knew that, we had to go and find coordinates for it to put it on the map. But we had to be careful that we were getting the correct coordinates for the correct location, because, for instance, I think there are seventeen different Londons around the world, so we had to be clear about which London was actually being referenced in the text. So that was really the progression of the prototypes.
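That "seventeen Londons" problem is known as toponym disambiguation: one place name maps to many candidate coordinates, and surrounding context has to break the tie. Here is a toy sketch of one common heuristic, preferring the candidate whose country matches other places mentioned in the same narrative; the tiny hardcoded gazetteer is a hypothetical stand-in for a real one:

```python
# Hypothetical sketch of context-based toponym disambiguation;
# a real system would use a full gazetteer and richer features.
from collections import Counter

# Tiny stand-in gazetteer: name -> list of (country, lat, lon) candidates.
GAZETTEER = {
    "London": [("GB", 51.51, -0.13), ("CA", 42.98, -81.25), ("US", 37.13, -84.08)],
    "Manchester": [("GB", 53.48, -2.24), ("US", 42.99, -71.46)],
}

def resolve(name, context_places):
    """Pick the candidate whose country is most common among the
    other places mentioned alongside it in the same narrative."""
    candidates = GAZETTEER[name]
    context_countries = Counter(
        GAZETTEER[p][0][0]  # naive: take each context place's top candidate
        for p in context_places
        if p in GAZETTEER
    )
    # Score each candidate by how often its country appears in context.
    return max(candidates, key=lambda c: context_countries.get(c[0], 0))

# A narrative also mentioning Manchester makes the British London most likely.
print(resolve("London", context_places=["Manchester"]))  # ('GB', 51.51, -0.13)
```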
Jonathan: Yeah, I think that for a lot of people, myself included, we can sometimes fall into a trap of thinking about these very sophisticated systems pulling data as if it's magically all in a centralized, uniform database. And I think the magical thing for a lot of folks who look into this is that we see how these systems are able to spot patterns and trends in data sets that are so enormous that to us there's no signal, it's just noise. So seeing something that can pick out the signal does seem a little magical. Well, as the TA Hub is evolving and taking shape, have we already seen some impact in the real world? Is it being used right now to help identify and prevent trafficking today?

John: We have over a hundred organizations who are members of the Hub at this point, and all of them have their own core missions, their own goals for what they're trying to achieve with it. But we have anecdotal stories from various parties of where they've gotten value from the data that's in the Hub. And sometimes the value, interestingly, is not just in the data; it's in the collaboration with their peer organizations and the other partners in the Hub, which was part of what we set out to do in the first place: to achieve this kind of safe, collaborative environment where people could share their expertise as well as their knowledge for the purpose of disrupting human trafficking. But we have got a lot of feedback from various partners where they've been able to validate data that they had seen in their internal systems when they were starting to investigate issues; they're able to validate some of that in the Hub by looking at the data we've been collecting. And then, conversely, we've had the same feedback from organizations who are investigators, who say: we were able to identify new areas of investigation in the Hub that we weren't aware of, because we'd never looked there before, but once we started to look, we started to see patterns in our own data sets in those locations. There are facilities for different audiences in the Hub, too. So you've got people like researchers and academia who come in, and there's a facility in the Hub which allows them to navigate by concept through large news data sets; that's a facility they give us feedback on a lot, telling us it helps them to find information and to support their research. And every month we have an analyst call in the community, where the community and the Hub come together.
They look at the functionalities that we're building and the data sets that we're gathering, and they give us direction on what they need, and we feature a participant on that call every month. So we have had a person who actually presented their thesis, and part of that thesis was based on data they pulled in from the Hub to validate their own insights into human trafficking.

Jonathan: That's phenomenal. So not just building a system that's doing this very technical work, but also building relationships, forming relationships across various sectors and various countries that can all be, you know, directed toward helping stop this problem. What other ways do you see the Traffik Analysis Hub impacting various industries?

John: Well, first off, we've built a platform underpinning the Traffik Analysis Hub which allows us to reuse the capabilities across different issues. So we've also used it for things like food redistribution to avoid food waste, and we've used it in the area of migration and population displacement, trying to create prediction models and so on. The thing that kind of excites me about this is that we're starting to bring in new sectors, and not just industry sectors, but sectors within the NGO world who are focused on different parts of social issues, and we're bringing them together into one platform and one community to start to share information. So we've been approached by organizations who are focused on animal trafficking, to see if they can get access to the Hub and start to share their data in there as well. And we're also starting to see the reusability of some of the things that we've built. For instance, we've built a causality model in partnership with IBM Research, where we were looking at the attributes that are most prevalent in causing things like population displacements, and these models are things that we can then reapply from one use case to another.
So we're trying now to move that model into human trafficking, to see if we can determine, for instance, the likely outcome analysis for interventions in certain locations.

Jonathan: To me, that's also inspiring, because in that process you could be working on issues that are tangentially tied into trafficking, you know, some of those underlying root causes we were talking about, and being able to solve some of those social issues can also help remove some of those causes, or at least diminish them somewhat, and thus have this sort of positive feedback loop of being able to solve these traditionally, incredibly difficult problems, largely because it is hard for us to even get a grasp on all the data that plays into this. I sometimes liken this to the challenge of making a long-term forecast for the weather. There are just so many variables out there, and they interact with each other in ways that we don't fully understand, so it can be difficult to make, you know, even a forecast that's ten days out. On a similar front, we see this real-world unfolding of trying to tackle these enormous social problems that also have all these different variables, many of which are at their heart human issues, and humans are largely unpredictable creatures. So it's fascinating to see these systems that are starting to glean insights into the way these large systems of people work, how they actually perform out in the real world, and being able to draw conclusions about that, predictions, and perhaps solutions. What would you say are some of the lessons you have learned in this, both in seeing how the TA Hub and the related technologies have given insight into the human trafficking problem, and also lessons you've learned as leaders in that space?
John: Sure. Well, certainly from my side, one of the big lessons I've learned is how super motivated the IBM staff are to get involved in initiatives like this. I was talking to somebody earlier today and I was saying, I could spend fifty percent of my time talking to volunteers within IBM who want to help, and they're all bringing individual skills and capabilities and experience and offering to help us out with various pieces of the puzzle. So there's a huge potential here to apply technology to some of these challenges. The other thing that's very interesting at the moment is that a lot of these core, major social issues, whether it's the pandemic, whether it's climate change, whether it's population displacement, whether it's trafficking, are all intertwined, and one is influencing the other. And the attributes that influence the prevalence of these events in different parts of the world are very often common attributes. So we're trying to figure out: can we build models that will help us identify which attributes are the interesting ones? And trying to lead a team through this, keeping them focused on the stuff that we have to deliver, but also giving them the freedom and the ability to go and explore these new opportunities and new ideas, that's been a core learning for me.

Neil: Yeah. From my side, Jonathan, I think the first thing I discovered was that whilst we are absolutely data rich, we are terribly knowledge poor, and the work that we've been doing together with IBM and the Tech for Good team has, I think, begun to change that picture. So the next key element in that chain of activity needs to be to ensure the widest possible appropriate audience can access that knowledge, because no one's got enough resources to do everything at once. It's the classic problem.
You can only focus on so many things, so you need to use that knowledge, like I would have used intelligence in an investigative way in law enforcement, to focus the resources that you've got at the hot spots and the points where you can make a difference. That's how we get this thing on the run. And we need to start undermining the economic pillars that currently, comfortably support trafficking in persons and exploitation. And I think that we've made a decent start.

Jonathan: And I like, Neil, how you brought that around to this challenge of being data rich and knowledge poor. To me, we're seeing that pivot now, where the early days of big data seemed to put the emphasis on look at how much data we have access to, and now we're moving into a new era, we're well into a new era really, where it's how do we actually leverage this enormous fire hose of information. It's coming in from all directions, generated by more devices than ever before in the history of humanity. And we're actually starting to see systems like the TA Hub, systems that are able to take that information and do something that's truly useful and impactful. How do you see the approach to trafficking changing over the course of the future? What do you see as the evolution of addressing human trafficking?

Neil: I think the big gains are in commerce and industry. I think the ability for corporates to begin to understand where they need to focus their activities and what questions they need to ask of their suppliers, particularly in difficult parts of the world. And similarly for financial institutions; again, it helps them, because every errant business has a banker and a banking facility, and the clues are there,
if the customer management process knew what those clues were and knew what questions to ask. And our view is that the more we grow access to the data that we have for businesses and financial institutions, the greater influence they'll have on reducing the opportunity for trafficking to flourish.

Jonathan: Before I sign off on this episode, I just want to reiterate some of the things we covered, and that is that these are non-trivial problems: both the real-world problem of human trafficking, which is clearly non-trivial, it is critical, and the actual computing problems that the teams are trying to solve in order to really take full advantage of artificial intelligence and machine learning and apply them to this incredibly difficult issue. Everything from natural language processing to pulling in information from various sources and contextualizing it in a way that's useful. These are hard problems to solve, but as we've seen, it is worth it in the effort to stop human trafficking. I want to thank John and Neil again for joining the episode. It was an honor to talk with them about such an important issue. I hope that you learned something in this episode, and I look forward to sharing more Smart Talks episodes with you in the near future. Take care.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.