Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you? Welcome to Stuff You Should Know from HowStuffWorks.com. Hey, and welcome to the podcast. I'm Josh Clark. With me, as always, is Charles W. Bryant. How are you, Chuck? I'm here, Josh. Oh wait, this isn't the Halloween episode. No, no, that already came out. That was good, though. I appreciate that. Yeah, that was really good. Well, truthfully, it's almost Halloween, so I'm just in the spirit. Are you? Yeah, in the spirit. Yes, dude, I have been punning lately left and right, and it just makes me sick. You have the stomach punkin, though? Good one, Chuck. That reminds me that we've been talking about this Punkin Chunkin thing pretty hard. Let's do it again. Yeah, one more time, we'll say it again. Okay. So the mothership, Discovery Channel, has asked us to mention a show that's coming out. A special, actually a pair of specials.
It's coming out on Thanksgiving night on the Science Channel at eight p.m. There's two shows, like you said: The Road to Punkin Chunkin, and Punkin Chunkin itself, and again, that begins at eight p.m. Eastern Time, because I believe it takes place in a town in the East, which is appropriate. On Science Channel. Yeah. So, Chuck. Yes? You want to podcast? Yeah, let's do it. Okay, so, Chuck, have you ever seen Minority Report? I have, sure. Steven Spielberg and the... Yeah, yeah. I knew Tom Cruise was in it; didn't know Spielberg directed. Yeah. I thought it was okay; it kind of lost me in the third act. Didn't it? Yeah, big time. You know it's based on Philip K. Dick. A novel, I think. Short story, wasn't it? Yeah, it was called "The Minority Report." Oh yeah. That's Hollywood for you; they're always changing things. Yeah.
But okay, so you know that it's about a, I guess, a crime section, an anti-crime section of law enforcement, some people call it, that arrests people based on information given to them by this group of people who have precognition. And that would be the Office of Precrime, appropriately. And Cruise was a precrime officer, right. We have a real dearth of people with genuine precognition. It's kind of tough to find three that, you know, you can really reliably count on, who can serve you images from their brain of crimes that are about to happen. Yeah, they had it bad off in that movie, too. They really did. We are, however, it seems, working on a database that will be able to predict crime, and if so, humanity is screwed. Personal rights? Is that what you're getting at? Yeah, there's a lot of problems with this. But yeah, so there's a database... there's several databases already around, right. There's all kinds of databases, sir. When I was a kid, my dad took me to get fingerprinted, just in case I was abducted.
And I wouldn't talk to him on the way home. I was all, like, you ratted me out, Dad. Just in case you got arrested, your dad would be ready for that. And so here you go: no, no, ready for him, you know? Like it was put into this database. So, yeah, ostensibly, if I ever was kidnapped and, you know, my brain was washed and I lost my identity, they'd be able to fingerprint me if I ever wandered up onto the street, and they'd be like, oh, it's Josh Clark. The chances of that are slim to none. The chances of me committing a crime, we're talking, like, near a hundred percent. See, I was a kid during the Atlanta child murders, the famous Wayne Williams murders, remember that? So my mom was like, go, you know, go play down by the creek. Don't worry about fingerprinting. Yeah, go get the mail down by the street. Get the neighbor's mail. Right, get all the neighbors' mail. Jeez. It's like, do you know where your children are? My mom never knew where I was. Yeah, well, I survived. Well, yes, clearly you did. But so fingerprinting is just one database, right? Yeah, that's one.
There's another one that's a little more advanced, a little more sophisticated, that is called the National Crime Information Center. Right. You ever watch the movie, or the show, The First 48? No. Dude, it is good. I sleep with a hammer next to my bed now because of that show. What's the concept there? Within the first two days is when all the evidence is, like, hot. And yeah, if they don't close a homicide within the first forty-eight hours, the chances of them ever closing it drop dramatically. Yeah. So sometimes they do, sometimes they don't. But it's a real-life show, and it follows, like, real-life cops on the beat, like, after a homicide, and the stuff that people will do to each other is just chilling. My favorite is this bald guy who works in Memphis. He's awesome. Squibby? It's not Squibby. He would bust Squibby, though, I can tell you that. But they often access, on the show, the National Crime Information Center. So it's got, like, all this information on people who have committed crimes.
But it's more than just their fingerprints. It could be, like, their street name; they're always using it to look up, like, somebody's street name. It also has, you know, people who are members or suspected members of, like, gangs, organizations. Yeah. What else? Umi has this iPhone app that's disturbing. It's a locate-a-sex-offender app. Yeah, I've seen the website. Well, they have an iPhone app for it now, and, like, sure enough, there's a lot of sex offenders around our place. And the weird thing is they aggregate: like, you'll look at several of them and they'll all have the same address except, like, number six or number thirteen or whatever. So it's clearly an apartment building that's been designated, like, a sex offender can live here, because it's not by a school or anything like that. Remember that article recently, about how they told the ones in Georgia to camp out, and then quickly, once it hit the news, it was, oh, you can come back inside. Yeah, we'll find a place for you. Yeah. So okay, so we've got those two databases.
There's another one, and here's where we start to reach the crux of this podcast, finally, after, like, seventy minutes. The one in England and here. Oh, well, you're talking about the National DNA Database. Yes. That started in England, and initially it was just people who had been convicted of crimes; they would get their DNA and keep that on file. Yeah, but that changed, didn't it? It did change. I think in two thousand three they expanded it to include anyone who's ever been arrested. So basically, a cop can arrest you and let you go on the spot, but if the cop can collect a sample right there, if he's got a mobile sample kit, they can take your blood even if you didn't do anything. Yeah, even if they're like, oh, sorry, buddy. Yeah, exactly. And you can't say, can I get that swab back? They'll, you know, they'll break your arms. And before two thousand six, I think, that was still... most people weren't getting arrested.
But that year, Britain expanded the list of arrestable offenses, including wearing a seatbelt, or, more to the point, not wearing it. You're under arrest for wearing a seatbelt. Right. So, I mean, that's... yeah. And I imagine that's how they do it too, right, as a swab? I don't know about the mobile kits. Probably. I can imagine somebody going, like, you're not swabbing me. Yeah. So, but yes, now in England, if you jaywalk, or if you are not wearing a seatbelt, they can collect a sample of your DNA that they intend to keep on file indefinitely, and it's four million strong. Yeah, which makes it the second-largest DNA database in the world. Second to ours, I would imagine. Of course. That was in two thousand seven, by the way, the four million entries. That same year in the US, we had the National DNA Index System, NDIS, that's maintained by the FBI, and we had four point five million profiles that year. I looked, and all I could find were estimates, like FBI funding estimates.
So these numbers are probably high, but for two thousand nine they estimated that this thing would have fourteen million samples. Now, ours isn't just if you're arrested, right? Isn't it if you're a felon? Or is it by state? That's how it started out. And then in two thousand and four, California, always on the leading edge of whatever is going on, passed Prop 69. Controversial is... to say the least. Yeah. Basically, what it says is that law enforcement can take your DNA if you've been arrested for a felony, and some misdemeanors. And that's just arrested. Yeah, and illegal immigrants, which they kind of just tossed in there. I'm sure, I'm sure. And, let's see, how can we fairly target illegal immigrants? Oh yeah, we'll just take their DNA for no reason. Well, there's kids in there too. Yeah, that's causing a huge stir, is kids are being... when they're arrested, they can have their DNA taken. I'm sure.
So you can imagine that just having your DNA taken, just having a swab stuck in your mouth by a police officer, is enough to really raise the ire of some people. Yeah. Obviously a lot of human rights advocates have problems with this, which we'll get into in a second. Unless we're getting into that now. We can, if you want. I mean, yeah. Well, the first thing that kind of got people, besides the implementation of the program in England, was last year, in two thousand and eight, when it was revealed that half a million names in the database are just flat-out wrong, and that caused a big stir. Yeah, either just incorrect or misspelled; it might have been typos, but some of them are just wrong. Yeah, that's clearly... I mean, just having DNA samples of four million people in England and then saying, oh, and by the way, half a million of them are wrong. We don't know whose they are. We think they're yours, but they're not. That's a problem. But, I mean, is there really a problem with just maintaining a database of DNA? What are they doing with it?
Well, it depends, because DNA is not like a fingerprint. There's a lot of information contained in your DNA that doesn't just identify the person, right: your genetic code, your family history. There's a program called DNA Witness, made by a company called DNAPrint Genomics, and it can locate ancestry markers and basically say, oh, you found some DNA; we can narrow it down to this person probably being Hispanic. Well, yeah, it all has to do with racial breakdown, right. So, I mean, racial profiling is about as hot-button an issue as anything else, you know, because the problem is, as it stands now, racial profiling is based on past statistics. If you include DNA in the mix, does it become more finely honed, or even more egregious, right? Or does it open itself up? Who knows. Here's the problem with DNA profiling, Chuck.
We have not, in this country or the UK, from what I can imagine, had any real discussion about doing it, right? So we've never really come together and said, okay, do we want a crime-free society, or as close to a crime-free society as we can get? If so, then yes, everybody needs to turn in a DNA sample, if we all agree that's what we want. If we decide that we would rather live with crime, and combat crime using the techniques that we have now, to maintain our privacy, then DNA sampling has to stop. DNA profiling has to stop. And the problem is we've never had that conversation either way, right? Well, the public certainly hasn't. But it's been continuing along. And then when you talk about the half a million names wrong, it's like, well, you're doing this without our consent or even asking us, and you're not even doing it right. Right. The thing is, though, I don't even know, even if they hone this down, can you really prevent crime? I mean, even cops will tell you there's no such thing as preventing crime.
Cops go after criminals after they've committed a crime. Unless it's just dumb luck, how many times has a cop come upon a crime before it happens and stopped it? Well, there are two different groups, one in the UK and one in America, which apparently are the two leading countries in DNA profiling for crime prevention, that say, no, we do need to do that, and we're trying to. Is that the Homicide Prevention Unit in London? Think about the name of that: Homicide Prevention Unit. And they're doing it by forecasting crime. Yeah, well, psychological profiling too, which they've done for a while, and that's a little less hinky and invasive than, obviously, DNA profiling. Right. But what about when you combine the two? Why would you combine DNA with a psychological profile? To catch the bad guys, I guess, to an extent. But at the same time, what we're talking about is looking at DNA to find out if we can find a genetic defect in somebody that would suggest that maybe they have a short temper, or that they're sociopathic, or whatever.
If you combine that with a psychological profile... but where does that profile come from? Maybe records from mental health workers, or maybe your insurance records, or your doctor or dental records. I mean, they're not doing that now, but who knows what could happen. That's the point. If this database gets big enough, or, I should say, if it gets accurate enough, then yeah, people will probably start getting leaned on to provide information to be contributed to this database for use by law enforcement. Yeah. Once you have enough information and you are confident enough that you can prevent a crime, or you can say this person is probably going to kill somebody, what do they do, though? That's what I want to know. Do they just start... It's obviously not gonna happen like Minority Report, where, in that film and the story, Tom Cruise knocks on your door and says, you're under arrest for the future murder of your wife. They're clearly not talking about that. That's impossible, and the stuff of science fiction.
But what, do they just monitor someone, or tail someone? So basically potentially dangerous people would just be under surveillance at all times? I guess so. But what if you've never committed a crime in your entire life and don't intend to, but you've got the cops breathing down your neck every night? Anywhere you go, there's a cop following you. You go on a date, there's a cop following you. You take your mother out for dinner, there's a cop following you. I mean, if you've never committed a crime in your life, how fair is that? Well, right. And plus, if someone's tailing you... I don't know, I could see a scenario where some renegade cop trumps up a traffic violation and pulls you over and shakes you down. And, you know, it's not like police... I mean, trust me, I'm not ragging on the police, who do a great job. But there are cases where people are framed and weapons are planted. And if some guy they think is a really bad person waiting to happen, what's to stop a cop from trailing him and doing just that?
So, not only that, but what happens if somebody gets access to this, if this information in the database is disseminated, and then you've got somebody who's like, well, you know what, I'm going to take it upon myself to rid society of these people who may commit a crime. You know, most... I can't say most, but there's a lot of serial killers out there who, once caught, say that they were doing a service to society. Ever seen that? I haven't yet, but I am aware of that same scenario. I loved Six Feet Under, and he dies, by the way. The killer John Wayne Gacy expected that he was going to get a rap on the knuckles, because what he'd done is just rid society of some bad kids. That's what he said. Bernie Goetz, I remember. And I guess, was that New York? Yeah, he went Charlie Bronson on everyone, and he was the subway vigilante. He definitely was. But he's a serial killer, is what he is. I thought he just killed some guys once. Did he kill more than one time? Yeah, I think so. Wow. I might be wrong. We'll hear about it.
But okay, so yes, there's a possibility of vigilantism. There's a possibility of police harassment. There's also another possibility called, well, self-fulfilling prophecy. Remember we talked about kids getting their DNA taken, minors getting their DNA taken if they're ever arrested for anything. And there's also a push, I guess, to round this database out as much as possible: for any kid who has a behavioral problem, or maybe gets in trouble at school, for the school to provide information about that kid, so that they can say, we're going to keep an eye on you for the rest of your life, because you're starting to fit this profile of somebody who might kill somebody later. Or maybe he just has ADD, or maybe the teacher doesn't like him. Who knows. The problem is, if you know that... if you're six and somebody's like, you might kill somebody someday, what is it like to grow up for the next thirty years or so thinking that people assume that you're gonna kill somebody one day?
Well, do they tell you, though? I don't know; there's got to be something. It might not be that explicit. The children's parents are probably notified, at the very least. Right, and if the parents say, you're a bad kid, that's why they're watching you, well, why wouldn't the kid go be a bad kid? There's a lot of concern here. Right. In the US... we talked about the Homicide Prevention Unit in the UK. By the way, the senior criminal psychologist, Laura Richards, has said that her vision is to know who the top one hundred most potentially violent people in London are at any given time. You know Squibby's on that list. Sure. And you know, when they keep, like, most-wanted lists, gangs especially love getting at the top. And I should probably take the opportunity right here to save our lives: it's MS-13, I understand, not MS-12. Right. Yeah, we referred to a gang as MS-12 in the witness protection show, and it is MS-13, and we got that wrong. Agreed.
335 00:18:42,080 --> 00:18:44,000 Speaker 1: Over here in the US there's a guy named 336 00:18:44,040 --> 00:18:49,840 Speaker 1: Richard Berk, who's a University of Pennsylvania sociologist and statistician. Yeah. 337 00:18:49,840 --> 00:18:51,880 Speaker 1: I don't mind this one as much. No, it's much, 338 00:18:52,200 --> 00:18:54,600 Speaker 1: much more innocuous. Yeah, well, it's not as invasive, because 339 00:18:54,600 --> 00:18:57,680 Speaker 1: they're not actually taking your DNA or fingerprints. He, uh, 340 00:18:58,040 --> 00:19:00,160 Speaker 1: was he at the University of Pennsylvania? He's 341 00:19:00,160 --> 00:19:04,840 Speaker 1: a criminology professor, and he has actually developed an algorithm 342 00:19:04,960 --> 00:19:10,160 Speaker 1: using thirty different variables from, you know, when a kid 343 00:19:10,200 --> 00:19:11,760 Speaker 1: was young and as they grow up, if they have 344 00:19:11,800 --> 00:19:14,280 Speaker 1: offenses or if they were abused, and he determines a 345 00:19:14,359 --> 00:19:18,440 Speaker 1: lethality score, which, I don't know, it's a little 346 00:19:18,440 --> 00:19:22,200 Speaker 1: more... I can accept this in a way. 347 00:19:22,440 --> 00:19:24,320 Speaker 1: I can't, too. But at the same time, I was 348 00:19:24,359 --> 00:19:27,760 Speaker 1: a little... well, I'm still put off by the idea 349 00:19:27,760 --> 00:19:31,360 Speaker 1: of forecasting crime in an effort to prevent it from 350 00:19:31,400 --> 00:19:34,080 Speaker 1: ever happening. Um. But I did go on and check 351 00:19:34,080 --> 00:19:36,240 Speaker 1: out some of this guy's stuff and he has another 352 00:19:36,280 --> 00:19:41,040 Speaker 1: thing, um, called crime regimes, where he's taking into account... 353 00:19:41,560 --> 00:19:45,600 Speaker 1: see, that lethality score is all, uh, centered around 354 00:19:45,640 --> 00:19:48,679 Speaker 1: the individual.
And there's a movement afoot where sociology 355 00:19:48,760 --> 00:19:53,240 Speaker 1: is making a huge move to take crime completely away 356 00:19:53,240 --> 00:19:56,560 Speaker 1: from psychology. I talked to a sociologist who's like, psychology 357 00:19:56,640 --> 00:20:01,640 Speaker 1: has completely failed at explaining serial murder; it's sociology's time 358 00:20:01,680 --> 00:20:04,320 Speaker 1: to explain it. Right. So this guy's taking into account 359 00:20:04,359 --> 00:20:08,480 Speaker 1: like time of day, day of the week, area, um, 360 00:20:08,720 --> 00:20:13,000 Speaker 1: like the location, uh, is the drug trade there stable? 361 00:20:13,160 --> 00:20:15,720 Speaker 1: If so, then there's probably gonna be less crime because 362 00:20:15,800 --> 00:20:18,640 Speaker 1: there's not gonna be turf wars, things like that, um. 363 00:20:18,760 --> 00:20:21,120 Speaker 1: And even larger stuff needs to be taken into account 364 00:20:21,119 --> 00:20:24,320 Speaker 1: too, like the economic situation, that always creates more crime. 365 00:20:24,320 --> 00:20:26,119 Speaker 1: But for a group or an area and not an 366 00:20:26,280 --> 00:20:28,720 Speaker 1: individual, is that what you mean? I imagine that this guy 367 00:20:28,760 --> 00:20:31,560 Speaker 1: will probably eventually try to put both together. So if 368 00:20:31,600 --> 00:20:34,840 Speaker 1: you've got a lethality score, a guy with the lethality score, 369 00:20:34,880 --> 00:20:37,400 Speaker 1: and he's living in a high-risk area, then all 370 00:20:37,400 --> 00:20:39,480 Speaker 1: of a sudden the cops might want to go, we 371 00:20:39,520 --> 00:20:41,960 Speaker 1: should really keep an eye on that guy, you know. 372 00:20:42,119 --> 00:20:44,040 Speaker 1: So it makes it a little more honed.
The problem 373 00:20:44,160 --> 00:20:48,199 Speaker 1: is, this would probably eventually be compiled with the 374 00:20:48,320 --> 00:20:50,080 Speaker 1: NDIS here in the States. Yeah, if it 375 00:20:50,080 --> 00:20:52,040 Speaker 1: turned out to be a pretty good algorithm and it 376 00:20:52,080 --> 00:20:54,439 Speaker 1: was pretty accurate, I'm sure the government would get their 377 00:20:54,480 --> 00:20:59,160 Speaker 1: mitts on it soon enough. So yeah. And it's not like, um... 378 00:20:59,160 --> 00:21:02,640 Speaker 1: we were talking about mental health workers being 379 00:21:02,720 --> 00:21:05,920 Speaker 1: leaned on to give up information, or doctors disclosing 380 00:21:05,960 --> 00:21:11,520 Speaker 1: their medical records. Insurance companies, uh, and census information. Did 381 00:21:11,560 --> 00:21:13,480 Speaker 1: not know this. I didn't either, which is funny because 382 00:21:13,480 --> 00:21:16,320 Speaker 1: we're about to talk about population in about five minutes. Yeah, 383 00:21:16,320 --> 00:21:19,840 Speaker 1: apparently the census records, the public doesn't have access to 384 00:21:19,880 --> 00:21:23,119 Speaker 1: those for seventy-two years after it's taken. I guess, 385 00:21:23,240 --> 00:21:27,200 Speaker 1: I guess some aspects of it, because I've definitely 386 00:21:27,200 --> 00:21:30,840 Speaker 1: accessed census statistics that are a lot less than seventy- 387 00:21:30,840 --> 00:21:33,280 Speaker 1: two years old, right. I think you can access the numbers, 388 00:21:33,320 --> 00:21:36,199 Speaker 1: but I think all of the information... Yeah, yeah, but 389 00:21:36,480 --> 00:21:40,720 Speaker 1: the FBI routinely gets that information if they want it.
The 390 00:21:40,840 --> 00:21:45,520 Speaker 1: Japanese... Japanese Americans were identified using census statistics, or census 391 00:21:45,560 --> 00:21:50,040 Speaker 1: information, during World War Two for the internment camps, right. Um, 392 00:21:50,160 --> 00:21:53,720 Speaker 1: and, uh, that... that was, I guess, kind of against 393 00:21:53,720 --> 00:21:56,560 Speaker 1: the grain. Maybe. Usually we don't do that, I understand. 394 00:21:57,280 --> 00:21:59,160 Speaker 1: I think it's been done in Texas these days 395 00:21:59,160 --> 00:22:03,080 Speaker 1: with, um, Hispanics trying to make it across the border. Interesting. 396 00:22:03,920 --> 00:22:07,040 Speaker 1: So rights violations all over the place, then, right? Yes, 397 00:22:07,119 --> 00:22:10,960 Speaker 1: but apparently it's having an effect on crime. Yeah, I 398 00:22:10,960 --> 00:22:14,760 Speaker 1: guess so. Um, supporters in England will say that more 399 00:22:14,800 --> 00:22:16,840 Speaker 1: than twice as many crimes have been solved using the 400 00:22:16,920 --> 00:22:20,040 Speaker 1: DNA samples in the year two thousand five as were 401 00:22:20,040 --> 00:22:23,120 Speaker 1: solved six years before that, and so, you know, maybe 402 00:22:23,119 --> 00:22:25,480 Speaker 1: it has an effect. It does. So they had the 403 00:22:25,520 --> 00:22:28,920 Speaker 1: cameras over there too. Oh yeah. Did you see those, uh, 404 00:22:29,040 --> 00:22:32,240 Speaker 1: those thugs in Wales that got beat up? It was 405 00:22:32,280 --> 00:22:36,680 Speaker 1: a cross-dressing cage fighter. Yeah. Yeah, it's the best 406 00:22:36,680 --> 00:22:38,440 Speaker 1: thing I've ever seen. Yeah, if you haven't seen it, yeah, 407 00:22:38,440 --> 00:22:42,320 Speaker 1: I guess typing cross-dressing cage fighters, thugs, Wales, and 408 00:22:42,440 --> 00:22:45,000 Speaker 1: it should bring up the video.
These English thugs were 409 00:22:45,080 --> 00:22:47,720 Speaker 1: just drunk and walking down the street, really just causing 410 00:22:47,760 --> 00:22:50,359 Speaker 1: trouble, and they were picking on this cross-dresser, and 411 00:22:50,400 --> 00:22:52,439 Speaker 1: it turned out that it was an MMA 412 00:22:52,480 --> 00:22:55,240 Speaker 1: cage fighter and the dude just killed them, wasted them. 413 00:22:55,240 --> 00:22:59,040 Speaker 1: It was great. It's... it's inspiring. And I'll preempt the 414 00:22:59,080 --> 00:23:01,560 Speaker 1: listener mail right now: Chuck meant Welsh and he knows it. 415 00:23:03,119 --> 00:23:06,600 Speaker 1: So, um, I guess, everybody, in the future look for 416 00:23:06,640 --> 00:23:13,440 Speaker 1: a crime database that includes, uh, psychological profile, uh, hopefully 417 00:23:13,640 --> 00:23:16,320 Speaker 1: your correct name, or, unless you're a criminal, then not 418 00:23:16,560 --> 00:23:21,720 Speaker 1: your real name, retinal scans, facial scans, medical history, pretty 419 00:23:21,800 --> 00:23:24,520 Speaker 1: much anything. I got one more thing. Okay. I saw... 420 00:23:25,080 --> 00:23:27,879 Speaker 1: the United States has a project that was originally called 421 00:23:28,280 --> 00:23:34,760 Speaker 1: Project Hostile Intent. They've since renamed it Future Attribute Screening Technologies, 422 00:23:34,800 --> 00:23:37,239 Speaker 1: and it's one of these deals where they're gonna make 423 00:23:37,240 --> 00:23:40,080 Speaker 1: it a mobile unit, like a trailer truck, that you 424 00:23:40,680 --> 00:23:43,159 Speaker 1: walk through before you go into the football game and 425 00:23:43,200 --> 00:23:47,360 Speaker 1: it reads your pulse, your breathing rate, your 426 00:23:47,560 --> 00:23:51,679 Speaker 1: pupil dilation, and supposedly predicts if you're, you know, 427 00:23:51,720 --> 00:23:54,720 Speaker 1: like shifty or angry. Think about that.
Um, yeah, I 428 00:23:54,720 --> 00:23:59,640 Speaker 1: heard they're using, uh, Wii Fit boards now to make 429 00:23:59,640 --> 00:24:01,840 Speaker 1: people stand on them. They're talking about... they were going to make 430 00:24:01,880 --> 00:24:05,240 Speaker 1: people stand on them because they think that, um, terrorists are 431 00:24:06,040 --> 00:24:09,560 Speaker 1: literally shiftier than other people, so like they would be 432 00:24:09,560 --> 00:24:11,960 Speaker 1: shifting their weight more because they know something is about 433 00:24:11,960 --> 00:24:14,400 Speaker 1: to go down. Um. Things like that don't take into 434 00:24:14,440 --> 00:24:17,080 Speaker 1: account fear of flying. Well, there's a lot of 435 00:24:17,119 --> 00:24:19,200 Speaker 1: problems in stuff like that. Just what kind of mood 436 00:24:19,320 --> 00:24:20,679 Speaker 1: you're in that day, if you and your wife just had 437 00:24:20,720 --> 00:24:22,679 Speaker 1: a fight on the way to the airport, because Emily 438 00:24:22,680 --> 00:24:25,200 Speaker 1: and I have a long-standing tradition of fighting before 439 00:24:25,200 --> 00:24:27,960 Speaker 1: any plane flight. Oh yeah, oh yeah, it's... it's... it's 440 00:24:27,960 --> 00:24:31,719 Speaker 1: a good thing. Um. And then of course, once this 441 00:24:31,760 --> 00:24:34,119 Speaker 1: is in place, one of the guys called what we 442 00:24:34,280 --> 00:24:38,399 Speaker 1: would see security theater, which is acting not shifty, 443 00:24:38,640 --> 00:24:40,600 Speaker 1: or acting like, you know, you're going to Hawaii on 444 00:24:40,640 --> 00:24:44,159 Speaker 1: a vacation, so trying to trip up the machine. So 445 00:24:44,240 --> 00:24:48,040 Speaker 1: look forward to that too. Yeah, the end. The end. 446 00:24:48,240 --> 00:24:51,720 Speaker 1: So if you want to learn more about our colleague 447 00:24:51,720 --> 00:24:56,239 Speaker 1: Shanna Freeman's, um, predictions for future crime databases...
You can 448 00:24:56,280 --> 00:24:58,840 Speaker 1: probably get away with just typing in future crime in 449 00:24:58,960 --> 00:25:03,399 Speaker 1: the search bar at how stuff works dot com. Also 450 00:25:03,560 --> 00:25:06,080 Speaker 1: try typing in unicorn, see what comes up. I think 451 00:25:06,119 --> 00:25:10,159 Speaker 1: you'll be pleasantly surprised. And Chuck, I said, uh, search 452 00:25:10,240 --> 00:25:16,080 Speaker 1: bar, right? Which means it's time for listener mail. Josh, 453 00:25:16,080 --> 00:25:19,399 Speaker 1: I'm gonna call this, uh, I Had No Idea That 454 00:25:19,480 --> 00:25:23,200 Speaker 1: We Had an Official Listening Club. That's what I'm gonna 455 00:25:23,200 --> 00:25:25,520 Speaker 1: call it. This is on the blog, but I'm gonna 456 00:25:25,560 --> 00:25:27,040 Speaker 1: read this, and this kind of blew me away. 457 00:25:27,600 --> 00:25:31,600 Speaker 1: There are some U.S. Americans living in South Korea, 458 00:25:32,040 --> 00:25:34,239 Speaker 1: and they actually formed a little listening club and they 459 00:25:34,280 --> 00:25:38,960 Speaker 1: get together and they listen to our show. And they said, 460 00:25:39,040 --> 00:25:41,119 Speaker 1: during the assemblies, we listen to you to ponder various 461 00:25:41,160 --> 00:25:44,560 Speaker 1: interesting topics and then discuss them further by offering our 462 00:25:44,600 --> 00:25:48,320 Speaker 1: own thoughts and experiences, and complement our sessions with libations 463 00:25:48,320 --> 00:25:51,600 Speaker 1: of the alcoholic variety. So they formed a drinking game 464 00:25:52,160 --> 00:25:54,719 Speaker 1: to our show. Let me tell you what they drink to. 465 00:25:54,760 --> 00:25:56,800 Speaker 1: And they're listening now, so we're going to really get 466 00:25:56,840 --> 00:25:59,439 Speaker 1: them hammered at this point.
Every time there's a new 467 00:25:59,480 --> 00:26:03,240 Speaker 1: statistic quoted, which I believe we do on four out 468 00:26:03,240 --> 00:26:06,359 Speaker 1: of five shows at least. And when we refer to 469 00:26:06,440 --> 00:26:10,000 Speaker 1: our producer Jerry. Over here, there's Jerry. Hey, Jerry. So 470 00:26:10,040 --> 00:26:12,600 Speaker 1: we've said Jerry like four times, so that's probably about four shots. 471 00:26:12,840 --> 00:26:15,480 Speaker 1: You said Jerry, I said Jerry. We refer to our 472 00:26:15,520 --> 00:26:19,359 Speaker 1: producer Jerry... When someone shares any of our first names. 473 00:26:19,920 --> 00:26:21,440 Speaker 1: So if they're in the club and we say their 474 00:26:21,480 --> 00:26:24,560 Speaker 1: first name by happenstance, they drink. And I know who 475 00:26:24,600 --> 00:26:26,560 Speaker 1: wrote this, it was Richard, so we're gonna say Richard like 476 00:26:26,640 --> 00:26:32,840 Speaker 1: four more times. Richard. Yeah, and Jerry. And, uh, also, um, 477 00:26:32,840 --> 00:26:35,920 Speaker 1: when Chuckers... when you say Chuckers. I don't know that 478 00:26:35,960 --> 00:26:37,880 Speaker 1: I said that in this one, did I? You could 479 00:26:37,880 --> 00:26:41,480 Speaker 1: say it now. Chuckers. And the final thing, Josh, they drink to, 480 00:26:41,480 --> 00:26:44,119 Speaker 1: I think we're gonna like this, is... they're getting your back, dude: 481 00:26:44,440 --> 00:26:46,720 Speaker 1: the whole "I" and "me" thing that we've been hearing about 482 00:26:46,800 --> 00:26:49,800 Speaker 1: for the past hundred and sixty shows.
Whenever you 483 00:26:49,880 --> 00:26:53,600 Speaker 1: actually correct yourself now with the "I" and "me" is when they drink. 484 00:26:54,160 --> 00:26:57,119 Speaker 1: So he writes, and they drink soju, by 485 00:26:57,119 --> 00:26:59,040 Speaker 1: the way, which is a rice wine. He says, on 486 00:26:59,160 --> 00:27:01,480 Speaker 1: soju, Josh: sipped casually, the rice wine is 487 00:27:01,520 --> 00:27:04,120 Speaker 1: not without its merits, but done our way, it all 488 00:27:04,119 --> 00:27:06,720 Speaker 1: but guarantees a regrettable late-night phone call to a 489 00:27:06,720 --> 00:27:12,320 Speaker 1: coworker or former lover, fearful platitudes, mutual admiration, or some 490 00:27:12,480 --> 00:27:16,200 Speaker 1: form of public nudity later in the evening. So, Josh, sir, 491 00:27:16,359 --> 00:27:20,520 Speaker 1: I beseech you, stop hyper-correcting. Let it fly. Pleadingly, 492 00:27:20,720 --> 00:27:25,080 Speaker 1: Richard, tired of being hungover in Korea. Richard, Richard. Well, 493 00:27:25,080 --> 00:27:29,160 Speaker 1: thanks, Richard, for sending that in. Chuckers liked it, Jerry liked it, 494 00:27:29,359 --> 00:27:33,119 Speaker 1: Richard clearly liked it, Chuck liked it. And I... and me... 495 00:27:33,480 --> 00:27:37,080 Speaker 1: and... I'm sorry, I shouldn't say that, I should correct myself. Well, 496 00:27:37,160 --> 00:27:40,080 Speaker 1: if you want to send Chuck and I an email... 497 00:27:40,119 --> 00:27:43,560 Speaker 1: I'm sorry, Chuck and me an email, or Chuckers... Chuckers an 498 00:27:43,600 --> 00:27:48,159 Speaker 1: email... and me... uh, you can send that to stuff 499 00:27:48,320 --> 00:27:55,760 Speaker 1: podcast at how stuff works dot com. For more on 500 00:27:55,800 --> 00:27:58,280 Speaker 1: this and thousands of other topics, visit how stuff 501 00:27:58,280 --> 00:28:01,880 Speaker 1: works dot com.
Want more how stuff works? Check out 502 00:28:01,880 --> 00:28:04,439 Speaker 1: our blogs on the how stuff works dot com home page. 503 00:28:05,280 --> 00:28:09,240 Speaker 1: Mm-hm. Brought to you by the reinvented two thousand 504 00:28:09,320 --> 00:28:11,440 Speaker 1: twelve Camry. It's ready. Are you?