Speaker 1: Hey, everybody out there in the Pacific Northwest, or with access to an airport or a car rental place that can get you to the Pacific Northwest, specifically at the end of January. We'll see you in Seattle, Portland, and San Francisco.
Speaker 2: That's right. Our new live shows for twenty twenty four are Seattle, Washington, January twenty fourth at the Paramount Theater, then Portland, at our home away from home at Revolution Hall, on the twenty fifth, and then winding it all up at Sketchfest on the twenty sixth at the Sydney Goldstein Theater.
Speaker 1: Very nice. If you want tickets, if you want information, you can go to a couple of places. You can go to our Linktree at Linktree slash SYSK, and you can go to our home on the web, Stuff You Should Know dot com. Click on the tour button and it'll take you to all of the beautiful places you can go to buy your tickets, and we'll see you guys in January.
Speaker 2: Welcome to Stuff You Should Know, a production of iHeartRadio.
Speaker 1: Hey, and welcome to the podcast. I'm Josh, and there's Chuck, and Jerry's here too, and this is Stuff You Should Know, the last edition of the two... to the... to the Deuce Trey.
Speaker 2: Oh, is this our final of the year?
Speaker 1: It is. It's the last one of twenty twenty three, Chuck. We recorded all of the episodes that we're ever going to record in the year twenty twenty three. Isn't that amazing?
Speaker 2: Yeah. And hey, you know, since you brought that up, can I say something?
Speaker 1: Sure.
Speaker 2: Spotify, who carries our show, as do all platforms, they have this really cool thing they send out called the Wrapped. Wrapped, I guess, as in a year end wrap up kind of thing. And they sent us, as a show, our own statistical analysis, but then they send individual users their own, and we just had a lot of great listeners sending us in their Wrapped statistics, like, hey, I'm in the top one percent of Stuff You Should Know listenership. And it was just really neat to see all that stuff coming in, so thank you.
Speaker 1: It really is. It's amazing, and everybody's so proud of it. It's so great to see. So no matter what percentage you're in, if you are proud enough to send an email or post it, kudos to you, because we're proud of you right back. I do think, though, Chuck, that we probably should shout out the person who wrote in with far and away the largest number of listening minutes according to Spotify.
Speaker 2: Yeah, who's that?
Speaker 1: That is Aravin Cancerla, who is in the top point zero five percent of listeners, and based on the eighty thousand plus, eighty six thousand seven hundred and seventy two minutes, I don't see how there could be anybody else in that, you know, in the remaining, what, point zero five percent left?
Speaker 2: Yeah. I did a little back of the envelope math, and that's somewhere between twenty five and thirty hours of Stuff You Should Know a week.
Speaker 1: Yeah, that's just... it's a lot.
Speaker 2: Doesn't seem possible. So I have suspicions that this person might have just played it on a loop so they could, you know, and then just went out shopping or whatever.
Speaker 1: I don't know. I think Aravin strikes me as a pretty straightforward person. So congrats to Aravin, and also, seriously, thank you to everybody who listens to us so much that you get statistics at the end of the year that make you proud. I mean, that's amazing, guys. Thank you very much.
Speaker 2: Yeah, thanks to Spotify.
That's a cool service they do. Or, I don't know, is that a service? Whatever, it's a cool thing they do.
Speaker 1: It's a service. It's a public service.
Speaker 2: We were downloaded in one hundred and sixty three countries.
Speaker 1: Oh, I didn't know there were that many countries.
Speaker 2: Which is, we looked it up, it was actually something like one hundred and ninety. So, like, that's most of the countries.
Speaker 1: Yeah, I would say that's the vast majority of them. And by the way, everybody, I knew that there were more countries than that.
Speaker 2: I was joking. And quickly, I saw that, and I don't know if you went through that yet, Josh, but our third biggest country of growth was Mexico. Oh, no way. And I'm aiming, I'm aiming for a show in Mexico City. I'd like to do that. We just don't know if, like, people would come. So, you know, if we can get like a thousand people in a room in Mexico City, I think that might be a fun thing to do.
Speaker 1: Yeah, especially if it's a room with seats.
Speaker 2: Yeah, so we should get at least five hundred emails saying at least two people will come, and then that means we might go. All right, so anyway, should we get on with barbaric practices?
Speaker 1: Yeah, let's, because I find this endlessly fascinating. Olivia helped us with this, and basically what we're doing here is reversing what we already kind of like to do smugly, which is look back fifty, one hundred, two hundred years and be like, look at how backwards and antiquated those people were back then. Like, even as recent as the nineties. I remember in the mid nineties, I smoked on an airplane on the way to Amsterdam. Yeah, and like, it was just like the last three rows were smoking. But it's not like it was sectioned off. There wasn't even a curtain. It's just like, this is the smoking section, even though the entire plane's being covered in your cigarette smoke. This was the nineties, man.
Speaker 2: Yeah, the first time I flew to Europe there was smoking, and like, that would have been ninety six.
Speaker 1: Yeah, I mean, imagine that today. I mean, you would literally go to federal prison if you tried to light a cigarette on an airplane today.
Speaker 3: You know? Yeah, I mean it's about the fire, but sure. Yeah, a few decades before that there was jello salad, when it was all the rage, and like the weirdest jello salads.
Speaker 1: If you've never just kind of taken a stroll down memory lane and looked up, like, pictures of jello molds from the fifties to the seventies, yeah, treat yourself and go do that. But make sure you have not had lunch yet, because you're gonna want to gag when you see a lot of them. That's another fun thing, to judge people for being stupid with jello. Because I don't know if we said it yet, we're going to do the opposite. We're going to look forward and try to figure out what our descendants are going to ridicule us or look down at us about, right? What will we seem primitive or barbaric or ridiculous about?
Speaker 2: That's right. But what we have before us are seven, I think, a little more serious things than jello molds, and spanking kids is on up there. However, it really depends on who you ask, because about half of Americans still think, and this is a quote, and this is from a survey a couple of years ago, the American Family Survey, quote, it is sometimes necessary to discipline a child with a good, hard spanking.
Speaker 1: And half of those respondents said, almost under their breath, feels so right.
Speaker 2: Yeah, I mean, big change from, you know, back in the day. They have another stat from sixty eight, when ninety four percent of parents said, yeah, hit your kids, it's awesome.
Speaker 1: Whoa, that's a big drop.
Speaker 2: But things are really changing, because a third of the respondents between eighteen and twenty nine agree with spanking, compared to fifty percent of the overall survey.
So it's something that's going out of fashion, for sure.
Speaker 1: Yeah, it seems to be following a larger trend of moving away from social acceptance of violence in any form, and it's being supported by studies that find, like, yeah, it's actually good if you don't spank your kids, because not only has there never been a study that shows it improves children's behavior, study after study keeps suggesting it does the opposite. It actually maladjusts children. I mean, I can't imagine what a well adjusted person I would be if I hadn't been spanked that handful of times when I was a kid.
Speaker 2: Yeah, you know, we're always trying to poke around to find, you know, the other opinion on something, just to take a look at it, and, you know, there are people who don't agree. I saw this one professor from Oklahoma State, Robert Larzelere, I can't even read my handwriting now, but he said that the studies that are out there are flawed for a couple of reasons. One, he says, these studies say that if you spank kids more, that leads to them actually acting out more. He's saying, no, it's a chicken and the egg thing: it's the kids that are acting out more that are the ones getting spanked more.
Speaker 1: I've seen that, and I think I found in a Scientific American article there was at least one group that managed to control for that and basically has shown, like, no, it actually does have this effect on kids. The problem that, what'd you say, Larzelere?
Speaker 2: I think so, if my writing is any indicator.
Speaker 1: What Larzelere's saying is that these studies don't start following kids from, like, birth to twenty five or thirty and then see, you know, were you spanked, were you not spanked. They might just peek in on a kid who's at the spanking age and look at their behavior. Then you just can't parse it apart.
So there's not really good quality studies. But I saw it put like this: even if there are no studies that conclusively show spanking is bad for kids or produces maladjusted behavior in kids, there are plenty of studies that seem to suggest that, and there aren't any studies that seem to suggest otherwise, that it's actually good, that it's actually effective to spank your kids. And so the argument that I've seen is like, why do it, then?
Speaker 2: Yeah, I saw a... First of all, I'm a parent. I can't in a million years imagine hitting Ruby for any reason. That's nice. It makes me want to cry just thinking about that. It's terrible for our family. But I did find a study from twenty eighteen that I found from NPR. They didn't do the study, but they, you know, did a thing on it.
Speaker 1: I'm sure they were hot and heavy.
Speaker 2: Oh, of course. It was, what they claim and it seems like, probably one of the most robust studies, at least, that looks at countries that have banned spanking, because I think something like sixty two countries have banned spanking, starting with Sweden in nineteen seventy nine.
Speaker 1: Did you even know there were that many countries?
Speaker 2: But they followed, or they used, four hundred thousand kids from eighty eight countries, so that's pretty good. Fifty eight countries have the bans and thirty don't, I'm not sure which one it is. But what they were tracking was incidences of kids fighting, like, you know, getting in fights at school, right, and in the countries that have banned spanking, there was a reduction in school fighting of sixty nine percent in boys and girls, which is, I mean, that is pretty substantial. I was curious about the United States, because, you know, we both grew up with it. Like, my dad was my elementary school principal, and he spanked me and other kids.
It's ridiculous to think about, but apparently in seventy seven the Supreme Court of the United States gave the power to the states. These days, ninety percent of schools don't use corporal punishment, but it is still legal in seventeen states. There are restrictions in place in a lot of those, like maybe your parents have to sign a thing that says, sure, hit my kid. But in seventeen states you can still do this, with Mississippi leading the way in the most spankings. And the other thing I found out that we should point out is that black males are twice as likely to be spanked as anyone else. And get this, sixteen point five percent of kids that are corporally punished in schools in America today are disabled. My god, usually it's an intellectual disability. Wow. Is that not disturbing?
Speaker 1: Yeah, of course it's disturbing. That's horrible. That's one of the most horrible statistics you've ever spouted out. And I should say, I also just want to verify for the listeners, in any of those sixty two countries where spanking is banned, you're talking about, like, public spanking, like in school. In seventeen states, in schools, you're allowed to do that.
Speaker 2: Right, yes, a teacher or a principal, and they say it's, you know... And this is one of the other problems that that professor had, is that those studies, he says, lump everyone in together, as in, like, the parents who do it as the very last resort after several other attempts at discipline, or parents who are just like, oh, you screwed up, you know, let's hit you or whatever. And apparently in most, you know, almost all the schools, it is a last resort, as in they've tried other things. But, you know, it's just, I don't know, I've got to try not to judge people, but don't hit your kids.
Speaker 1: Well, I was gonna say, it's legal in seventeen states for schools to spank kids. It's legal in all fifty states for a parent to spank kids.
There's not really anything coming down the horizon that makes it seem like that's ever going to be banned. But it does seem, generationally, like we're moving away from spanking pretty rapidly.
Speaker 2: Yeah, my spankings as a kid were very infrequent and very organized, as in it was never done in the heat of anger, like just getting slapped or something. It was like, all right, go to the bathroom and spend ten minutes, you know, upset and scared, and then I got spanked with a bolo paddle. You know, the little bolo paddle game?
Speaker 1: I know, the bolo tie.
Speaker 2: No, the bolo paddle, where you... a little light plywood paddle with a ball on it. Yeah, sure, that was the spanking device.
Speaker 1: In my mind, those things are made of, like, balsa wood.
Speaker 2: Yeah, it wasn't too bad. It stung.
Speaker 1: But you know, how about all of your spankings as a grown up?
Speaker 2: That usually involves leather.
Speaker 1: Okay. You want to move on to the next one?
Speaker 2: Yeah, let's move on to chemotherapy.
Speaker 1: Okay. So chemotherapy is one of these things where, if you start kind of putting it down today, what you're talking about is our current modern medical miracle that, since the nineties, has reduced the cancer death rate by twenty five percent. Yeah, it's a really big deal that we have chemotherapy now. It's saved a lot of lives.
Speaker 2: Yeah, this is not poo-pooing chemotherapy.
Speaker 1: No, it's not. The reason why we can probably guess that our descendants down the line are going to look at chemotherapy as fairly primitive and barbaric is because it's so indiscriminate in how it harms the body. It harms the whole body in order to kill the cancer cells.
Right, and we're moving, it seems, toward far, far more specific and tailored medicine, and so all of the side effects and the horribleness that come with chemotherapy, even though it does save lives, will be going away in future decades.
Speaker 2: It looks like, yeah. And it seems like, and we're going to talk about a few ways that things are becoming more specific, but that seems to be the way it's all trying to go: instead of just, like, killing all the cells, let's see if they can just specifically target cancer cells and then eventually, you know, get down to, you know, human specific targeting of things, which would be amazing, obviously. Yeah, you know, patient specific, right. But one of the first ones is antibody drug conjugates, and this is a type of chemotherapy, but it combines chemotherapy, like the drugs used in chemotherapy, with monoclonal antibodies, which are, you know, just like antibodies that we have in our body, except they're created in a lab.
Speaker 1: Right. And so what happens is we inject these drugs, these antibody drug conjugates, into a patient with cancer, and those antibodies are designed to go seek out that tumor, the specific kind of tumor that that patient has, yeah, and attach to that tumor, that cancer cell, and it delivers that payload of chemo drugs to it, says, here you go, here's a nice little present, and then turns around and runs, and then in the background the cell explodes, and the antibody, like, ends up on its chest but lives to fight another day.
Speaker 2: Yeah, exactly. If we're headed in that direction, that's fantastic. Yeah. Vaccines is another one, the mRNA vaccines that we detailed back when those came out for COVID. Two of the most successful versions of those vaccines were originally brought about, to begin with, as tumor vaccines, and the idea is to use sort of that same technology to just specifically target tumors themselves.
So it's not like a vaccine to prevent a disease. It's a shot that will essentially, specifically shrink a tumor.
Speaker 1: Yeah, just like the mRNA vaccines for COVID train the body to look for and respond to COVID viruses, saying, like, hey, if you see anything with this little horn on it, the spike, go after it, they're doing the same thing with tumors, right? So that's boosting the immune response. It's also training the immune response, so technically it does qualify as a vaccine. And because, like we talked about in the COVID vaccine episode, this mRNA technology is just... you can just... it's just like ready to wear vaccines, basically.
Speaker 2: Yeah.
Speaker 1: Apparently, yeah, apparently they have reached a turning point, and in the next five years, a lot of cancer researchers are saying, we're going to see a lot more cancer vaccines coming down the pike.
Speaker 2: Amazing. And then, what I talked about earlier, like, targeting cancer cells specifically is a great direction to go, but really getting into personalized cancer care will be the next step beyond that. Like, hey, I'm going to identify exactly what kind of tumor you have in your body, and not just, maybe, this kind of tumor, and, you know, get to patient specific levels of treatment. And you know, I know we've poo-pooed AI in certain respects, but this is a place where AI can really probably do a lot of good.
Speaker 1: Yeah, I think we should just clarify our position on it, if I can speak for both of us. Sure. As long as AI is not taking over the world or damaging humanity in some terrible way, I'm all for all the great ways it can help things, and this is a really sterling example of that.
Speaker 2: Yeah, it gave us a new Beatles song.
Speaker 1: Yeah, I would say that's right in the middle for me.
Speaker 1: But I think what you were talking about was taking a sample of the specific tumor that a specific person has, analyzing its genetic makeup, and then looking at that genetic makeup, thanks to AI spitting out all of the information that we need from analyzing that huge genome, saying, oh, this is an Achilles heel, this is another weakness, this is another way we can attack it, and then tailoring the treatment for that specific tumor. Like, that tumor. Like you said, not that kind of tumor, that tumor is getting attacked. It's so specific you could name the tumor. Name the tumor Melvin. Melvin is toast when you're using precision or personalized cancer treatments.
Speaker 2: Yeah, you know, this kind of stuff could even be possible now, it's just really, really expensive to target a specific tumor for a specific human. And so the idea is, hopefully with the help of AI, they can just reduce a lot of the, you know, the cost, basically, for doing that. So instead of something that's not even, you know, not even worth pursuing or able to be pursued because of finances, it becomes something that's like, oh yeah, just step right up and we'll spit out your treatment.
Speaker 1: Yeah. And as the cost comes down, more people use it, which means more people using it allows for a greater chance of new breakthroughs. So yeah, hopefully we're going to have cancer licked in the future. I saw somebody suggest that it'll end up being kind of like a chronic disease akin to diabetes in the future.
Speaker 2: Just something you can live fairly healthily with.
Speaker 1: Yeah, you can manage it, and there'll be plenty of drugs to keep you going. Amazing. You want to take a break? Yeah, okay, well then let's do that.
Speaker 2: All right. Next up on the list of things that people may one day look back on and say, why did you do it that way, dummies of the twenty first century, is organ transplants. We have a pretty great episode on organ transplants somewhere in our back catalog.
It feels like a long time ago. It was a little while. But what we're basically talking about, and again, organ transplants, awesome. It's amazing how far they've come in the past, you know, since they've been doing them. But rejection rates are still an issue, up to ten to fifteen percent for kidneys, for instance. And then also the fact that, you know, transporting organs, getting them to the people in time, can still be an issue. Seventeen Americans die every day waiting on organs. And it's also inequitable, in that, you know, generally the people that are the most funded get the most organs for transplant. But there's a better way forward, right?
Speaker 1: Yeah. I just want to say, I found a stat that I found rather shocking. One in five donated kidneys goes unused. It goes to waste, even though people die waiting for kidneys. That's just how kludgy the whole setup is right now. Yeah. So, yeah, they're trying to fix the process and the system and the organization in charge of that in the United States. But, like, further down the pike, on a longer timeline, the goal is organogenesis, which is what it sounds like. It's creating entirely new organs from cells, from scratch. It's like, watch this grow. You remember those little dinosaur sponges that you added water to and they just grew, grew, grew? It's kind of like that, but with fully functioning organs.
Speaker 2: Yeah. How far away are we from someone trying to grow a human out in a lab?
Speaker 1: I'm sure somebody's probably trying it already, but I don't know how long it'll be till they're successful.
Speaker 2: I mean, we had Dolly the sheep. Dolly was a clone, right?
Speaker 1: Yes. And I don't know if everybody's read our book, and if you haven't, I'll just go ahead and share with you a little fact from it. Yeah, a passage, a dramatic reading. Apparently Dolly was named Dolly because she was grown from a mammary cell. So it was a nineties ha-ha joke about Dolly Parton.
Speaker 2: Yeah. I think we might have said that in the Dolly Parton episode.
Speaker 1: Did we? If we did, I'll have to go back and listen. If so, we'll edit this part out.
Speaker 2: No, no, no, it bears repeating, I think.
Speaker 1: Okay. So, yes, we're a little ways off, because not only, Chuck, are we not capable of growing a human from cells, we're not capable of growing kidneys or hearts. But we are somewhere. We've grown and successfully transplanted windpipes, bladders, fairly, like, simple organs and structures. But I mean, simple is, like, a relative term, because we're talking about something that was grown from that person's own cells into the very piece of equipment that they needed, and then put in them, and it worked.
Speaker 2: Yeah, which is remarkable. I know that they can do this with, at least right now, the epidermis. So if you're a burn victim, you can get your own stem cells and you can get some new epidermis. I was about to say just skin, but they're working on growing, like, the entire thickness of the skin. They're not there yet, but they can now grow epidermis from your own stem cells in a lab, and they transfer it to something called fibrin, which is a protein that really kind of helps your blood clot when you get a cut or something, and then they put it on your body and it just... it goes, and then you're done.
Speaker 1: Yeah, that's... it makes that sound.
Speaker 2: It takes, you know, obviously it's a process. I was just kidding around. But right now they can't, like, grow skin that grows hair or sebaceous glands and stuff like that. But if you're a burn victim and you can get, you know, your own epidermis to replace, you know, your scarred skin on your body, then that's pretty amazing.
Speaker 1: It's kind of akin to laying sod, but with skin, you know. Sure. So right now, I think the state of the art with organo... genesis, that extra O trips me up, and I like to add syllables.
So that's a real tricky one, is growing organs in other animals. And as we'll see, hopefully we're going to be moving away from that, because to take that organ from the animal and transplant it into a human, you kill that animal in the process, right? Like, you can't take a pig's heart and be like, good luck with the rest of your life, because it doesn't have a rest of its life. It's missing its heart. And from a lot of the trends that I've seen, it seems like a fairly safe bet that we're moving in a direction where animal welfare is going to become more and more and more important, to where how we treat animals will be maybe the most critical thing that people of the future will look back at us on, you know.
Speaker 2: Yeah, which, that's coming up in a more robust way in a second. But to finish this up, there's also three D bioprinting, which is pretty amazing. I remember telling this story one time. I've known two people in my life who were born without an external ear, and the process back then was they formed a sort of a skeleton of the shape of an ear, and I'm not sure what it was made of, I think cartilage. But if I'm not mistaken, there was like a skin bubble around it, and then they would suck the air out of that skin bubble very quickly onto that cartilage to form, you know, what looked close enough to an external ear. And I say external, like, you know, the ear parts, or...
Speaker 1: Yeah, I know what you're talking about.
Speaker 2: And I've known two people in my life that had that done, and, you know, back in the day it was not like it is now. I think the three D bioprinting of ears is much, much further along, and they look much better than they used to. And that's kind of the point, right? But they're thinking that, you know, maybe one day we can three D bioprint a liver.
Speaker 1: Yeah, pretty amazing. And that'll kind of come up in the next section, too. So I say we move on to the next section, because it does kind of tie into what we were talking about just now.
Speaker 2: That's right, let's do it.
Speaker 1: So getting meat from animals is probably something that will really be looked down upon in the future, because we already have techniques that make it so we don't really need live animals to create meat, to eat meat, and yet we're still eating meat. And that's despite, and I'm very much guilty of this too, that's despite knowing how horrific and terrible factory farming is for the animals themselves, for the environment. People just really love meat and it's tough to give up. So rather than forcing people to give it up, there are other alternatives that people are working on to replace it. We're going to need to do that too, because apparently, Chuck, the growing demand for meat is going to be totally unsustainable in the next couple decades.
Speaker 2: Yeah, I mean, there are statistics, like the UN will throw out, that say by the year twenty fifty, the meat demand means we're going to have to produce fifty to one hundred percent more meat than we do now. But there are also other people saying, like, hey, this whole notion of, you know, wealthier countries eat meat because they can afford it, and countries that are more developing eat largely agricultural or vegetarian diets because they're forced to, isn't really the case anymore, or at least moving forward, it looks like, because what they found is the emerging trend is that people are eating less meat once they get enough wealth to afford it, for a bunch of reasons. One of which is what you're talking about: there's just a forever changing way that humans look at animals and animal welfare, for one. And also, red meat and the fact that it's terrible for your body is another one.
Speaker 1: And terrible for the environment. Livestock raising, and that includes transportation, tractor emissions, but also methane from the cows passing gas all the time, accounts for fourteen and a half percent of global greenhouse gas emissions. So it has, like, this triple impact, triple negative impact, on the animal's welfare, the human body, and the health of the earth. And for those reasons, it does seem like people in wealthier countries are starting to move away from meat. And so I think the Organization for Economic Cooperation and Development, they released an Agricultural Outlook within the last couple of years, and they predicted that around twenty seventy five, the whole world will start moving away from meat, and that eventually we're just going to stop eating what's called, very appropriately, carcass meat altogether.
Speaker 2: Yeah, can I tell you something really quick about an Instagram I saw today? Yeah, it has to do with the cows and the methane. Actually, our colleague, our old friend Tamika at work posted this, yeah, and it was a video of a guy that showed how they treat bloat in cows. Have you seen this? No? When a cow gets bloated with gas, they stick a needle into the cow's stomach, releasing methane, and they light that thing so they can see a flame to judge, like, how much gas is still in there. Oh wow. So there was a video of a cow with a, with a blowtorch coming out of its side.
Speaker 1: Essentially. Wow, what was the cow's expression like?
Speaker 2: Well, all I saw was the cow, but the guy who was hosting the video, his expression was horrified, because he was like, you know, can you believe that this is where we are in the world?
Speaker 1: Yeah, I totally can believe that. That doesn't surprise me at all, to tell you the truth.
Speaker 2: And they're trying to help the cow, but it's, you know, the reason the cow's there is because of factory farming.
Speaker 1: Right.
Tamika, by the way, has one of the better non-celebrity Instagram feeds you can find.
Speaker 2: Yeah, Tamika's is great. It's good stuff. But what you were saying about moving away from, what'd you call it, carcass meat? Yeah, there are two main ways that that's happening right now, and that is, obviously, what they call novel vegan meat replacements, you know, fake meat, Impossible stuff, Beyond stuff, and then lab grown meat. Which, did we do a whole episode on lab grown meat?
Speaker 1: Yeah, we did a while back.
Speaker 2: I thought so. We should...
Speaker 1: Update it, like we did the recycling episode, you know, like, so much stuff has changed since. Totally, I'm sure. Well, we'll update it eventually. But lab grown meat, or cultured meat, is exactly what it sounds like. You use a bioreactor, sometimes a three D, like, bioprinter, using animal cells to recreate meat. And I think there's a consulting group called AT Kearney, they predict that by twenty forty, which is not that long off, everybody, sixty percent of global meat consumption will be from cultured or novel vegan meat replacements. Yeah, like, that's significant. That's a huge change. Like, there may be countries that are developing now that won't even eat carcass meat when they become wealthy, because the replacements will have become so great there'll be no reason to eat meat.
Speaker 2: Yeah. You know, if you ask the CEOs of Beyond and Impossible, they're going to say in fifteen years there will be no more eating of meat. Yeah, that's a little ambitious, and I think that, you know, maybe they're trying to drive up the stock price, so that's probably not going to be the case. But that AT Kearney group prediction, like, that seems quite possible.
Speaker 1: I buy that, especially if there are a couple of challenges that are overcome by then, which, you know, that's seventeen, sorry, sixteen years away now, and that's plenty of time to overcome some relative speed bumps.
One is replicating... no, 602 00:34:06,440 --> 00:34:09,239 Speaker 1: I think they have flavor kind of locked, at least 603 00:34:09,239 --> 00:34:10,759 Speaker 1: as far as cultured meat goes. 604 00:34:10,920 --> 00:34:13,560 Speaker 2: Yeah, texture. Texture's 605 00:34:13,040 --> 00:34:15,719 Speaker 1: the problem, because you don't want to eat like a 606 00:34:15,760 --> 00:34:22,440 Speaker 1: little, a little scoby of beef that tastes just like 607 00:34:22,520 --> 00:34:25,600 Speaker 1: beef but looks like a scoby from a kombucha batch. 608 00:34:26,160 --> 00:34:29,800 Speaker 1: No one wants to eat that. And yet Japanese researchers 609 00:34:29,840 --> 00:34:34,160 Speaker 1: recently showed, I think according to Freethink, this great website 610 00:34:34,160 --> 00:34:37,840 Speaker 1: I found, in twenty twenty one they recreated a Wagyu 611 00:34:37,960 --> 00:34:41,480 Speaker 1: steak, which has got some of the most complex marbling 612 00:34:41,520 --> 00:34:44,640 Speaker 1: of fat mixed in with the meat that you could 613 00:34:44,680 --> 00:34:49,319 Speaker 1: possibly ever come across, and they faithfully recreated one. I'm 614 00:34:49,360 --> 00:34:52,120 Speaker 1: sure it cost them a million and a half dollars 615 00:34:52,120 --> 00:34:54,400 Speaker 1: to make that one steak, but there was a proof 616 00:34:54,440 --> 00:34:57,560 Speaker 1: of concept that it can be done. The other big 617 00:34:57,640 --> 00:35:01,520 Speaker 1: challenge is right now, when you're making that Wagyu steak 618 00:35:01,800 --> 00:35:05,680 Speaker 1: from cellular culture, you actually need to take it from 619 00:35:05,680 --> 00:35:08,919 Speaker 1: an unborn calf as you slaughter the mom. I don't 620 00:35:08,920 --> 00:35:11,640 Speaker 1: think the mom has to be slaughtered. I think they 621 00:35:11,719 --> 00:35:14,680 Speaker 1: just take it while they're slaughtering the mom. Yeah, and 622 00:35:14,719 --> 00:35:17,120 Speaker 1: that's what they use to grow meat right now, and 623 00:35:17,160 --> 00:35:19,600 Speaker 1: a lot of people are like, no, still, I'm not 624 00:35:19,680 --> 00:35:23,960 Speaker 1: okay with that. It's still an animal suffers somehow, some way. 625 00:35:25,040 --> 00:35:28,440 Speaker 1: And so there's a company called Meatable, a Dutch company, 626 00:35:28,440 --> 00:35:30,520 Speaker 1: that said, we got this, we got our way around this. 627 00:35:31,080 --> 00:35:34,520 Speaker 2: Yeah, they made a sausage in July twenty twenty two 628 00:35:35,239 --> 00:35:40,239 Speaker 2: that was lab grown sausage, lab grown pork. But it 629 00:35:40,320 --> 00:35:43,080 Speaker 2: did not use, I don't think we said 630 00:35:43,120 --> 00:35:46,600 Speaker 2: what it's called, fetal bovine serum. That's blood drawn 631 00:35:46,680 --> 00:35:49,319 Speaker 2: from the cow's fetus, and that's, like you said, what's 632 00:35:49,360 --> 00:35:52,359 Speaker 2: typically used, but they didn't use that at all. It 633 00:35:52,480 --> 00:35:56,640 Speaker 2: was, you know, there was no animal involved. 634 00:35:56,840 --> 00:35:59,359 Speaker 1: Yeah, they used cells from like a live animal that 635 00:35:59,440 --> 00:36:00,440 Speaker 1: was unharmed by it. 636 00:36:00,480 --> 00:36:02,320 Speaker 2: Well yeah, yeah, that's what I meant. No animal 637 00:36:02,520 --> 00:36:04,160 Speaker 2: involved as in their death. 638 00:36:04,239 --> 00:36:07,480 Speaker 1: Was not involved, right, right, exactly.
Yeah, the animal couldn't 639 00:36:07,480 --> 00:36:10,839 Speaker 1: have cared less either way. From what I understand, 640 00:36:11,640 --> 00:36:14,239 Speaker 1: they were in the process of being degassed, 641 00:36:14,280 --> 00:36:16,840 Speaker 1: so they had bigger fish to fry than somebody scraping 642 00:36:16,880 --> 00:36:18,360 Speaker 1: a few cells off their hindquarters. 643 00:36:18,360 --> 00:36:19,879 Speaker 2: You know, it's like, I got a blowtorch coming out 644 00:36:19,880 --> 00:36:21,200 Speaker 2: of my, exactly. 645 00:36:22,080 --> 00:36:24,879 Speaker 1: Uh, okay, I say we move on. Oh but first, Chuck, 646 00:36:24,960 --> 00:36:29,080 Speaker 1: let's take a break, because it's, it's that kind of time. 647 00:36:29,719 --> 00:36:30,239 Speaker 2: Let's do it. 648 00:36:51,239 --> 00:36:56,080 Speaker 1: Okay, we're back, Charles. We're talking about what people in 649 00:36:56,120 --> 00:36:57,920 Speaker 1: the future are going to think of us based on 650 00:36:57,960 --> 00:37:00,880 Speaker 1: the stuff we do today that may seem primitive, and 651 00:37:00,960 --> 00:37:03,720 Speaker 1: one of them might not seem, well, it could seem primitive, 652 00:37:03,719 --> 00:37:07,920 Speaker 1: it'll seem quaint probably, is driving a car yourself, Yeah, 653 00:37:08,040 --> 00:37:11,600 Speaker 1: or maybe even owning your own car, because the predictions 654 00:37:11,600 --> 00:37:16,200 Speaker 1: for the future are that car hailing apps will become 655 00:37:16,239 --> 00:37:18,799 Speaker 1: so ubiquitous that you're gonna need your own car less 656 00:37:18,800 --> 00:37:21,879 Speaker 1: and less and less. This is a prediction from Kara 657 00:37:21,920 --> 00:37:25,759 Speaker 1: Swisser, Swisher, the New York Times tech columnist, sorry Kara, 658 00:37:27,520 --> 00:37:30,920 Speaker 1: that in not too many years, owning your own car 659 00:37:31,000 --> 00:37:34,280 Speaker 1: is going to become obsolete. And then eventually the next step, 660 00:37:34,440 --> 00:37:37,719 Speaker 1: this is me adding to that prediction, those cars that 661 00:37:37,840 --> 00:37:39,920 Speaker 1: pick you up when you use a ride hailing app 662 00:37:40,120 --> 00:37:42,600 Speaker 1: will not have a driver in them. You will just 663 00:37:42,640 --> 00:37:43,640 Speaker 1: get in the back and go. 664 00:37:44,320 --> 00:37:47,000 Speaker 2: Yeah. I mean, self driving cars have been in the 665 00:37:47,040 --> 00:37:50,640 Speaker 2: news a lot over the past, you know, decade or so. 666 00:37:51,360 --> 00:37:54,520 Speaker 2: I remember being in San Francisco a couple of years 667 00:37:54,560 --> 00:37:58,759 Speaker 2: ago and seeing a car with a crazy contraption on top, 668 00:37:59,480 --> 00:38:00,880 Speaker 2: and I was like, what in the world is that? 669 00:38:01,120 --> 00:38:03,959 Speaker 2: And is that a Google Maps or a Google Earth 670 00:38:04,120 --> 00:38:05,760 Speaker 2: like thing taking pictures? 671 00:38:05,400 --> 00:38:06,960 Speaker 1: That car's wearing braces. 672 00:38:07,400 --> 00:38:10,120 Speaker 2: Then I looked inside. It's like no, no, no, there's 673 00:38:10,160 --> 00:38:13,000 Speaker 2: no human in that car. And it kind of startled me. 674 00:38:13,160 --> 00:38:15,440 Speaker 1: It's just showing witchcraft, right. 675 00:38:16,960 --> 00:38:19,719 Speaker 2: I did. I threw a Molotov cocktail at it, took 676 00:38:19,719 --> 00:38:23,040 Speaker 2: care of that problem.
But there was a company, 677 00:38:23,600 --> 00:38:24,880 Speaker 2: I think there were, you know, there's more than one 678 00:38:24,880 --> 00:38:27,320 Speaker 2: company that's trying this stuff out, but there's a company 679 00:38:27,360 --> 00:38:32,160 Speaker 2: called Cruise, which just recently, in October of this year, well, 680 00:38:32,200 --> 00:38:35,040 Speaker 2: I guess last year now, of twenty twenty three, the 681 00:38:35,080 --> 00:38:40,480 Speaker 2: California state government said you can't do this, you can't 682 00:38:40,480 --> 00:38:44,239 Speaker 2: practice this anymore, no more driverless practicing out of you, 683 00:38:45,320 --> 00:38:48,200 Speaker 2: because, well, for a lot of reasons building up to 684 00:38:48,760 --> 00:38:52,560 Speaker 2: what was called the incident. But minor incidents involved things 685 00:38:52,600 --> 00:38:56,560 Speaker 2: like blocking ambulances, stopping in the middle of an intersection, 686 00:38:57,920 --> 00:39:01,960 Speaker 2: rear-ending a bus, running a red light, stuff like that. But 687 00:39:02,080 --> 00:39:05,760 Speaker 2: the big incident, which finally was bound 688 00:39:05,760 --> 00:39:10,160 Speaker 2: to happen, was when a pedestrian was hit in downtown San Francisco. She 689 00:39:10,320 --> 00:39:12,560 Speaker 2: was hit by another car driven by a real human, 690 00:39:13,080 --> 00:39:16,279 Speaker 2: knocked into the other lane, and then the Cruise car 691 00:39:16,480 --> 00:39:22,320 Speaker 2: apparently braked but then rolled over her anyway, pulled her forward, 692 00:39:22,440 --> 00:39:24,520 Speaker 2: and then stopped on top of her. 693 00:39:24,880 --> 00:39:26,719 Speaker 1: Just stopped. It was like, Okay, I'm fine, I don't 694 00:39:26,760 --> 00:39:28,719 Speaker 1: know what to do. I'm just gonna freeze right here 695 00:39:28,760 --> 00:39:32,000 Speaker 1: on top of this pedestrian. So yeah, Cruise is far 696 00:39:32,080 --> 00:39:36,919 Speaker 1: from the only company having problems with the 697 00:39:37,160 --> 00:39:40,080 Speaker 1: tech that they're working on. They're just the most recent 698 00:39:40,160 --> 00:39:43,280 Speaker 1: poster child of the problems with self driving cars. 699 00:39:43,680 --> 00:39:45,720 Speaker 2: Yeah, she didn't die, by the way. 700 00:39:45,560 --> 00:39:46,279 Speaker 1: No, thank you for saying that. 701 00:39:46,520 --> 00:39:46,960 Speaker 2: Yeah. 702 00:39:47,080 --> 00:39:50,640 Speaker 1: The point of this is, though, that despite these setbacks, 703 00:39:51,680 --> 00:39:54,320 Speaker 1: we exist in the time of setbacks. In a couple 704 00:39:54,360 --> 00:39:56,840 Speaker 1: of decades, we'll exist in the time where we're beyond 705 00:39:56,840 --> 00:39:59,759 Speaker 1: those setbacks and we have driverless cars. These setbacks don't 706 00:39:59,760 --> 00:40:02,560 Speaker 1: mean we're never going to have driverless cars. In fact, 707 00:40:02,560 --> 00:40:05,919 Speaker 1: even people who are super skeptical of them right now 708 00:40:06,080 --> 00:40:09,160 Speaker 1: still admit we're probably going to have them at some 709 00:40:09,360 --> 00:40:11,919 Speaker 1: point in the future. It's just a question of when. 710 00:40:12,320 --> 00:40:16,040 Speaker 1: And it seems like we're a little further behind than 711 00:40:16,080 --> 00:40:18,160 Speaker 1: we may have thought a few years ago.
712 00:40:18,960 --> 00:40:21,040 Speaker 2: Yeah, and you know, one thing is, 713 00:40:21,400 --> 00:40:24,400 Speaker 2: I think the road there may not be as abrupt, 714 00:40:24,520 --> 00:40:28,000 Speaker 2: because we already see in newer cars a lot of, 715 00:40:28,080 --> 00:40:33,080 Speaker 2: like, things like lane assistance. Like, your car will correct 716 00:40:33,080 --> 00:40:36,280 Speaker 2: itself and steer itself back if it sees that it's 717 00:40:36,360 --> 00:40:39,280 Speaker 2: driving off the road, like if you're drowsy or you're 718 00:40:39,320 --> 00:40:42,120 Speaker 2: on your phone, which you should never be. So you 719 00:40:42,160 --> 00:40:45,640 Speaker 2: see like lane assistance and stuff like that. You know, 720 00:40:45,680 --> 00:40:49,000 Speaker 2: if your speed like really, really changes a lot, a 721 00:40:49,000 --> 00:40:50,920 Speaker 2: lot of times cars these days will send you an 722 00:40:50,920 --> 00:40:53,879 Speaker 2: alert that says, like, you know, are you okay? Maybe 723 00:40:53,880 --> 00:40:56,799 Speaker 2: you should pull over, stuff like that. So that's sort 724 00:40:56,840 --> 00:41:00,239 Speaker 2: of like, these are the intermediary steps that will lead 725 00:41:00,280 --> 00:41:03,040 Speaker 2: to full automation, and they've already come a long way, 726 00:41:03,040 --> 00:41:06,000 Speaker 2: but apparently, again with the help of AI, they could 727 00:41:06,000 --> 00:41:06,720 Speaker 2: go a lot further. 728 00:41:07,120 --> 00:41:09,120 Speaker 1: Yeah, eventually the car is going to start talking to 729 00:41:09,200 --> 00:41:11,359 Speaker 1: itself and you'll feel so left out you just won't 730 00:41:11,360 --> 00:41:14,879 Speaker 1: even get in the driver's seat anymore. But the whole 731 00:41:14,920 --> 00:41:19,080 Speaker 1: point of removing humans from cars is to remove humans 732 00:41:19,160 --> 00:41:23,759 Speaker 1: from the equation of driving, not for our convenience necessarily, 733 00:41:23,760 --> 00:41:26,120 Speaker 1: but for our safety, because we're our own worst enemies 734 00:41:26,120 --> 00:41:29,960 Speaker 1: when it comes to driving. You found a stat recently, 735 00:41:30,080 --> 00:41:32,120 Speaker 1: was it like twenty twenty, twenty twenty one? 736 00:41:32,120 --> 00:41:35,640 Speaker 2: You know, it's twenty twenty one, but just over 737 00:41:35,680 --> 00:41:37,960 Speaker 2: the last few years, in general, it's been about thirty 738 00:41:38,000 --> 00:41:39,319 Speaker 2: to thirty three percent. 739 00:41:39,360 --> 00:41:43,160 Speaker 1: Of fatalities involved at least one of the drivers being drunk. 740 00:41:43,680 --> 00:41:46,840 Speaker 1: I couldn't find any statistics that also include drugs, but 741 00:41:47,239 --> 00:41:51,680 Speaker 1: just being drunk alone, thirty percent of people who die 742 00:41:51,719 --> 00:41:54,880 Speaker 1: in car crashes in the United States die because one of the people 743 00:41:55,280 --> 00:42:00,160 Speaker 1: involved in that crash was drunk. That is unacceptable, but 744 00:42:00,239 --> 00:42:03,160 Speaker 1: it's humans. People do that. It's a terrible decision. People 745 00:42:03,160 --> 00:42:04,880 Speaker 1: think that that's not going to happen to them, and 746 00:42:05,160 --> 00:42:07,920 Speaker 1: it does, and it accounts for thousands and thousands and 747 00:42:07,960 --> 00:42:12,400 Speaker 1: thousands of deaths every year. Yeah, driverless cars don't drink.
748 00:42:12,520 --> 00:42:15,600 Speaker 1: They have other problems right now, but as we work 749 00:42:15,640 --> 00:42:18,600 Speaker 1: them out, those problems will become a part of the past, 750 00:42:18,680 --> 00:42:22,080 Speaker 1: and drunk driving accidents will become a part of the 751 00:42:22,120 --> 00:42:24,200 Speaker 1: past as well, which will be great for everybody. 752 00:42:24,719 --> 00:42:28,200 Speaker 2: Yeah, I mean, ninety four percent of accidents in 753 00:42:28,239 --> 00:42:31,680 Speaker 2: the United States involve some kind of human error. So, 754 00:42:32,120 --> 00:42:35,279 Speaker 2: you know, what I'm curious about is what the acceptable 755 00:42:35,320 --> 00:42:39,480 Speaker 2: percentage of driverless car error is, because it 756 00:42:39,480 --> 00:42:44,359 Speaker 2: seems like human car error is just endlessly forgivable, to 757 00:42:44,440 --> 00:42:48,000 Speaker 2: the point where, you know, like, every car these days 758 00:42:48,000 --> 00:42:50,560 Speaker 2: shouldn't be able to start unless you can blow 759 00:42:50,600 --> 00:42:53,839 Speaker 2: into a breathalyzer. Like, that technology is there. 760 00:42:54,040 --> 00:42:56,320 Speaker 1: Yeah, we're harder on computers than we are on ourselves, 761 00:42:56,400 --> 00:42:56,959 Speaker 1: is what you're saying. 762 00:42:57,000 --> 00:42:59,759 Speaker 2: Huh, well, exactly. So, like, what if all of a sudden 763 00:42:59,840 --> 00:43:02,720 Speaker 2: driverless cars, they prove, like, you know, they can reduce 764 00:43:03,000 --> 00:43:06,960 Speaker 2: total accidents by ninety percent? There would still be people saying, like, 765 00:43:07,239 --> 00:43:10,520 Speaker 2: in those ten percent of cases where someone died, it 766 00:43:10,600 --> 00:43:12,560 Speaker 2: was some AI computer or whatever. 767 00:43:12,719 --> 00:43:13,239 Speaker 1: Yeah. 768 00:43:13,280 --> 00:43:15,080 Speaker 2: So it's just, I don't know, I just find it really 769 00:43:15,120 --> 00:43:18,160 Speaker 2: interesting that we still allow people to get into a 770 00:43:18,200 --> 00:43:21,440 Speaker 2: car after they've been drinking and drive, even though the 771 00:43:21,480 --> 00:43:24,760 Speaker 2: technology exists to stop that from happening. 772 00:43:24,880 --> 00:43:27,440 Speaker 1: Well, yeah, I think that it's a cognitive bias of ours. 773 00:43:27,480 --> 00:43:29,840 Speaker 1: We tend to focus on the more sensational, and the 774 00:43:29,880 --> 00:43:32,720 Speaker 1: more sensational is a car being driven by a computer 775 00:43:32,920 --> 00:43:39,560 Speaker 1: killing somebody rather than a drunk dude killing somebody in his car. Yeah, 776 00:43:39,840 --> 00:43:43,160 Speaker 1: So yeah, just removing people from the equation should 777 00:43:43,200 --> 00:43:48,440 Speaker 1: increase safety. It should also probably decrease pollution 778 00:43:48,760 --> 00:43:50,719 Speaker 1: as a result. There's somebody who came up with the 779 00:43:50,719 --> 00:43:54,000 Speaker 1: eye popping statistic that thirty percent of the traffic in 780 00:43:54,080 --> 00:43:58,560 Speaker 1: metropolitan areas is people circling the block looking for a 781 00:43:58,600 --> 00:44:01,560 Speaker 1: place to park. If you don't own a car and 782 00:44:01,600 --> 00:44:04,160 Speaker 1: you're not driving your own car, that goes away.
So 783 00:44:04,239 --> 00:44:08,760 Speaker 1: thirty percent of traffic goes away instantaneously with that. Yeah, 784 00:44:09,000 --> 00:44:13,080 Speaker 1: I mean, you had me right there. Yeah, yeah. 785 00:44:13,360 --> 00:44:16,839 Speaker 1: So driverless cars are almost certainly coming down the pike, as 786 00:44:16,880 --> 00:44:19,120 Speaker 1: long as AI doesn't take over the world, of course. 787 00:44:19,160 --> 00:44:21,480 Speaker 1: I think we should caveat this entire episode with that: 788 00:44:21,600 --> 00:44:24,560 Speaker 1: all of this is going to happen if AI doesn't 789 00:44:24,560 --> 00:44:25,440 Speaker 1: take over the world. 790 00:44:25,440 --> 00:44:28,319 Speaker 2: Okay, that's right. And we're going to finish up with 791 00:44:28,360 --> 00:44:32,080 Speaker 2: a couple of shorter ones that I think are just 792 00:44:32,600 --> 00:44:37,000 Speaker 2: pretty awesome and interesting. One is the fact that sort 793 00:44:37,000 --> 00:44:39,840 Speaker 2: of the current thinking is that we, we tend to 794 00:44:39,920 --> 00:44:45,480 Speaker 2: tie, like, progress as a nation, definitely in the United States, 795 00:44:45,480 --> 00:44:49,000 Speaker 2: but in most places around the world, to how, like, 796 00:44:49,160 --> 00:44:53,279 Speaker 2: robust an economy is. It's like always tied to finances, 797 00:44:53,680 --> 00:44:56,200 Speaker 2: what kind of progress we're making, and there are 798 00:44:56,239 --> 00:44:59,399 Speaker 2: people that think, like, sort of like with the way 799 00:44:59,440 --> 00:45:02,399 Speaker 2: we're starting to look at animals, like, you know, one 800 00:45:02,480 --> 00:45:04,920 Speaker 2: day that's not going to be the most important factor 801 00:45:04,960 --> 00:45:08,880 Speaker 2: for humans, and things like the health of the earth 802 00:45:09,120 --> 00:45:12,799 Speaker 2: and human beings' health and well-being, both physically and 803 00:45:12,840 --> 00:45:16,840 Speaker 2: mentally, is what you should equate with the success 804 00:45:16,840 --> 00:45:19,279 Speaker 2: of a nation. And one day they're going to look 805 00:45:19,280 --> 00:45:21,600 Speaker 2: back and say, you remember when all that we 806 00:45:21,680 --> 00:45:25,680 Speaker 2: cared about was the fact that the stock market was flush? 807 00:45:25,800 --> 00:45:30,040 Speaker 1: Yeah, because GDP just tells you whether an economy 808 00:45:30,120 --> 00:45:34,319 Speaker 1: is growing or shrinking, right? That's basically all it tells you, 809 00:45:35,280 --> 00:45:37,799 Speaker 1: And it leaves out a lot of stuff, like you said, 810 00:45:37,960 --> 00:45:42,239 Speaker 1: human well-being, things like whether people are dying of 811 00:45:42,320 --> 00:45:46,200 Speaker 1: deaths of despair or whether they're generally happy, how many 812 00:45:46,239 --> 00:45:49,360 Speaker 1: resources are being depleted. Is anybody working on an alternative? 813 00:45:49,440 --> 00:45:53,440 Speaker 1: All of the stuff that creates that growing economy 814 00:45:53,560 --> 00:45:56,680 Speaker 1: just is totally ignored.
Yeah. And I think that's what 815 00:45:56,719 --> 00:46:01,240 Speaker 1: that economist Kate Raworth was saying, is like it's, it's madness. 816 00:46:01,280 --> 00:46:04,759 Speaker 1: Like it's so ridiculous to just completely not count all 817 00:46:04,880 --> 00:46:09,319 Speaker 1: this stuff that really, really counts in favor of just 818 00:46:09,400 --> 00:46:13,560 Speaker 1: this one metric, which is growth or shrinkage. And not 819 00:46:13,600 --> 00:46:17,640 Speaker 1: only is that probably going to be thought of as 820 00:46:17,719 --> 00:46:21,200 Speaker 1: ridiculous in the future. People, younger people today, who are 821 00:46:21,200 --> 00:46:24,480 Speaker 1: becoming adults or who have recently become adults, they already 822 00:46:24,960 --> 00:46:28,279 Speaker 1: tend to think this way as a group. So it's 823 00:46:28,280 --> 00:46:31,720 Speaker 1: a pretty sure indicator that we're going to leave GDP, 824 00:46:31,960 --> 00:46:35,360 Speaker 1: or growth, behind as an indicator of the health of 825 00:46:35,400 --> 00:46:38,520 Speaker 1: an economy and start thinking more about the other stuff, 826 00:46:38,560 --> 00:46:42,080 Speaker 1: the more important stuff. And who knows what can result 827 00:46:42,080 --> 00:46:48,359 Speaker 1: from that, like what great cascading knock-on effects 828 00:46:48,360 --> 00:46:48,919 Speaker 1: that will have. 829 00:46:49,680 --> 00:46:54,960 Speaker 2: Yeah, you found this Princeton University bioethicist named Peter Singer 830 00:46:55,239 --> 00:46:57,520 Speaker 2: who talked about the fact that the circle of concern, 831 00:46:58,239 --> 00:47:03,960 Speaker 2: as humankind advances, is expanding. And that's just a 832 00:47:04,000 --> 00:47:06,080 Speaker 2: wonderful thought. And, you know, you see it in everything 833 00:47:06,120 --> 00:47:08,400 Speaker 2: from the fact that, you know, we've laughed before at, 834 00:47:08,520 --> 00:47:10,440 Speaker 2: like, the Mad Men episode where people used to just 835 00:47:10,760 --> 00:47:14,719 Speaker 2: willingly throw litter on the ground, to, you know, we 836 00:47:14,760 --> 00:47:19,200 Speaker 2: look back at that as barbaric generally. And that's just 837 00:47:19,239 --> 00:47:22,640 Speaker 2: one small example. So as humans are evolving down 838 00:47:22,680 --> 00:47:26,200 Speaker 2: the line, that circle of concern is expanding and people 839 00:47:26,200 --> 00:47:29,399 Speaker 2: are caring about more and more things that they didn't 840 00:47:29,440 --> 00:47:31,320 Speaker 2: care about before, and that's, that's great. 841 00:47:31,480 --> 00:47:33,600 Speaker 1: Yeah. And Peter Singer, by the way, is a very famous 842 00:47:34,160 --> 00:47:38,000 Speaker 1: ethicist as far as animal rights and animal welfare are concerned. 843 00:47:38,840 --> 00:47:41,120 Speaker 1: So yeah, his whole thing is like, we're gonna stop 844 00:47:41,200 --> 00:47:46,480 Speaker 1: focusing on conspicuous consumption, and rather you'll be considered more 845 00:47:46,800 --> 00:47:48,839 Speaker 1: like a great person, not from your wealth, but from 846 00:47:48,840 --> 00:47:51,520 Speaker 1: your charity and your charitable giving, which would be great.
847 00:47:52,280 --> 00:47:55,480 Speaker 1: And then that circle of concern kind of leads us 848 00:47:55,480 --> 00:47:58,960 Speaker 1: to our last one too, because the most recent inclusion 849 00:47:58,960 --> 00:48:01,799 Speaker 1: into the circle of concern is the environment, the earth, 850 00:48:01,880 --> 00:48:06,640 Speaker 1: the health of the earth. And this one is 851 00:48:06,760 --> 00:48:10,359 Speaker 1: just a sure gimme. There is no way that 852 00:48:10,400 --> 00:48:12,759 Speaker 1: we're not going to be looked down upon for this 853 00:48:12,960 --> 00:48:16,080 Speaker 1: by our descendants. So that is burning fossil fuels. 854 00:48:16,480 --> 00:48:20,600 Speaker 2: Yeah, I mean in five hundred years, who knows, maybe sooner, 855 00:48:21,280 --> 00:48:23,359 Speaker 2: it seems like people will definitely look back and say, 856 00:48:23,400 --> 00:48:27,319 Speaker 2: I can't believe that we used to burn fossil fuels 857 00:48:27,360 --> 00:48:29,920 Speaker 2: like we did. And for a lot of reasons, not 858 00:48:30,040 --> 00:48:33,880 Speaker 2: just, you know, the process of removing fossil fuels 859 00:48:33,920 --> 00:48:38,320 Speaker 2: and all that goes into that, or even the climate 860 00:48:38,360 --> 00:48:41,440 Speaker 2: and the ozone, which are all huge concerns obviously, but 861 00:48:41,760 --> 00:48:45,439 Speaker 2: just things like pollution and air quality, and the fact 862 00:48:45,480 --> 00:48:48,920 Speaker 2: that, you know, that kills people and that costs so 863 00:48:49,160 --> 00:48:52,320 Speaker 2: much money in healthcare. I think there was a study 864 00:48:52,719 --> 00:48:56,400 Speaker 2: from the University of Wisconsin-Madison that said if 865 00:48:56,640 --> 00:48:59,800 Speaker 2: we stopped burning fossil fuels altogether, it would prevent 866 00:49:00,480 --> 00:49:03,799 Speaker 2: about fifty thousand premature deaths per year because of air 867 00:49:03,880 --> 00:49:08,360 Speaker 2: quality alone, and save about six hundred billion dollars annually in 868 00:49:08,440 --> 00:49:10,520 Speaker 2: the US alone in healthcare costs. 869 00:49:10,640 --> 00:49:12,840 Speaker 1: Yeah. And I think even more than looking at us 870 00:49:12,880 --> 00:49:15,440 Speaker 1: as, like, dumb-dumbs for ignoring that, we're going to 871 00:49:15,440 --> 00:49:18,520 Speaker 1: be looked at as kind of reviled because of the 872 00:49:18,520 --> 00:49:21,840 Speaker 1: future we'll have delivered our descendants because of the climate 873 00:49:21,920 --> 00:49:23,640 Speaker 1: change we just allowed to happen. 874 00:49:24,080 --> 00:49:24,279 Speaker 2: Yeah. 875 00:49:24,560 --> 00:49:30,240 Speaker 1: I saw a WHO estimate that two hundred and fifty thousand 876 00:49:30,239 --> 00:49:34,799 Speaker 1: additional deaths per year are expected to come each year 877 00:49:35,160 --> 00:49:38,840 Speaker 1: between twenty thirty and twenty fifty because of climate change, 878 00:49:39,080 --> 00:49:44,040 Speaker 1: from things like heat stress, malnutrition, insect-borne diseases. That 879 00:49:44,160 --> 00:49:46,880 Speaker 1: an additional quarter of a million people are going to 880 00:49:46,920 --> 00:49:50,759 Speaker 1: die every year because of climate change starting six years 881 00:49:50,760 --> 00:49:54,000 Speaker 1: from now. That's nuts.
So I can only imagine what 882 00:49:54,080 --> 00:49:56,680 Speaker 1: the people of, you know, twenty one hundred are going 883 00:49:56,719 --> 00:49:59,560 Speaker 1: to think of us. Hopefully they'll have everything under control 884 00:49:59,600 --> 00:50:01,720 Speaker 1: by then, but they're probably going to be pretty ticked 885 00:50:01,719 --> 00:50:03,200 Speaker 1: off that they had to go to the trouble. 886 00:50:03,960 --> 00:50:05,719 Speaker 2: Yeah. I mean, you can see this coming because it 887 00:50:05,760 --> 00:50:09,480 Speaker 2: already happens now, once again, by seeing younger generations already 888 00:50:09,480 --> 00:50:13,719 Speaker 2: looking at previous generations as barbaric in how we treat 889 00:50:13,719 --> 00:50:14,080 Speaker 2: the earth. 890 00:50:14,160 --> 00:50:18,400 Speaker 1: For sure. I saw an RHS Financial estimate or prediction 891 00:50:18,880 --> 00:50:23,400 Speaker 1: that the oil market will collapse this decade that we're 892 00:50:23,440 --> 00:50:26,040 Speaker 1: in, just based on trends, current trends now and the way 893 00:50:26,120 --> 00:50:29,960 Speaker 1: people think now, that probably we won't be using oil 894 00:50:30,600 --> 00:50:33,520 Speaker 1: nearly as much in the next ten, twenty years. 895 00:50:33,760 --> 00:50:34,360 Speaker 2: Very interesting. 896 00:50:34,560 --> 00:50:38,800 Speaker 1: Yeah, the future is interesting, Chuck. And it is the future, 897 00:50:38,920 --> 00:50:40,960 Speaker 1: as a matter of fact. It's almost twenty twenty four, 898 00:50:41,400 --> 00:50:43,440 Speaker 1: and I just want to say happy new year to everybody. 899 00:50:43,520 --> 00:50:47,960 Speaker 2: Huh, that's right. Happy new year, everyone. We thank you 900 00:50:48,040 --> 00:50:50,479 Speaker 2: once again for your support. We say it all the time: 901 00:50:50,719 --> 00:50:52,839 Speaker 2: if there were no you, there would be no us. 902 00:50:53,760 --> 00:50:57,000 Speaker 2: We are always grateful that we are allowed to do 903 00:50:57,040 --> 00:50:58,280 Speaker 2: this job because you listen. 904 00:50:58,480 --> 00:51:01,160 Speaker 1: Yeah, thank you, and happy new year to everyone. Happy 905 00:51:01,160 --> 00:51:05,520 Speaker 1: birthday to you. Me? And happy birthday. Thanks Chuck, and we'll 906 00:51:05,560 --> 00:51:07,640 Speaker 1: see you guys next year. And if you want to 907 00:51:07,640 --> 00:51:09,480 Speaker 1: get in touch with us in the interim, in this 908 00:51:09,719 --> 00:51:12,200 Speaker 1: very short time left in twenty twenty three, you can 909 00:51:12,239 --> 00:51:16,080 Speaker 1: do it via email almost instantaneously. Wrap it up, spank 910 00:51:16,120 --> 00:51:18,279 Speaker 1: it on the bottom, put a sash on it that says 911 00:51:18,320 --> 00:51:21,680 Speaker 1: twenty twenty four, and send it off to Stuff Podcast 912 00:51:22,000 --> 00:51:25,840 Speaker 1: at iHeartRadio dot com. 913 00:51:26,719 --> 00:51:29,600 Speaker 2: Stuff You Should Know is a production of iHeartRadio. For 914 00:51:29,719 --> 00:51:33,880 Speaker 2: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 915 00:51:34,000 --> 00:51:40,440 Speaker 2: or wherever you listen to your favorite shows.