1 00:00:04,240 --> 00:00:06,200 Speaker 1: Hey, and welcome to the Short Stuff. I'm Josh and 2 00:00:06,240 --> 00:00:08,840 Speaker 1: there's Chuck, and Jerry's sitting in for Dave. So that 3 00:00:08,960 --> 00:00:11,600 Speaker 1: makes this Short Stuff, and this is a good one. 4 00:00:11,600 --> 00:00:15,720 Speaker 1: I'm pretty excited about it, Chuck. That's right. Big thanks 5 00:00:15,720 --> 00:00:20,480 Speaker 1: to WebMD, the National Institutes of Health, and then, 6 00:00:20,680 --> 00:00:23,760 Speaker 1: what, Live Science. We found a cool interview from Live Science. Well, 7 00:00:23,800 --> 00:00:25,920 Speaker 1: they had a guest post from a guy named Doctor 8 00:00:26,000 --> 00:00:29,200 Speaker 1: John Grohol. I think that's how you say his name, 9 00:00:29,240 --> 00:00:32,840 Speaker 1: but he's CEO and founder of Psych Central, and Live 10 00:00:32,920 --> 00:00:36,560 Speaker 1: Science apparently said, Doctor Grohol, why don't you come over 11 00:00:36,600 --> 00:00:39,400 Speaker 1: and write about sleeping on it for us? And he 12 00:00:39,479 --> 00:00:42,440 Speaker 1: said sure, and that was one of the sources we used. 13 00:00:42,560 --> 00:00:45,599 Speaker 1: I bet it's Grohol. That's what I said, Grohole. 14 00:00:46,159 --> 00:00:50,479 Speaker 1: No, Grohol, basically roll. Yeah, you're probably right, but I 15 00:00:50,520 --> 00:00:52,760 Speaker 1: mean the extra O between the H and the L 16 00:00:52,880 --> 00:00:55,880 Speaker 1: is what's throwing me off. But maybe the good doctor 17 00:00:55,920 --> 00:00:58,400 Speaker 1: did not say, sure, I'll do it. Maybe he said, oh, 18 00:00:58,480 --> 00:01:00,279 Speaker 1: let me sleep on it and I'll let you know. 19 00:01:00,880 --> 00:01:03,360 Speaker 1: Very nice. What word are we not going to say 20 00:01:03,360 --> 00:01:08,880 Speaker 1: in this episode, Chuck? Don't grow whole?
So, um, it 21 00:01:08,959 --> 00:01:11,400 Speaker 1: is kind of interesting if you stop and think about it. 22 00:01:11,400 --> 00:01:13,840 Speaker 1: I mean, basically all of us have had, if not 23 00:01:13,880 --> 00:01:19,360 Speaker 1: by design, then probably by accident, a period where we 24 00:01:19,360 --> 00:01:23,800 Speaker 1: were facing some dilemma, some big decision, and we slept 25 00:01:23,840 --> 00:01:26,440 Speaker 1: on it. We slept before making the decision, and after 26 00:01:26,600 --> 00:01:29,880 Speaker 1: we slept, it was just way clearer the next day, 27 00:01:30,240 --> 00:01:33,000 Speaker 1: and we probably made the right decision from that point on. 28 00:01:33,520 --> 00:01:36,720 Speaker 1: That's what everybody calls sleeping on the decision, right? But 29 00:01:36,920 --> 00:01:40,320 Speaker 1: why would that actually happen? Like, that's an actual, like, 30 00:01:40,400 --> 00:01:45,120 Speaker 1: a fascinating, amazing component of human life that I just 31 00:01:45,160 --> 00:01:49,080 Speaker 1: think is the tops. Yeah, and here's the thing: 32 00:01:49,120 --> 00:01:52,880 Speaker 1: there's more to it than, like, we all know sleep 33 00:01:53,000 --> 00:01:57,320 Speaker 1: is just great for everything, but there's more to it 34 00:01:57,400 --> 00:02:01,040 Speaker 1: than that. It is also this, like, you know, 35 00:02:01,560 --> 00:02:05,400 Speaker 1: you shouldn't send that email, that angry email, like, sleep 36 00:02:05,440 --> 00:02:09,760 Speaker 1: on that. Sleep on the, maybe sleep on the argument 37 00:02:09,840 --> 00:02:12,839 Speaker 1: that you're in the middle of having with your significant 38 00:02:12,840 --> 00:02:17,120 Speaker 1: other or your friend or whoever. Supposedly, not going to 39 00:02:17,200 --> 00:02:19,680 Speaker 1: bed angry as a rule is not actually a very 40 00:02:19,720 --> 00:02:22,400 Speaker 1: good rule.
We go to bed angry all the time. 41 00:02:23,360 --> 00:02:27,760 Speaker 1: Every night? Not every night, but man, like, you gotta sleep, 42 00:02:27,760 --> 00:02:30,160 Speaker 1: and if you're still angry, what are you gonna do? Right? 43 00:02:30,200 --> 00:02:32,160 Speaker 1: Exactly. Like, wait for the morning. We've got 44 00:02:32,160 --> 00:02:35,360 Speaker 1: to work this out, right? But there is more to 45 00:02:35,440 --> 00:02:38,880 Speaker 1: it than that, because just human sort of intuition would 46 00:02:38,919 --> 00:02:41,760 Speaker 1: tell you that, you know, you get some sleep, you 47 00:02:41,880 --> 00:02:44,920 Speaker 1: clear your head out, you make a better decision in 48 00:02:44,919 --> 00:02:47,000 Speaker 1: the morning. It just makes sense, but it's more 49 00:02:47,080 --> 00:02:50,040 Speaker 1: specific than that, right? Yeah. And we should say that 50 00:02:50,080 --> 00:02:52,960 Speaker 1: this pretty much applies, as far as the research goes, 51 00:02:53,000 --> 00:02:57,520 Speaker 1: to important, tough, complex decisions. Like, if you're at a 52 00:02:57,639 --> 00:02:59,600 Speaker 1: deli and you're like, should I have roast beef or 53 00:02:59,600 --> 00:03:02,560 Speaker 1: pastrami on rye? And you go and curl up 54 00:03:02,560 --> 00:03:04,920 Speaker 1: in the corner for thirty minutes and then wake up 55 00:03:05,200 --> 00:03:07,600 Speaker 1: with your answer. It's actually not going to help. You're 56 00:03:07,600 --> 00:03:10,840 Speaker 1: probably going to have the same answer. It's complicated decisions 57 00:03:10,880 --> 00:03:14,440 Speaker 1: that this works for.
And most people say the reason 58 00:03:14,480 --> 00:03:18,280 Speaker 1: why is because when we sleep, we organize our memories 59 00:03:18,320 --> 00:03:22,920 Speaker 1: and process information during that period, and that we're probably 60 00:03:22,960 --> 00:03:26,440 Speaker 1: taking all this different information, new information, connecting it to 61 00:03:26,560 --> 00:03:30,840 Speaker 1: old information, cementing these neural pathways, so that we understand 62 00:03:30,880 --> 00:03:34,680 Speaker 1: it better after we've slept than we understood it before, 63 00:03:35,080 --> 00:03:38,760 Speaker 1: and that we can gain clarity from sleeping on a decision. 64 00:03:38,800 --> 00:03:41,880 Speaker 1: And that's how that works. Yeah, and they've also found, 65 00:03:42,040 --> 00:03:45,800 Speaker 1: and this comes from Cal Berkeley, from Doctor Matthew Walker, 66 00:03:45,920 --> 00:03:48,680 Speaker 1: who is a sleep scientist, which I'm fascinated by, 67 00:03:48,680 --> 00:03:53,120 Speaker 1: that job, that, you know, getting good sleep helps 68 00:03:53,120 --> 00:03:55,640 Speaker 1: you learn. Like, that's why you probably learn better, like, 69 00:03:55,760 --> 00:03:57,920 Speaker 1: right after you wake up in the morning, in your 70 00:03:57,920 --> 00:04:02,240 Speaker 1: morning classes in school, you're sharper than in those afternoon classes. 71 00:04:02,280 --> 00:04:05,960 Speaker 1: But also that going to sleep after you learn, for 72 00:04:06,040 --> 00:04:10,360 Speaker 1: reasons that you mentioned, is a really big deal, which 73 00:04:10,440 --> 00:04:13,560 Speaker 1: led me to my next topic that I'm going to 74 00:04:13,840 --> 00:04:18,400 Speaker 1: do for a long form episode, which is napping.
Oh okay, 75 00:04:18,760 --> 00:04:20,520 Speaker 1: because I think I told you on our super secret 76 00:04:20,640 --> 00:04:23,520 Speaker 1: research trip that I started taking a daily nap 77 00:04:23,600 --> 00:04:27,560 Speaker 1: last year. Yeah, yeah, and it's been like life changing. 78 00:04:27,600 --> 00:04:30,480 Speaker 1: So I really want to know more about why. So napping, 79 00:04:31,240 --> 00:04:34,719 Speaker 1: and again it has to do, I'm sure in a 80 00:04:34,800 --> 00:04:37,279 Speaker 1: smaller way, with the reason a good night's sleep is good 81 00:04:37,279 --> 00:04:41,680 Speaker 1: for you. I would think so, for sure. So napping 82 00:04:41,760 --> 00:04:45,040 Speaker 1: is coming soon, is what I'm saying. Okay. For me personally, 83 00:04:46,160 --> 00:04:49,159 Speaker 1: right after I record. And then we'll take a break, 84 00:04:49,200 --> 00:04:50,680 Speaker 1: you can take a nap, and then we'll come back. 85 00:04:50,720 --> 00:05:18,719 Speaker 1: How about that? Okay, let's do it. Okay, Chuck, it's 86 00:05:18,760 --> 00:05:21,760 Speaker 1: forty five minutes later. I've been standing here waiting. Well, 87 00:05:21,800 --> 00:05:25,160 Speaker 1: I woke up from my nap. I feel great. Clearly this is an 88 00:05:25,200 --> 00:05:28,960 Speaker 1: even better episode than had you not napped. All right, so 89 00:05:29,000 --> 00:05:32,279 Speaker 1: we're moving on to some even more specifics, which I 90 00:05:32,320 --> 00:05:37,479 Speaker 1: found super, super fascinating about sleeping on it, which is 91 00:05:38,080 --> 00:05:42,440 Speaker 1: the way your conscious mind and your unconscious mind work. Right? Yeah.
92 00:05:42,480 --> 00:05:44,880 Speaker 1: They think that maybe it doesn't really have much to 93 00:05:44,920 --> 00:05:48,520 Speaker 1: do with the actual act of sleeping or cementing memories 94 00:05:48,600 --> 00:05:52,520 Speaker 1: or new information, but that while we're sleeping, we're using 95 00:05:52,640 --> 00:05:56,960 Speaker 1: unconscious thought rather than conscious thought, and that you could actually, 96 00:05:57,240 --> 00:05:59,880 Speaker 1: if this is true, you could do the same thing 97 00:06:00,400 --> 00:06:03,960 Speaker 1: as sleeping on a decision without actually having to 98 00:06:03,960 --> 00:06:07,080 Speaker 1: go to sleep, if you can engage in unconscious thought 99 00:06:07,120 --> 00:06:10,320 Speaker 1: about it. And that sounds hard, but actually I think 100 00:06:10,320 --> 00:06:15,600 Speaker 1: it amounts to just stopping thinking about and chewing 101 00:06:15,640 --> 00:06:18,120 Speaker 1: over the decision and going and doing something else, building 102 00:06:18,120 --> 00:06:20,880 Speaker 1: a model airplane, doing your taxes, just something else, 103 00:06:21,240 --> 00:06:24,760 Speaker 1: because while you're doing that, this, um, the 104 00:06:24,839 --> 00:06:30,919 Speaker 1: guy, Doctor Grohol, points out that 105 00:06:31,200 --> 00:06:34,800 Speaker 1: unconscious thought is, like, not just zoning out, and it's 106 00:06:34,839 --> 00:06:38,200 Speaker 1: not not thinking, it's just a different mode of thinking, 107 00:06:38,839 --> 00:06:42,960 Speaker 1: and that you're still engaging in unconscious thought about 108 00:06:42,960 --> 00:06:44,960 Speaker 1: this problem, this dilemma, it's just not in the front 109 00:06:44,960 --> 00:06:47,280 Speaker 1: of your mind.
And by removing it from the front 110 00:06:47,279 --> 00:06:50,400 Speaker 1: of your mind, putting it into unconscious thought, it seems 111 00:06:50,440 --> 00:06:53,760 Speaker 1: that that actually can produce really good decisions, just like 112 00:06:53,800 --> 00:06:57,320 Speaker 1: sleeping on something can too. Yeah, and it seems like 113 00:06:57,360 --> 00:07:01,200 Speaker 1: one of the big reasons why is, 114 00:07:01,240 --> 00:07:06,600 Speaker 1: and I think this is where overthinking comes in, 115 00:07:06,680 --> 00:07:09,080 Speaker 1: like if you say, oh, you're overthinking it, or if 116 00:07:08,920 --> 00:07:12,360 Speaker 1: you think something to death, that is, 117 00:07:12,680 --> 00:07:16,320 Speaker 1: that is concentrating so hard on something, on a decision, 118 00:07:16,320 --> 00:07:19,480 Speaker 1: that your biases are creeping in. You're thinking about all 119 00:07:19,560 --> 00:07:22,880 Speaker 1: these different angles and you're quite literally just sort of 120 00:07:22,920 --> 00:07:26,720 Speaker 1: overdoing it. And apparently when you're just using your unconscious 121 00:07:26,720 --> 00:07:28,720 Speaker 1: thought and kind of parking it for a second, instead of keeping it in 122 00:07:28,840 --> 00:07:31,840 Speaker 1: the forefront of your brain, those biases will fade away. 123 00:07:32,080 --> 00:07:33,640 Speaker 1: And the way I read it, or at least the 124 00:07:33,640 --> 00:07:35,880 Speaker 1: way it feels like in my brain, because I, you know, 125 00:07:36,040 --> 00:07:38,120 Speaker 1: try to do this stuff some, is it feels like 126 00:07:38,200 --> 00:07:40,160 Speaker 1: it just sort of clears the table a little bit, 127 00:07:41,040 --> 00:07:45,560 Speaker 1: and what's left is what's important.
Yeah, because with 128 00:07:45,680 --> 00:07:49,880 Speaker 1: your biases, with your conscious biases, you might push 129 00:07:49,880 --> 00:07:52,560 Speaker 1: yourself to a bad decision, because that's the one where 130 00:07:52,600 --> 00:07:56,320 Speaker 1: you get, like, a snickerdoodle cookie, you know. Where if 131 00:07:56,360 --> 00:07:59,120 Speaker 1: you engage it, I would do a lot for a 132 00:07:59,160 --> 00:08:02,000 Speaker 1: snickerdoodle cookie, like probably things I should not be 133 00:08:02,040 --> 00:08:07,080 Speaker 1: doing, if somebody offered a snickerdoodle. Right. But if 134 00:08:07,080 --> 00:08:09,800 Speaker 1: you just engage in unconscious thinking, your brain is like, 135 00:08:09,840 --> 00:08:12,720 Speaker 1: snickerdoodle cookie doesn't really matter that much. We're not going 136 00:08:12,760 --> 00:08:15,760 Speaker 1: to use that to weight the different factors in the decision. 137 00:08:16,560 --> 00:08:19,720 Speaker 1: These are all going to be fairly balanced out because 138 00:08:19,760 --> 00:08:22,800 Speaker 1: the biases are removed. So you can essentially step back, 139 00:08:23,160 --> 00:08:25,440 Speaker 1: look at all the different components, all the different possible 140 00:08:25,480 --> 00:08:29,360 Speaker 1: outcomes, pretty much equally, on an equal basis, and then say, 141 00:08:29,440 --> 00:08:32,160 Speaker 1: this is the one that's obviously the right decision to make. 142 00:08:32,679 --> 00:08:35,240 Speaker 1: Even though I'm not getting a snickerdoodle cookie, I can 143 00:08:35,280 --> 00:08:38,160 Speaker 1: go buy a snickerdoodle cookie and still make this correct 144 00:08:38,200 --> 00:08:40,240 Speaker 1: decision and have the best of both worlds. I could 145 00:08:40,240 --> 00:08:43,240 Speaker 1: have my snickerdoodle and eat it too. Oh man, 146 00:08:43,640 --> 00:08:46,280 Speaker 1: I knew that's what you were gonna say.
Did you? Sure, 147 00:08:46,720 --> 00:08:49,600 Speaker 1: you know me, you know me. Man, the wheels 148 00:08:49,600 --> 00:08:51,600 Speaker 1: are starting to fall off of this brain, I'll tell 149 00:08:51,640 --> 00:08:54,839 Speaker 1: you that. There has been some study on this as 150 00:08:54,840 --> 00:08:57,160 Speaker 1: far as sleeping on it. There was that one that 151 00:08:57,320 --> 00:09:00,320 Speaker 1: I guess you found. Was this from the NIH 152 00:09:00,440 --> 00:09:04,160 Speaker 1: or was this WebMD? It was Doctor Grohol that 153 00:09:04,240 --> 00:09:06,480 Speaker 1: cited this one. Okay, let me see if I 154 00:09:06,520 --> 00:09:13,080 Speaker 1: can hazard this name. Dister Hughes. Dijksterhuis. I think I 155 00:09:13,200 --> 00:09:16,520 Speaker 1: nailed it that last time. Dijksterhuis, D i j k 156 00:09:17,000 --> 00:09:20,400 Speaker 1: s t e r h u i s. And I'll 157 00:09:20,440 --> 00:09:24,079 Speaker 1: bet during college everybody called him the Dixter. Oh sure, 158 00:09:24,280 --> 00:09:27,559 Speaker 1: at parties. Yeah, yeah, I don't know how you pronounce that. 159 00:09:27,559 --> 00:09:30,839 Speaker 1: That sounds about right. Now, Dijksterhuis. So this was an 160 00:09:30,840 --> 00:09:35,439 Speaker 1: experiment where they would try to get results from having, 161 00:09:35,559 --> 00:09:39,160 Speaker 1: you know, people sleep on a big decision, or just 162 00:09:39,280 --> 00:09:42,440 Speaker 1: a decision that's, um. I think in this case they 163 00:09:42,440 --> 00:09:45,960 Speaker 1: were talking about like choosing apartments, yeah, and like where 164 00:09:46,000 --> 00:09:50,600 Speaker 1: to live, I guess. So they would choose these participants, 165 00:09:51,080 --> 00:09:54,160 Speaker 1: they would have a few apartments to choose from.
They 166 00:09:54,200 --> 00:09:57,559 Speaker 1: would describe what's going on in these apartments, of course, 167 00:09:57,679 --> 00:10:01,320 Speaker 1: like you would with any movie real estate decision, like, 168 00:10:01,400 --> 00:10:04,640 Speaker 1: it's stucco, this one has an extra half bath. That 169 00:10:04,679 --> 00:10:07,160 Speaker 1: one has a chalk body outline. That's right, I want 170 00:10:07,160 --> 00:10:10,360 Speaker 1: to avoid that one, unless that's your thing. Uh. And 171 00:10:11,520 --> 00:10:14,360 Speaker 1: after reading these descriptions, they were asked to make their 172 00:10:14,440 --> 00:10:18,440 Speaker 1: choice, um, following an additional period of conscious thought or 173 00:10:18,559 --> 00:10:23,360 Speaker 1: unconscious thought. And as you would expect, the unconscious thinkers, 174 00:10:23,840 --> 00:10:25,680 Speaker 1: well, this is the only part I wonder about, 175 00:10:25,720 --> 00:10:30,040 Speaker 1: made better decisions than the conscious thinkers. But 176 00:10:30,400 --> 00:10:33,520 Speaker 1: what's the better decision? Like, would they be like, well, 177 00:10:33,520 --> 00:10:36,560 Speaker 1: he chose the wrong place, clearly. Yeah, now you'd have 178 00:10:36,640 --> 00:10:39,360 Speaker 1: to actually go live in there in reality, because that 179 00:10:39,440 --> 00:10:41,920 Speaker 1: part is subjective, right, and whether or not that was 180 00:10:41,960 --> 00:10:44,160 Speaker 1: the right choice. Yeah, I don't know what 181 00:10:44,200 --> 00:10:48,040 Speaker 1: the criteria was, but clearly something was more desirable than 182 00:10:48,400 --> 00:10:54,640 Speaker 1: others in the apartments, because maybe the chalk outline probably was.
183 00:10:55,160 --> 00:10:58,360 Speaker 1: But the thing that I saw there was, like, some 184 00:10:58,400 --> 00:11:02,760 Speaker 1: other follow-up, um, studies or subsections to the study, 185 00:11:03,240 --> 00:11:06,880 Speaker 1: and they found that experts as well are subject to 186 00:11:06,880 --> 00:11:08,840 Speaker 1: the same thing. Like, you could take an expert and 187 00:11:08,920 --> 00:11:12,120 Speaker 1: ask them to make a snap decision about something, and 188 00:11:12,320 --> 00:11:14,840 Speaker 1: they're probably going to make a worse decision than a 189 00:11:14,960 --> 00:11:17,360 Speaker 1: non-expert who has had time to sleep on it 190 00:11:17,480 --> 00:11:21,560 Speaker 1: or engage in unconscious thought about that decision. Interesting. And 191 00:11:21,640 --> 00:11:24,720 Speaker 1: you know what, that tracks, because I think one of 192 00:11:24,760 --> 00:11:29,400 Speaker 1: the traits of what you would call highly successful people 193 00:11:30,520 --> 00:11:33,679 Speaker 1: is to be able to make the right decision quickly, 194 00:11:34,080 --> 00:11:39,679 Speaker 1: and even under duress. Yes, but that is asking for 195 00:11:39,760 --> 00:11:42,719 Speaker 1: a lot. And a lot of people say one 196 00:11:42,760 --> 00:11:45,400 Speaker 1: of the reasons why sleeping on it, or engaging in 197 00:11:45,600 --> 00:11:49,800 Speaker 1: unconscious thought, works is that just by virtue of stepping away from 198 00:11:49,800 --> 00:11:55,160 Speaker 1: the problem, you are relinquishing the stress of immediacy. You're saying, 199 00:11:55,400 --> 00:11:58,240 Speaker 1: I'm not going to make this decision under duress.
I'm 200 00:11:58,280 --> 00:12:00,760 Speaker 1: going to give myself some time and step away from it, 201 00:12:01,040 --> 00:12:04,640 Speaker 1: and you are just automatically taking yourself out of a 202 00:12:04,679 --> 00:12:07,640 Speaker 1: stressful situation, and then by virtue of that alone, you're 203 00:12:07,640 --> 00:12:09,960 Speaker 1: probably going to make a better decision. So I think 204 00:12:09,960 --> 00:12:12,120 Speaker 1: the upshot is, unless someone has a gun to 205 00:12:12,200 --> 00:12:14,720 Speaker 1: your head and is telling you to hack into some mainframe, 206 00:12:15,760 --> 00:12:19,360 Speaker 1: you can probably step away from the decision. You're probably 207 00:12:19,520 --> 00:12:23,440 Speaker 1: putting yourself under undue pressure, and the more you 208 00:12:23,480 --> 00:12:25,080 Speaker 1: step away, the more likely you are to 209 00:12:25,120 --> 00:12:27,160 Speaker 1: make a better decision than if you just make 210 00:12:27,200 --> 00:12:30,800 Speaker 1: a snap one under duress. The only thing I'm going 211 00:12:30,880 --> 00:12:34,640 Speaker 1: to disagree with there is that people can sort 212 00:12:34,679 --> 00:12:38,520 Speaker 1: of ask for, and are afforded, that luxury, because 213 00:12:38,559 --> 00:12:42,920 Speaker 1: I think far too many jobs, when they don't need 214 00:12:42,960 --> 00:12:48,000 Speaker 1: to, require immediacy of action. That, you know, if you're 215 00:12:48,000 --> 00:12:50,200 Speaker 1: like, can I sleep on it? They're like, this isn't 216 00:12:50,200 --> 00:12:53,680 Speaker 1: the job for you, then. So, like, give me an example. Oh, 217 00:12:53,720 --> 00:12:57,319 Speaker 1: I mean any big, high stakes, sort of high pressure 218 00:12:58,640 --> 00:13:01,080 Speaker 1: upper management stuff. I don't feel like there's a lot 219 00:13:01,120 --> 00:13:03,560 Speaker 1: of those people that are.
I think the ones that 220 00:13:03,559 --> 00:13:07,360 Speaker 1: are maybe truly successful are probably drawing those boundaries and 221 00:13:07,400 --> 00:13:09,319 Speaker 1: saying, like, no, we should take our time with this decision. 222 00:13:09,400 --> 00:13:11,400 Speaker 1: But sure, I feel like a lot of those are like, no, 223 00:13:11,480 --> 00:13:13,040 Speaker 1: we need to figure this out and 224 00:13:13,080 --> 00:13:16,640 Speaker 1: act on it. That's probably true. I'm not disagreeing with that, 225 00:13:16,720 --> 00:13:19,720 Speaker 1: but I'll bet that that means that the decisions 226 00:13:19,760 --> 00:13:23,079 Speaker 1: that are being made at these higher echelons are probably 227 00:13:23,160 --> 00:13:27,920 Speaker 1: routinely bad decisions. Well, there may be something to that, if you think about it, 228 00:13:28,040 --> 00:13:31,600 Speaker 1: you know. Yeah, I want my CEOs to 229 00:13:31,640 --> 00:13:35,280 Speaker 1: go sleep on things. That's my game. Yeah, 230 00:13:35,280 --> 00:13:37,240 Speaker 1: we should start a new corporation called Sleep on It. 231 00:13:37,720 --> 00:13:40,040 Speaker 1: That's a great idea. We'll have to edit that out 232 00:13:40,040 --> 00:13:44,320 Speaker 1: so nobody steals our idea. It's probably out there already. Okay, well, 233 00:13:44,040 --> 00:13:48,559 Speaker 1: we'll look, we'll do a business search. Okay, well, 234 00:13:48,559 --> 00:13:51,080 Speaker 1: while we're doing a business search, everybody, this Short Stuff 235 00:13:51,120 --> 00:13:57,200 Speaker 1: is out. Stuff You Should Know is a production of iHeartRadio. 236 00:13:57,720 --> 00:14:00,280 Speaker 1: For more podcasts from iHeartRadio, visit the iHeart 237 00:14:00,360 --> 00:14:03,240 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 238 00:14:03,240 --> 00:14:04,000 Speaker 1: favorite shows.