Speaker 1: Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you? Welcome to Stuff You Should Know, from HowStuffWorks.com.

Speaker 1: Hey, and welcome to the podcast. I'm Josh Clark, with me as always is Charles W. "Chuck" Bryant, and that makes this Stuff You Should Know. That's right. Yes, it is, and not the imitators. I wonder how many times I've said that, "that makes this Stuff You Should Know." No, just the whole spiel, the whole opening: "Hey, and welcome to the podcast." Well, you've said it about two hundred and seventy times, I think. Luckily we have them all saved, and we could count. We do. I don't know if it's lucky, though, Chuck. That's a lot of shows, dude. We should do something special for three hundred. That's, like, a lot of shows. It is. Okay, well, do you think maybe we could get some cake around here or something? Shrimp cocktail, for the love of Pete. No, I'm allergic to shrimp now, remember. I know. But to let everyone out there know: actually, I had a shrimp wonton the other day and nothing happened. I had a wonton with shrimp and nothing happened. It was just tiny little bits of shrimp, and I don't know, either that or I'm getting stronger. Maybe so. Superhuman, you might say. Transhuman.

Speaker 1: Speaking of human, Chuck, there is a recent study that came out in part from one of our universities here in the city, Emory, right down the street. Great school. There's been this problem that's been plaguing researchers for a really long time, and that is: at the beginning of the Lower Paleolithic period, which is about two point seven million years ago, we started using sharp rocks as bashing and cutting tools. So we figured that out. Okay, you can take a rock, and that's technology. That's not nothing, that's technology. Yes. Okay, you can take this rock and you can use it to open a coconut, or the head of someone who's wrong. You're using an implement to complete a task. Well, specifically sharp rocks.
Speaker 1: Okay. It took two million years, to the end of the Lower Paleolithic period, before we figured out that we could actually attach handles to these things. Think about how long that took. Yes. And this has baffled scientists: how could it possibly have taken two million years to go from using your hand to attaching a stick? It doesn't make any sense. Well, they were dumb back then. Well, "dumb" is close to it. They literally were lacking the region of the brain needed, apparently, according to this new study. Basically, we developed a region in the right hemisphere, specifically the supramarginal gyrus, which allowed us to go, hey, let's put a handle on this. And after we did that, we moved out of Africa and started colonizing the rest of the world. So they've pinpointed the region of the brain that is specific to innovation. To, specifically, stone toolmaking. Okay, I thought you meant innovation in general, like that's where your ideas come from. No. Give me a second, I'll ramp up to it. Shoot, did I ruin it? It's okay.

Speaker 1: So we go from "I can't figure out how to attach a handle to a sharp rock" to, two million years later, we figure that out, we leave Africa, we start colonizing the rest of the world, and all of a sudden things start entering light speed, right. And it seems like over the last couple hundred years, especially since the Industrial Revolution, our ability to innovate, to grasp new ideas, to understand the world around us, has just been hitting this hyper speed. And a lot of people wonder if we've reached a point where all the ideas, all the good ones at least, have already been discovered, where we understand how everything works, and it's really just a matter of figuring out how to dot the i's and cross the t's. Right. There was actually a guy who famously said, in 1899, a guy named Charles Duell, I love this quote, he was the Commissioner of the Patent Office.
Speaker 1: That's attributed to him, I should say. But he said something like, "Everything that can be invented has already been invented." And he said this in a memo, basically saying you should go ahead and shut down the Patent Office. He clearly had never considered the Snuggie, Josh, or anything that's been invented since.

Speaker 1: So here's what I'm going to say. I'm going to go ahead and give you my summation early on, okay? I think people think, at various times in history, that they've plateaued, and then things happen. People come along, innovators, and they reach new heights, and they go, oh, well, we didn't know that, and there are new ideas. Right. It almost displays a shameful lack of historical awareness to say we've reached the end of all of our good ideas. It's just asking to be made a fool of. Yeah. Or maybe people do that on purpose, to goad the innovators and say, oh, yeah, right, using reverse psychology. You know that's how innovation works. Yeah, you might as well just give up. Reverse psychology drives innovation.

Speaker 1: There are people, though, who say that real technological innovation has been stalled for quite a while. Yes. After the nineties computer revolution, everything since then has kind of been packaging: better-looking cases and sleeker designs. It's all design oriented. It is, or marketing oriented. These two guys, Cedric Lagare and Eric Virdo, both with SKEMA Business School, basically say: smartphones, yes, they seem incredibly new and cutting edge, but really they're just the packaging of several already extant technologies into a really sharp-looking handheld device. But that's still a new idea. I would argue it is still a new idea.
Speaker 1: But I think their point is, before the late nineties, and before the eighties, let's say, with computers, but especially the telecom boom of the late nineties, this stuff wasn't around. It's not true innovation, right; it's kind of repurposing, and, as you were saying, like the cosmetic changes to a computer. One of the reasons they believe this is going on is because we've come to a point in the computer revolution, I think, Chuck, where you can still make tons of cash just by changing the casing of a CPU. Yeah, there's like no money in innovation, basically, is what I got from this one article: innovation costs more than it's worth when you can just repackage what you've got in a sleeker design and people buy it up. Exactly.

Speaker 1: These two authors predict that we're going to have two trends that will drive innovation, I guess, currently. Right, yes. Consolidation, where basically, and I think they're talking just about computers here. Yeah, because they're saying the big hardware firms are going to consolidate all of the smaller hardware firms, to where there will just basically be the Big Three or Five, and that will leave it to the software firms to compete and innovate, so we'll see more innovation on the software side rather than the hardware side. And they're also saying that the green boom is going to drive innovation. That makes sense, like coming up with sustainable packaging or sustainable solutions. Yeah, totally.

Speaker 1: One of the other things they pointed out that I thought was interesting: they said the tech refresh cycle is too short right now. So what's happening is, they'll say, you like your CD? Well, you're going to love the Super Audio CD. Do you like your DVD? You're going to love Blu-ray. But guess what's coming up after Blu-ray? It's going to be, like, Super Blu-ray.
Speaker 1: It's happening so fast that people aren't abandoning their current systems. They're just like, you know what, I'm going to hold on, because I don't want to be the guy stuck with the LaserDisc player in a couple of years. So all of a sudden the same thing happens: no one's buying it, so it's not worth as much money, which means nobody's putting any effort or money into it. So innovation ceases. Right.

Speaker 1: And there's a guy named Edmund Phelps, who's a professor of political economy at Columbia University, right, and he's basically saying the same thing. He's saying that there's not enough money going toward innovation. But rather than the onus being put on consumers not buying Blu-rays for fear of looking like LaserDisc jerks, it's actually government and big business that's not pouring money into small innovators. Yeah, he said that innovation is the only thing not subsidized by the United States government, which he says is actually a tax, in a way, because it's not being subsidized. That's a reach. You could definitely... yeah, I think a lot of this guy's points are a reach. But what he's suggesting is, if the government isn't pouring money into big business so that big business can pour money into, I guess, small venture firms, then these people who are in their garages aren't going to take risks. They're not going to innovate. There's no incentive. Right.

Speaker 1: I disagree with this. I dispute it, because he's saying that the people who do work in their garages, you know, the Steve Jobses and Bill Gateses of the seventies, were driven by this lust for money. Exactly. And I think that's wrong. I think people innovate first and foremost to get an idea out of their head and birthed into reality. Right. I'm glad you said this, because I completely agree. Regardless of what you think of Facebook, Mark Zuckerberg didn't invent Facebook to make gobs of money. He invented it to make real friends.
Speaker 1: Yeah, to innovate. And that's the point you made: these people in the garage, the true innovators, don't care if they have two pennies to rub together. They're still going to be trying to innovate and make a name for themselves and come up with something awesome. Right. And now there are people out there who are trying to innovate for, you know, the riches. Sure. The guy who invented the Snuggie wasn't in his garage going, I've just got to get this out or else I'm never going to sleep. Yeah, those are the people looking for the next get-rich-quick thing. But I think you can also make the point that when you introduce money to innovation, it leads to actual stagnation, because when you introduce money, there's now something to lose, and people are less willing to take risks, and the willingness to take risk is one of the driving forces of innovation. You know?

Speaker 1: Yeah. Well, Phelps had a good idea, and this will never happen, of course, because it's a good idea: to create the First National Bank of Innovation. All capitalized. Capitalized; not all caps, but each word is capitalized. He should do it in all caps with exclamation points. But basically, it would be a bank that you could go to and partner with, as a startup company, for financing, and get, I would guess, some sort of low-interest loans to spur innovation. Right. That was a great idea. It is. It is a good idea, and this does happen in the real world, and the government does pour money into innovation. He's not exactly correct in that sense. And I also kind of resented that he placed big business in between, you know, the people in their garage innovating and the government subsidies, that we have to have big business take the money, skim a little off the top, and then give it to this guy in the garage.
Speaker 1: He's drawing broad strokes here, for sure. There are government programs, and we'll talk about one from the National Institutes of Health, where the government says, hey, you have a really good idea, Mr. or Ms. Research Scientist, and we're going to give you enough money to survive for three years. Yeah, because the deal is you can always get grants if, you know, you put together a nice package. But this program with the NIH, what's it called, the New Innovator Award? The Director's New Innovator Award. This is intended for people who have such a good idea, but it's so new that they don't have the data to write a grant where people would say, it looks like you're onto something here. So they're sort of throwing money at stuff and saying, you know, you're the dude in the garage and we believe in this idea; go see what you can find out. Right, and we're keeping big business out of the way. Yes. But now the NIH owns you for the rest of your career. Yeah.

Speaker 1: So let's talk about them. There are three people at UCLA who got these grants recently, and they're up to some interesting, one could say innovative, stuff. Right. They have some good ideas, hugely innovative, about how to approach problems. Like professor Dino Di Carlo. All of these people are younger than us, I think, by the way. Dino Di Carlo is working on ways to basically apply heat or pressure or chemicals to very specific sites in cells using nanoparticles and magnets, which is tough. Sounds like a winning idea to me. Basically, one of the big problems we have is with engineering cells to do specific things, like, I don't know, attack other cells for fun.
Speaker 1: Tell me that wouldn't be, like, a big Christmas gift this year, if you could make cells fight with one another under the microscope. Normally you have to try to engineer the cell, you know, time after time after time, and basically program it to do what you want it to do. What Di Carlo has come up with is a way to use very tiny magnets and even tinier nanoparticles that can basically... my brain is so small... when you move the magnet with a joystick, it attracts the nanoparticles in a certain direction, and you can have the nanoparticles apply heat or pressure or a specific chemical to a specific site on a cell and direct it to go attack another cell for your pleasure. That's awesome. Your amusement. So one point five mill goes to Di Carlo, and for a good reason. And for a good reason.

Speaker 1: One of the other winners was Hu Wang, who came up with, basically, I'm going to break this down easy: instead of saying, let me come up with a cure for cancer, Hu Wang said, let me come up with a way to detect cancer so early, way earlier than we've ever detected it before, that we can stop it in its tracks, essentially curing cancer. And he's doing this, actually, I don't know, she's doing this, through a nanomaterial called graphene that is just one atom thick. Yes. Graphene is, like, this clearly-not-of-this-world material. It's literally a carbon atom thick. That's it. But it can act as a biological sensor to tell you when cells aren't doing the things they should be doing. So, did you know a gram of this stuff, flattened out, covers a football field? Wow. It's ultralight. That is thin, my friend. It's one atom thin.
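As an aside, that football-field factoid is easy to sanity-check with a rough back-of-the-envelope sketch. The only assumed input is graphene's commonly cited areal density of about 0.77 milligrams per square meter; the arithmetic puts one gram at roughly a quarter of an American football field, so the claim is right to within an order of magnitude.

```python
# Back-of-the-envelope check of the "a gram of graphene covers a football field" factoid.
# Assumption: graphene's areal density is ~0.77 mg/m^2 (the commonly cited figure).

GRAPHENE_AREAL_DENSITY_MG_PER_M2 = 0.77

# American football field including end zones: 120 yd x 53.3 yd, converted to meters.
FIELD_AREA_M2 = (120 * 0.9144) * (53.3 * 0.9144)  # ~5,350 m^2

grams = 1.0
covered_m2 = grams * 1000 / GRAPHENE_AREAL_DENSITY_MG_PER_M2  # mg -> m^2

print(f"1 g of graphene covers ~{covered_m2:,.0f} m^2")
print(f"That's ~{covered_m2 / FIELD_AREA_M2:.2f} football fields")
# ~1,300 m^2, i.e. roughly a quarter of a field: same order of magnitude as the claim.
```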
Speaker 1: So one point five mill to Hu Wang. Right. Well, did you explain how it works? Let me try my hand at this. So basically, what you do is you put a graphene transistor in a cell, and when these biological markers, say histones or something like that, start to accumulate, they're attracted to the graphene. And these biological markers, by the way, we've found are correlated with the growth of cancer, the origin of cancer; that's where it's starting. And when some of these markers are attracted to the graphene, they create an electrical charge that we can sense. And the graphene is so thin but so highly conductive that with just a couple of these molecules attaching to the graphene, we would be able to detect it and be like, oh... right, we'd be like, oh, crap, you have cancer, and we'd cure it right then. Wow. Yeah, that's awesome. Yeah, and it's a good way to approach a cure for cancer, if you ask me. Did I explain that well? I think so.
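To make that sensing idea concrete, here is a minimal toy simulation, not Wang's actual device: a graphene field-effect sensor's conductance shifts by a small step each time a charged marker molecule binds, and because the baseline noise is so low, even a handful of binding events pushes the signal past a detection threshold. All numbers and names below are hypothetical, chosen only for illustration.

```python
# Toy model of a graphene field-effect biosensor (illustrative only; all values hypothetical).
# Idea from the episode: bound charged biomarkers shift the sheet's conductance, and
# graphene's high conductivity / low noise makes a few binding events detectable.
import random

random.seed(42)

BASELINE_S = 1.0e-3        # hypothetical baseline conductance, siemens
STEP_PER_MOLECULE = 5e-6   # hypothetical conductance shift per bound marker
NOISE_SIGMA = 1e-6         # hypothetical RMS noise on each reading
THRESHOLD = BASELINE_S + 5 * NOISE_SIGMA  # flag readings 5 sigma above baseline

def read_conductance(bound_molecules: int) -> float:
    """One noisy sensor reading given how many markers are currently bound."""
    return (BASELINE_S
            + bound_molecules * STEP_PER_MOLECULE
            + random.gauss(0.0, NOISE_SIGMA))

# Simulate markers accumulating one at a time and report the first detection.
for bound in range(10):
    g = read_conductance(bound)
    if g > THRESHOLD:
        print(f"detected after ~{bound} bound molecules (G = {g:.3e} S)")
        break
else:
    print("no detection")
```

With these made-up numbers, detection fires after only one or two binding events, which is the point of the anecdote: a couple of molecules is enough.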
Speaker 1: I think the last winner this year was Jin Hyung Lee, and Jin is trying to debug the brain circuit. You know we have the wonder machine, which is our favorite thing in the world, fMRI, which measures blood and oxygen levels in the brain. It tells you these areas light up; they're called BOLD signals, blood-oxygen-level-dependent, and they light up to correspond to certain brain activity, right. And we've talked about this before: you're seeing that there's more oxygen going to that part of the brain. So we've assumed, and this is the basis of fMRI, that if a region has more oxygen being delivered to it, that must mean that region of the brain is active. When you show somebody a picture of, you know, their kid being carried away in a van, you know, that's the fear region right there. That doesn't really say anything, though. It doesn't. It's just showing, okay, there's more oxygen in this region, right?

Speaker 1: What Jin Hyung Lee is looking at is how, or what specifically on the neuronal level, is being activated, right. And she's using optogenetics, so it's going to be called ofMRI. And that's beyond even what we thought was the wonder machine. So this is the super-duper wonder machine. Basically, she's using light to allow genetically specified neurons to be activated. Right. You know, one of our listeners at Emory has been harping on us to do one on optogenetics for a while. We should get this person in here. This is probably the closest it's ever going to come. Well, it's a great idea, obviously, because Jin Hyung Lee won one of the Innovator Awards as well. Yes. And they give these out every year, so they clearly believe that we're not out of good ideas. Excellent point, Chuck. No, we're not out of good ideas. So yes, Chuck, you picked those out. You found those guys. All right, well, I didn't personally find them. You're like, these guys should get the money. There are very good ideas out there, right.

Speaker 1: But there is a debate raging in science about whether these ideas, like optogenetics, or using graphene or nanoparticles to detect cancer, are variations on a theme. Are they applying cosmetic changes to a computer rather than really creating new parts for it? Right. And basically the question is, are there any more major discoveries for us to make? Remember, I've always said we have the pieces on the table; now we just have to put them together. Is that the point that we're at? You said we were. I did, and then we started researching this, and I'm like, I wonder. I think I still do believe that. But within that, there's so much that it's, to me, a little bit like splitting hairs.
Speaker 1: Well, but you're absolutely right, especially when you throw in the word "discovery." Discovery indicates something that's already out there; we just figure it out or stumble upon it. An idea, on the other hand, kind of... an invention... yeah, an idea leads to an invention. It's something we've created, like technology. So let's talk about discovery. Right. We have a lot of problems still facing us in how we understand the universe, like human consciousness. How do brain cells create our understanding of the world, what we see as reality? How is that possible? And can we figure everything out?

Speaker 1: Well, that's the big question. Like I said, there's a lot of debate about whether or not we will ever be able to figure everything out, or if the human brain simply isn't programmed to understand the world fully. There's a guy who's a physicist, his name is Russell Stannard, and he's written this book called The End of Discovery. Basically, he says that we're in, quote, a transient age of human development, where we're past the point where we figured out you can put a handle on a rock and make it an axe, but we're right before the point where we can no longer make discoveries. Not because we've understood everything or figured everything out, but because we've reached the limits of what is knowable for the human brain. But even then, look at that part of the right hemisphere that developed and allowed us to put the axe handle on, right? Who's to say we won't reach that point where we can't know anything more, and then we evolve even further and all of a sudden we're even better at understanding our world?
Speaker 1: Right. But will we eventually come to a point where humans understand everything and there is no more discovery to make? I say no, because he points out in here, and I think this is very valid, that in the mid-nineteenth... the nineteenth century, I'm sorry, a lot of people in science said, you know, we've kind of debunked religion and philosophy and all these things with scientific discovery. But he points out, and I agree, that even if you figure out all the problems of science, which will never happen, there's still human life and consciousness and the subjectivity of what goes on inside a person's head. You're never going to solve that. That's not solvable, right? That's what I argue. That's subjectivism. Yeah.

Speaker 1: Well, I guess I agree with you. There's this aspect of the universe that Kant called the noumenon. "Noumenon," a word specifically tailored for my thick tongue. But basically, the noumenon is the thing in itself, right? It's the objective, the objective universe, and we don't interact with that. Everything we know and understand is subjective, and this is where subjectivism is based: basically, we can never fully know anything, and we certainly won't ever know everything, because one thing will always be elusive, which is what you see. My reality is different from your reality. Exactly. And there's an extreme version of it called solipsism. Right, yes. And solipsism is this extreme version of subjectivism that basically says everything is so subjective that I can't fully verify that you exist. The only thing I know exists is my reality; all of you may be made up. I may be totally, completely out of my mind and actually in a padded cell right now, and none of you are really real. Well, that sort of touches on the whole quantum mechanics thing, right? Don't you think? Please.
Speaker 1: Well, I mean, I don't have a whole lot to say about it because we've covered it, but it definitely is along the same lines, I think. Well, yeah, there's an interpretation of quantum mechanics that basically says everything we know about the universe we know through observation, but once you observe it, it changes. That's part of it. And when we observe, we gain information, right, but we can't observe everything at once. So all we know exists in our reality, for sure, is what we're observing. Everything else, like what's going on out there in the office right now, doesn't exist, because we're not there to observe it. Mind-blowing. Once again, it is mind-blowing. But we say all this not just to, you know, rock out to Floyd, but because this is what science is up against. This isn't just gibberish. This isn't just philosophical gibberish, as much as science would like it to be. There is a true problem with the fact that subjectivity, not objectivity, is how we interact with our universe, even though science is supposed to be based exclusively on objectivity. Right.

Speaker 1: Well, Stephen Hawking, you might have heard of him, and another dude named Leonard Mlodinow. "Mlodinow" is how I'm going to pronounce it; there's a silent M in there somewhere. They have a new book called The Grand Design, and they are now saying, I think scientists used to say we're going to find the theory of everything; now they're saying, you know what, we're probably not going to find the theory of everything. It's probably going to be more like, quote, a family of interconnected theories, each of which describes reality under very specific conditions.
Speaker 1: And this is kind of huge for Stephen Hawking, because he's long been a big supporter of the theory of everything, which takes the standard model of physics, includes gravity, which has always been elusive, and then marries it with quantum mechanics to explain everything. That's the theory of everything: one theory that explains everything. Right, like that surfer guy. Exactly, Garrett Lisi, I think. Yeah, that's it. And you know, it's going to be years before he's shown to be correct or incorrect. But Hawking's saying that's probably not going to be the case; there are too many different variables that don't fit together. But the thing that really scares a physicist, that will scare any physicist, is this: those models that we've come up with, are they how the universe actually works, or just how we look at the universe and see how it works? You see what I'm saying? There's that subjectivism again. It can't be whipped.

Speaker 1: Well, and all the things that we've said over the years that we have found to be true, are those even true? Or are the conclusions we're reaching just based on years of compiled thought that may not have been true to begin with? So, like, we arrive at reality by consensus. Yeah, but was that consensus even accurate along the way? Not necessarily. It's been shown time and time again that it hasn't been accurate, through these five revolutions, as V.S. Ramachandran puts them. Copernicus was the first one, who said that Earth is not the center of the universe. Darwinism. Dar... very good, Chuck. Darwin says, hey, we're actually just a bunch of apes. DNA. Freud. Freud, saying we are actually driven by desires that we can't control and aren't really aware of. Then DNA, which is... I think James Watson, who found DNA along with Francis Crick, said, quote, "There are only molecules. Everything else is sociology." I love that quote, man. It's one of my favorites.
Speaker 1: And then the fifth revolution, the neuroscience revolution: that everything, all of our understanding, movements, and experiences, is nothing but neuronal transmissions, electrochemical impulses. Right. So there's not even sociology; even that is just based on firing neurons. Right. That's where we're at right now. That's why I say I think we have everything on the table; we just haven't put it together. But it's entirely possible, historically speaking, to say, well, we thought that before. And what revolution is next? Will the next revolution get us over the wall of subjectivism, or will that be the wall we always run into? This was a good one. Well, I was worried about this one, but it came out pretty good in the end, I think. So, yeah. Don't you like it when we pat ourselves on the back at the end of the show? I think this one deserves it, man.

Speaker 1: Well, so, from Blu-rays to carbons. And at the end of the day, Josh and Chuck say we are not out of new ideas. Can I speak for you? Go ahead. We're not out of new ideas. And just when you think you're out of new ideas, just when you think you've plateaued, up comes a Hu Wang to say, no, no, no, no, there are new ideas, and here's one; now give me the cash. Exactly. If you want to learn more about innovation and new ideas, we have tons of stuff all over the site. Just type in "innovation," type in "discovery," I'm sure that'll bring up a ton of stuff, and type in "neurons"; that will bring up some pretty cool stuff too. Agreed. You can type all those words into the handy search bar at howstuffworks.com, which means it's time for listener mail.

Speaker 1: Yes, Josh, I'm going to follow this very heavy podcast with the opposite: an email. Okay, this is from our thirteen-year-old fan Peyton in California. "Well, hello. I'm sending this from my iTouch while lying in bed."
Speaker 1: "I'm supposed to be asleep, so... anyway, I just started listening to your podcast after my friend Claire..." Yes, that's the Claire from California whose email you read on the air, who thinks Jerry looks like Tina Fey. Claire is Peyton's friend. So she said, oh, you got on the air, so I'm going to start listening to you. Actually, I'm saying Peyton is a girl; Peyton may be a boy. You never know. Oh, really? Yeah, it's androgynous, right? Yeah, ambivalent at least. "Claire posted on her Facebook page something that said, listen to the most recent podcast, because you guys read her letter or something. I thought it was so cool. Claire and I are really good friends. Anyways, I love this podcast. Gosh, I feel so boring because I keep saying podcast. Is there, like, another word for that?" Jerry laughed at that. "Anyways, I definitely..." she does that thing the kids do now where they put, like, eight S's at the end of a word. Have you seen that? Yeah, I don't get that. I don't either. Original, I guess. "I most definitely enjoyed the podcast on the octopi," and stuff. I thought it was "octopie." "I thought it was informational and funny. By the way, if this email doesn't make any sense, it's because my iTouch is dumb and autocorrects words that are already spelled right. Erg. Moving on." Your iPhone does that too? And mine does that. What's this, an email written with one of those pens that has, like, four different colors of ink you can select? That's what it feels like. But the reason I brought that up is I have an idea to start a website called My iPhone Spelled What dot com. Because do you ever look at some of the ones you've sent and you're like, "Can you please make sure you take the sofa out of the oven," when you meant to say... um, "sturgeon," let's say. "Surgeon" came out as "sofa"? I would... "sturgeon," okay, "take the sturgeon out of the oven," which is, I think, so much better. I wish you would have planned this.
Speaker 1: Anyway, it can make for a lot of fun. So that's my new idea. And that's it: "Lots of love from Peyton, age thirteen, in Cali." Thanks a lot, Peyton, age thirteen, boy or girl, we're not exactly sure, but either way, we appreciate you taking the time to write in. And if you have a movie that Chuck and I have not seen, or you assume we haven't seen, that you think we should see, the best overlooked movie of all time, we're always looking for good suggestions. Wrap it up in an email and send it to stuffpodcast@howstuffworks.com.

Speaker 1: For more on this and thousands of other topics, visit howstuffworks.com. The howstuffworks.com iPhone app is coming soon. Get access to our content in a new way: articles, videos, and more, all on the go. Check out the latest podcasts and blog posts, and see what we're saying on Facebook and Twitter. Coming soon to iTunes.

Speaker 1: Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you?