Hey, everybody, it's me, Josh, and for this week's SYSK Selects I've chosen an episode from 2010: Have All the Good Ideas Been Discovered? It's an interesting one, and in a strange way it ties into the Planned Obsolescence episode we released recently, even though it was recorded almost ten years before. And I want to make a note: it's possible that the listener mail writer in this episode actually predicted the coming of the wildly popular site Damn You Autocorrect. Prove me wrong. At any rate, enjoy this episode.

Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks.

Hey, and welcome to the podcast. I'm Josh Clark. With me as always is Charles W. "Chuck" Bryant, and that makes this Stuff You Should Know. That's right. Yes it is, not the imitators. I wonder how many times I've said that, "that makes this Stuff You Should Know." No, that's just the whole spiel, the whole opening: "Hey, and welcome to the podcast." Well, you've said it about two hundred and seventy times, I think. Luckily we have them all saved, and we could count. We do. I don't know if it's lucky, though, Chuck. That's a lot of shows, dude. We should do something special for three hundred. That's like... that's a lot of shows. It is. That makes me proud. Okay, well, do you think maybe we could get some cake around here or something? Shrimp cocktail? No, I'm allergic to shrimp now, remember. I know, but I still like to throw it out there. Actually, I had a shrimp wonton the other day and nothing happened. Really? I ate a wonton with shrimp and nothing happened. So it's just, like, tiny little bits of shrimp, and I don't know, either that or I'm getting stronger. Maybe so. Superhuman, you might say. Transhuman. Speaking of human... yes? Um, Chuck, there is a recent study that came out, in part from one of our universities here in the city, Emory, right down the street. Great school.
There's been this problem that's been plaguing researchers for a really long time, and that is: at the beginning of the Lower Paleolithic period, which is about 2.7 million years ago, we started using sharp rocks as bashing and cutting tools. So we figured that out. Okay, you can take a rock. That's technology. That's not nothing, that's technology. Yes. Okay, you can take this rock and you can use it to open a coconut, or the head of someone who's wronged you. Using an implement to complete a task. Well, specifically sharp rocks. Okay. It took two million years, to the end of the Lower Paleolithic period, before we figured out that we could actually attach handles to these things. Can you imagine how long that took? Yes. And this has baffled scientists: how could it possibly have taken two million years to go from using your hand to attaching a stick? You know, this doesn't make any sense. So... well, they were dumb back then. Well, dumb is close to it. They literally were lacking the region of the brain needed, apparently, according to this new study. Basically, we developed a region in the right hemisphere, specifically the supramarginal gyrus, which allowed us to go, hey, let's put a handle on this. And after we did that, we moved out of Africa and started colonizing the rest of the world. So they've pinpointed the region of the brain that is specific to innovation. Well, specific to stone toolmaking. Okay, I thought you meant innovation in general, like, that's where your ideas come from. No. Give me a second here. Shoot, did I ruin it? It's okay. So we go from "can't figure out how to attach a handle to a sharp rock," okay, and two million years later we figured that out.
We leave Africa and we start colonizing the rest of the world, and all of a sudden things start entering light speed, right. And it seems like over the last couple hundred years, you know, especially since the Industrial Revolution, our ability to innovate, to grasp new ideas, to understand the world around us, has just been hitting this hyper speed. And a lot of people wonder if we've reached a point where all the ideas, all the good ones at least, have already been discovered; where we understand how everything works, and there's really just figuring out how to dot the i's and cross the t's left. Right, right. There was actually a guy who famously said this, a guy named Charles Duell. He was the commissioner of the Patent Office. That's attributed to him, I should say. But he said something like, everything that can be invented has already been invented, and he said this in a memo, basically like, you should go ahead and shut down the Patent Office. He clearly had never considered the Snuggie, Josh, or anything that's been invented since. So here's what I'm gonna say. I'm gonna go ahead and give you my summation early on. Okay. It's that I think people think at various times in history that they've plateaued, and then I think things happen, people come along, innovators, and they reach new heights and they go, oh, well, we didn't know that, and there are new ideas. Right. It almost displays a shameful lack of historic awareness to say we've reached the end of all of our good ideas. It's just silly. It's just asking to be made a fool of. Yeah. Or maybe people do that on purpose, to goad the innovators into saying no. Yeah, right, using reverse psychology. Exactly, that's how innovation works. Yeah, you might as well just give up. Reverse psychology drives innovation. There are people, though, that say that real technological innovation has been stalled for quite a while. Yes, after the nineties computer revolution.
Everything else since then has kind of been, like, packaging it in better-looking cases and sleeker designs, and it's all, like, design oriented. It is. Or marketing oriented. Marketing. These guys, Cedric Lagare and Eric Virdo, who were both with the SKEMA Business School, basically say smartphones, yes, they seem incredibly new and cutting edge, but really they're just the packaging of several already extant technologies into a really sharp-looking handheld device. But that's still a new idea. I would argue it is still a new idea, but I think their point is saying, like, before the late nineties, and before the eighties, let's say, with computers, but especially the tech boom, the telecom boom, of the late nineties, this stuff wasn't around. Like, it's not true innovation, it's kind of repurposing, right. And, like you were saying, the cosmetic changes to a computer. One of the reasons why they believe this is going on is because we've come to a point in the computer revolution, I think, Chuck, where you can still make tons of cash just by changing the casing of a CPU. Yeah, there's, like, no money in innovation, basically, is what I got from this one article: that innovation costs more than it's worth when you can just repackage what you've got in a sleeker design and people buy it up. Exactly. These two authors of this article predict that two trends will drive innovation currently, right. Yes. One is consolidation, where basically, and I think they're talking just about computers here, yeah, they're saying the big hardware firms are going to consolidate all of the smaller hardware firms to where there will just basically be, like, the big three or five, and that will leave it to the software firms to compete and innovate. So we'll see more innovation on the software side rather than the hardware side. Right.
And they're also saying that the green boom is going to drive innovation. That makes sense, like coming up with sustainable packaging, or sustainable solutions. Yeah, totally. One of the other things they pointed out that I thought was interesting: they said the tech refresh cycle is too small right now. So what's happening is, they'll say, you like your CD? Well, you're gonna love the Super Audio CD. You like your DVD? You're gonna love Blu-ray. But guess what's coming up after Blu-ray? It's gonna be, like, Super Blu-ray. It's happening so fast, people aren't abandoning their current systems. They're just like, you know what, I'm gonna hold on, because I don't want to be the guy stuck with the LaserDisc player in a couple of years. So all of a sudden, the same thing happens: no one's buying it, so it's not worth as much money, which means that nobody's putting any effort into it, or money into it. So innovation stalls. Right. And there's a guy named Edmund Phelps, who's a professor of political economy at Columbia University, right, and he's basically kind of saying the same thing. He's saying that there's not enough money going toward innovation. But rather than the onus being put on consumers not buying Blu-rays out of fear of looking like LaserDisc jerks, it's actually government and big business that's not pouring money into small innovators. Yeah. He said that innovation is the only thing not subsidized by the United States government, which he says is actually a tax, in a way, because it's not being subsidized. Sort of a reach. You could definitely... yeah, I think a lot of these guys' points are a reach. But what he's suggesting is, if the government isn't pouring money into big business so that they can pour money into, I guess, small venture firms, then these people who are in their garages aren't going to take, you know, risks. They're not going to innovate.
There's no incentive. Right. I disagree with this. I dispute this, because he's saying, like, the people who do work in their garages, you know, the Steve Jobs and Bill Gates types of the seventies, that they were driven by this lust for money. And I think that that's wrong. I think that people innovate first and foremost to get this idea out of their head and birthed into reality. Right. I'm glad you said this, because I completely agree. Regardless of what you think of the Facebook, Mark Zuckerberg didn't invent Facebook to make gobs of money. He invented it to make real friends. Yeah, to innovate. And that's the point that you made: these people in the garage, the true innovators, they don't care if they have two pennies to rub together, right? They're still gonna be trying to innovate and make a name for themselves and come up with something awesome. Right. And now there are people out there who are trying to innovate for, you know, the riches. Snuggie. Sure. The guy who invented the Snuggie wasn't in his garage going, I just have to get this out or else I'm never gonna sleep. Yeah, those are the people that are looking for the next get-rich-quick thing. But I think you can also make a point that when you introduce money to innovation, it leads to actual stagnation, because when you introduce money there's now something to lose, and people are less willing to take risks. And the willingness to take risk is one of the driving forces of innovation. You know?

Yeah. So, Phelps had a good idea, and this will never happen, of course, because it's a good idea: to create the First National Bank of Innovation. All capitalized. Capitalized? Not all caps, but each word is capitalized. He should do it in all caps with exclamation points.
But basically, it would be a bank that you could go and partner with, you know, as a startup company, partner with this bank for financing, and, you know, get, I would guess, some sort of low-interest loans to spur innovation. Right. That was a great idea. It is. It is a good idea, and this does happen in the real world; the government does pour money into innovation. He's not exactly correct in that sense. And I also kind of resented that he placed big business in between, you know, people in their garage innovating and government subsidies: that we have to have big business take the money and then skim a little off the top and give it to this guy in the garage. He's drawing broad strokes here, for sure. There are government programs, and we'll talk about one from the National Institutes of Health, where the government says, hey, you have a really good idea, Mr. or Ms. Research Scientist, and we're gonna give you enough money to survive for three years. Yeah, because the deal is, you can always get grants if, you know, you put together a nice package. But this program with the NIH, what's it called? The New Innovator Award. The Director's New Innovator Award. This is intended for people who have such a good idea, but it's so new that they don't have the data to write a grant where people would say, like, it looks like you're onto something here. So they're sort of throwing money at stuff that's like, you know, you're the dude in the garage and we believe in this idea, go see what you can find out. Right. And we're keeping big business out of the way. Yes. But now the NIH owns you for the rest of your career, probably. So let's talk about... there's three people at UCLA that got these grants recently, and they're up to some interesting, one could say innovative, stuff. Right.
They have some good ideas, hugely innovative ways to approach problems. Like the professor Dino Di Carlo. All of these people are younger than us, by the way. They are. Dino Di Carlo is working on ways to basically apply heat or pressure or chemicals to very specific sites in cells using nanoparticles and magnets. Which is tough. Sounds like a winning idea to me. It is. Basically, one of the big problems we have with getting cells, engineering cells, to do specific things, like, I don't know, attack other cells for fun (like, tell me that wouldn't be a big Christmas gift this year, if you could make cells fight with one another under a microscope), is that you have to basically try to engineer the cell, you know, time after time after time, and basically program it to do what you want it to do. What Di Carlo is coming up with is a way to use very tiny magnets and even tinier nanoparticles that can basically... my brain is so small. When you move the magnet with a joystick, it attracts the nanoparticles in a certain direction or whatever, and you can have the nanoparticles apply heat or pressure or a specific chemical to a specific site on a cell and direct it to go attack another cell for your pleasure. That's awesome. Your amusement. So one point five mil goes to Di Carlo, and for a good reason. And for a good reason, one of the other winners was Hu Huang. And Huang came up with... basically, I'm gonna break this down easy. Instead of saying, let me come up with a cure for cancer, Hu Huang said, let me come up with a way to detect cancer so early, like, way earlier than we've ever detected it before, that we can stop it in its tracks, essentially curing cancer. And he's doing this, actually, I don't know, maybe she's doing this, through a nanomaterial called graphene that is just one atom thick.
Yes. Graphene is, like, this super, clearly not-of-this-world material. It's literally a carbon atom thick. That's it. But it can act as a biological sensor to tell you when cells aren't doing the things they should be doing. So did you know a gram of this stuff, flattened out, covers a football field? Wow. It's ultralight. That is thin, my friend. It's one atom thin. So one point five mil to Hu Huang. Right. Well, did you explain how? Oh, let me try my hand at this. So basically what you do is you put a graphene transistor in a cell, and when these biological markers, say histones or something like that, start to accumulate, they're attracted to the graphene. And these biological markers, by the way, we've found are correlated with the growth of cancer, the origin of cancer; that's where it's starting. And when some of these markers are attracted to the graphene, they create an electrical charge that we can sense. And the graphene is so thin but so highly conductive that with just a couple of these molecules attaching to the graphene, we would be able to detect it and be like, whoa. Right. We'd be like, oh crap, you have cancer, and we'd cure it right then. Wow. Yeah, that's awesome. Yeah. And that's a good way to approach a cure for cancer, if you ask me. Did I explain that well? I think so. I think. And the last winner this year was Jin Hyung Lee, and Jin is trying to debug the brain circuit using, you know, the Wonder Machine, which is our favorite thing in the world, the fMRI, which measures blood and oxygen levels in the brain. So it tells you, these areas light up, they're called BOLD signals, blood-oxygen-level dependent, and they light up corresponding to certain brain activity. Right. And we've talked about this before: you're seeing that there's more oxygen that's going to that part of the brain.
So we've assumed, and this is the basis of the fMRI, that if it has more oxygen being delivered to it, that must mean that that region of the brain is active. When you show somebody a picture of, you know, their kid being carried away into a van, then, you know, that's the fear region right there. That doesn't really say anything, though, and it doesn't implicate... well, it's just showing, okay, there's more oxygen in this region. Right. What Jin Hyung Lee is looking at is what, specifically, on the neuronal level, is being activated. Right. And he's using optogenetics, so it's going to be called the ofMRI. And that's beyond even what we thought was the Wonder Machine, so this is the super-duper Wonder Machine. Basically, he's using light to allow genetically specified neurons to be activated. Right. You know, one of our listeners at Emory has been harping on us to do one on optogenetics for a while. We should get this person in here. This is probably as close as we're ever going to come to doing it. Well, it's a great idea, though, obviously, because Jin Hyung Lee won one of the Innovator Awards as well. Yes. And they give these out every year, so they clearly believe that we're not out of good ideas. Excellent point. The NIH? No. And we're not out of good ideas. So yes, Chuck, you picked those out, you found those guys. All right, well, I didn't personally find them. You were like, these guys should get the awards, and I found them. There are very good ideas out there, right. But there is a debate that's raging in science about whether these ideas, like optogenetics, or, you know, using graphene or nanoparticles to detect cancer, are variations on a theme. Are they applying cosmetic changes to a computer rather than really creating new parts for it? Right.
And basically the question is, are there any more major discoveries for us to make? Or is it really just, basically... remember, I've always said, like, we have the pieces on the table, now we just have to put them together. Is that the point that we're at? Well, as you said we were. I did, and then we started researching this, and I'm like, I wonder. I think I still do believe that. But within that, though, there's so much that it's, to me, a little bit like splitting hairs. Well, but you're absolutely right, especially when you throw in the word "discovered," right. Discovery indicates something that's already out there; we just figure it out or stumble upon it. Sure. And an idea necessarily kind of leads... yeah, it leads to an invention. It's something we've created, like technology.

Let's talk about discovery, right. We have a lot of problems that are still facing us in how we understand the universe, like human consciousness. How do brain cells create our understanding of the world, like what we see as reality? How is that possible? And can we figure everything out? Well, that's the big question. Like I said, there's a lot of debate about whether or not we'll ever be able to figure everything out, or if the human brain just simply isn't programmed to understand the world fully, you know. So there's a guy who's a physicist, his name is Russell Stannard, and he's written this book called The End of Discovery. And basically he's saying that we're in, quote, a transient age of human development, right, where we're past the point where we figured out you can put a handle on a rock and make it an ax, but we're right before the point where we can no longer make discoveries, not because we've understood everything or figured everything out, but because we've reached the limits of what is knowable for the human brain.
Sure, but even then, look at that part of the right hemisphere that developed and allowed us to put the ax handle on. Right. Who's to say that we won't reach that point where we can't know anything any longer, or can't know everything, and then we evolve even further and all of a sudden we're even better at understanding our world? Right. But will we end up eventually coming to a point where humans understand everything and there is no more discovery to make? I say no, because he points out in here, and this is, I think, very valid, that in the mid-nineteenth century, the nineteenth century, I'm sorry, a lot of people in science said, you know, we've kind of debunked religion and philosophy and all these things with scientific discovery. But he points out, and I agree, that even if you figure out all the problems of science, which will never happen, there's still human life and consciousness and the subjectivity of what goes on inside a person's head. You're never going to solve... that's not solvable, right. That's what I argue. That's subjectivism. Yeah. I think I believe in that. Well, on the whole, I guess I agree with you. There's this aspect of the universe that Kant called the noumenon. The noumenon? Okay, that one was specifically tailored for my thick tongue. But basically, the noumenon is the thing in itself, right. It's just the objective... it's the objective universe, and we don't interact with that. Everything we know and understand is subjective. And this is where subjectivism is based: that basically we can never fully know anything, and we certainly won't ever know everything, because one thing that will always be elusive is what you see.
My reality is different than your reality. Exactly. And there's an extreme version of it called solipsism. Right. Yes. And solipsism is this extreme version of subjectivism that basically says everything is so subjective that I can't fully verify that you exist. The only thing I know that exists is my reality, and all of you may be made up. I may be totally, completely out of my mind and actually in a padded cell right now, and none of you are really real. Well, that sort of touches on the whole quantum mechanics thing, right? Don't you think? Please. Well, I mean, I don't have a whole lot to say about it because we've covered it, but it definitely is along the same line, don't you think? Well, yeah, there's an interpretation of quantum mechanics that basically says everything we know about the universe we know through observation, but once you observe it, it changes. That's part of it. And when we observe, we gain information, right, but we can't observe everything at once. So all we know exists in our reality, for sure, is what we're observing. So everything else, like what's going on out there in the office right now, doesn't exist, because we're not there to observe it. Mind-blowing. Once again, it is mind-blowing. But we say all this not just to, you know, rock out to Floyd, but because this is what science is up against. This isn't just gibberish, this isn't just philosophical gibberish, as much as science would like it to be. There is a true problem with the fact that subjectivity, not objectivity, is how we interact with our universe, even though science is supposed to be based exclusively on objectivity. Right. Well, Stephen Hawking, you might have heard of him, and another dude named Leonard Mlodinow. "Load-in-now" is how I'm going to pronounce that; there's a silent M in there somewhere.
They have a new book called The Grand Design, and they are now saying... I think scientists used to say, we're going to find the theory of everything. Now they're saying, you know what, we're probably not going to find the theory of everything; it's probably gonna be more like what they call, quote, "a family of interconnected theories," which describe reality under very specific conditions. And this is kind of huge for Stephen Hawking, because he's long been a big supporter of the theory of everything, which takes the Standard Model of physics, includes gravity, which has always been elusive, and then marries it with quantum mechanics to explain everything. That's the theory of everything: it's one theory that explains everything. Right, like that surfer guy. Exactly, Garrett Lisi, I think. It was a long time ago. It was, and you know, it's going to be years before he's shown to be correct or incorrect. But Hawking's saying it's probably not going to be the case; there are too many different variables that don't fit together. Right. But the thing that really scares a physicist, that will scare any physicist, is this, sport: those models that we've come up with, are they how the universe actually works, or how we look at the universe and see how it works? You see what I'm saying. There's that subjectivism again. It can't be whipped. Well, and all the things that we've said over the years, that we have found to be true, are those even true? Or are the conclusions we're reaching just based on years of compiled thought that may not have been true to begin with? So, I mean, like, we arrive at reality by consensus. Yeah, but was that consensus even accurate along the way? Not necessarily. It's been shown time and time again that it hasn't been accurate, through these five revolutions, as V.S. Ramachandran puts them. Copernicus. Copernicus was the first one who said that Earth is not the center of the universe.
Darwinism. Darwin, very good, Chuck. Darwin says, like, hey, we're actually just a bunch of apes. DNA? Freud. Freud, saying, like, we actually are driven by desires that we can't control and aren't really aware of. Then DNA, which is... I think James Watson, who found DNA along with Francis Crick, said, quote, "There are only molecules. Everything else is sociology." I love that quote, man. It's one of my favorites. And then the fifth revolution, the neuroscience revolution: that all of our understanding, movements, and experiences are nothing but neuronal transmissions, electrochemical impulses. Right, so there's not even sociology; even that is just based on firing neurons. Right. That's where we're at right now. That's why I say I think we have everything on the table, we just haven't put it together. But it's entirely possible, historically speaking, to say, well, we thought that before, and we didn't. And what revolution is next? Will the next revolution get us over the wall of subjectivism, or will that be the wall that we always run into? This is a good one. And, well, I was worried about this one. It came out pretty good, didn't it? I think so. Yeah. Don't you like it when we pat ourselves on the back at the end of the show? I think this one deserves it, man. Well, so we went from Blu-rays to neurons, and at the end of the day, Josh and Chuck say we are not out of new ideas. Can I speak for you? Go ahead. We are not out of new ideas. And just when you think you're out of new ideas, just when you think you've plateaued, along comes a Hu Huang to say, no, no, no, no, there are new ideas, and here's one. Now give me the cash. Exactly. If you want to learn more about innovation and new ideas, we have tons of stuff all over the site. Just type in "innovation," type in "discovery," I'm sure that'll bring up a ton of stuff, and type in "neurons."
That will bring up some pretty cool stuff too. Agreed. You can type all those words into the handy search bar at howstuffworks.com, which means it's time for listener mail.

Yes, Josh, I'm gonna follow this very heavy podcast with the opposite, a fun email. Okay. This is from our thirteen-year-old fan Peyton in California. Well, hello. "I'm sending this from my iTouch while laying in bed. I'm supposed to be asleep, so... anyway, I just started listening to your podcast after my friend Claire..." Yes, that's the Claire from California whose email you read on the air, who thinks Jerry looks like Tina Fey. Claire is Peyton's friend. So she said, oh, you got on the air, so I'm gonna start listening to you. Actually, I'm assuming Peyton is a girl. Peyton may be a boy, you never know. Oh, really? Yeah, it's androgynous, right. Yeah, ambivalent at least. "Claire posted on her Facebook page that said, listen to the most recent podcast because you guys read her letter or something. I thought it was so cool. Claire and I are really good friends. Anyways, I love this podcast. Gosh, I feel so boring because I keep saying podcast. Is there, like, another word for that?" Jerry laughed at that. "Anyways..." She does that thing, like the kids do now, where they put, like, eight S's at the end of a word. Have you seen that? Yeah. I don't get that. I don't either. We're getting old, I guess. "I most definitely enjoyed the podcast on the octopi and stuff." I thought it was octopi. "I thought it was informational and funny. By the way, if this email doesn't make any sense, it's because my iTouch is dumb and autocorrects words that I've already spelled right. Erg. Moving on." Your iPhone does that too? Mine does that. What's this?
An email written with one of those pens that has, like, four different color inks you can select from? No, but it feels like it. But the reason I brought that up is, I have an idea to start a website called My iPhone Spelled What dot com. Because you ever look at some of the ones you've sent, and you're like, can you please make sure you take the sofa out of the oven when you get home, when you meant to say, um, sturgeon, let's say. Surgeon? No, sturgeon. It put "sofa" where I wrote "sturgeon." Okay, take the sturgeon out of the oven, which is, I think, so much better. I wish you would have planned this. It's okay, buddy. Anyway, it can make for a lot of fun. So that's my new idea. Okay. And that's, um... lots of love from Peyton, age thirteen, in Cali. Thanks a lot, Peyton, age thirteen, in Cali, boy or girl, we're not exactly sure, but either way we appreciate you taking the time to write in. And if you have a movie that Chuck and I have not seen, or you assume we haven't seen, that you think we should see, the best overlooked movie of all time, we're always looking for good suggestions. Wrap it up in an email and send it to stuffpodcast@howstuffworks.com.

Stuff You Should Know is a production of iHeartRadio's HowStuffWorks. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.