1 00:00:01,120 --> 00:00:03,440 Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio's How Stuff Works. Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. Chuck Bryant, and there's guest producer Josh T over there. Josh T. Josh Tizzy. That's his new nickname. Okay, Josh is nodding, good head nod. Yep, he knows, he knows the score. How you doing, man? Oh, I've had better days and weeks. But you know, if only there was an LED light someone could blink in my eyeballs and fix everything. I know. That's actually, that was a question of mine, like, earlier, about, you know, could you just shine a light in somebody's eyeballs and make this work? And that's probably the future, but who knows. It's not the, it's not the present, unfortunately, no. So, um, soon enough, Chuck, though, soon enough. Just hang on another fifty years. Okay. So we're talking today about optogenetics. And if that word doesn't sound at all familiar, don't worry. It's only been around for, honestly, fifteen years.
It's like the cutting edge in, um, manipulating the function of brain cells to make them do what you want them to do, or to study brain pathways to see which ones are responsible for what. And it's really, really difficult to get across in the details, but it's one of those really interesting science-tech things where the broad strokes are, like, really understandable, you know. Yeah, I mean, you're literally one day, hopefully, well, I don't know about hopefully, but possibly, going to be able to turn on and turn off, uh, neural cells. Yeah, after we have modified them, right, so we can control them. Yeah, and modify them genetically. That's a big, big key here. Yes. So, but this is really important. Ed put this together for us, and he makes a really good point, like, if you read, you know, kind of, um, cutting-edge sci-tech articles about this stuff, it sounds like we're right there, like we're about to start, you know, flipping on and off neural circuits in humans any day. We're not. We are way far away from that. We're still figuring out, like, the ethical and legal implications of even beginning to try that. Yeah.
I think the writers like that, they get really excitable about stuff. They're like, fruit flies are so boring, and they're like, we could do this, and just think, we could do this and this. And it's like, maybe one day, many, many years from now, but maybe not even, yeah, because of that whole moral and legal and ethical implications of it. But I think, um, I think there are probably plenty of people out there who are, like, my depression is severe enough that I'm fine with the moral and ethical implications of this. I just want this to fix things for me. Because it could, conceivably, someday. But we say that just to say, like, what we're talking about is on the frontier of science. Some of the research that's been conducted has been successful, but it's just been conducted in things like mice and fish and fruit flies. Poor little, well, we'll put a pin in that one, not literally, but, well, maybe, yeah, poor little fruit flies. Done some things to fruit flies. So here's the thing, right? The human brain is pretty complex as far as organs go.
You compare it to your spleen, and your spleen is just gonna slink away and be like, there's no comparison here. I just produce bile, you know. So the brain is far more complicated than the spleen, which everybody, everybody knows. And the reason it's so complicated is because there's so many specialized cells inside that brain. Neurons, right. Neurons are just one type the brain puts out. Yeah, and you know, we've talked about the brain a lot over the years on this show, and we always kind of come back to the same thing, which is, as much as we've learned, which has been a ton, there's still a lot of shrugging in the room. Yeah, for sure. Jeez, I don't know. I mean, but when you look at the hundred billion neurons and the quadrillion synapses, yeah, thousand trillion, that's, you know, I'm giving humans a break here that we haven't figured all of this out at this point. We haven't. And then you look at the brain, it's just, I mean, you look at it, it's just a big, gross, lumpy, gray mess. Yeah, it's like, it's like a spleen on steroids. I know. Like, who even wants to get in that thing to begin with?
People who like making squishy sounds with their fingers. It should be shiny and sparkly. And, god, you gotta stop doing that. It is a little sparkly, though, if you think about it. Like, it's shiny because it's coated in, it's bathed in cerebrospinal fluid, remember? Yeah, I guess I've never seen a picture of the brain when it's really doing its thing. I didn't know it was so exciting looking. So, so, okay. So the brain is extremely complex, and we've figured out some stuff about it. Um, mainly what we figured out, starting back in the nineteenth century, is that all of these connections, these thousand trillion synapses, um, that allow neurons to communicate with one another and carry, like, an impulse through the brain.
All that is based on electricity, chemical electricity, right, where there's a difference in the concentration of different types of ions, say like calcium and potassium, in the cell, so that when it reaches a certain concentration, it actually generates an electrical impulse. And then that impulse can be translated or transferred to another neuron, and then that neuron may send that electrical impulse on and on and on until it finally reaches its destination, where suddenly you're flooded in dopamine and you're feeling pretty good, because you just tried a Krispy Kreme that was fresh and hot right off of the line. Yeah. So, like, when you hear people say, or us say, like, when your neurons are firing, that's literally what's going on. They are tiny little electrical charges. Uh, we can call them action potentials, and they measure them in tiny little millivolts. It's adorable. It is. They have little bow ties on and short pants. Yeah, but there's little tiny electrical triggers that go off constantly. Right, right. So, or they don't go off, which, which also has, um, an effect as well.
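The firing idea just described (charge builds up in the cell until a threshold is crossed, the neuron fires, then resets) can be sketched as a toy simulation. This is illustrative only; the function name and the millivolt numbers are made up for the example, not real physiology:

```python
# Toy sketch of an action potential: a model neuron's voltage climbs as
# input arrives, and when it crosses a threshold (in millivolts), the
# neuron "fires" and resets to its resting potential.

def simulate_neuron(inputs, threshold_mv=-55.0, resting_mv=-70.0):
    """Return the time steps at which the model neuron fires."""
    voltage = resting_mv
    spikes = []
    for t, input_mv in enumerate(inputs):
        voltage += input_mv          # incoming charge nudges the voltage up
        if voltage >= threshold_mv:  # crossed the firing threshold
            spikes.append(t)         # fire an action potential...
            voltage = resting_mv     # ...and reset to resting potential
    return spikes

# A steady 4 mV nudge per step: the neuron fires every 4th step.
print(simulate_neuron([4.0] * 10))  # prints [3, 7]
```

With no input at all, the model never reaches threshold and never fires, which matches the "or they don't go off" half of the description.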
Right. So, like, you can have something firing, firing and firing, and then it stops firing and you're suddenly not feeling pain any longer, which is great. So you want to have them on and off. But it all is based on electricity. And we figured this out thanks to a guy named, Chuck? Are you talking about Luigi Galvani? Yes, yes I am. And you know that famous experiment with frog legs, where you can take dismembered frog legs and sprinkle salt on them and they'll start twitching or whatever? Those are always creepy. Well, this same guy figured out that you could introduce electricity into the brains of frogs and you can make the frog legs kind of twitch and hop, in the brain of a dead frog. So it shows pretty clearly that electricity is what makes the brain move, and that the brain is what makes the legs move. Right. And then later on there was a guy named, um, Roberts Bartholow. Boy, this guy. Did you look up this experiment? I did. Pretty, pretty, pretty bad. Yeah.
So there was a woman named Mary Rafferty who had an ulcer on her brain, which ended up resulting in a literal hole in her skull. So her brain was exposed, and Roberts Bartholow, I guess, was like, well, well, perfect, this is just what I've been waiting for, is access to a human brain. So let me see if I can stimulate these neurons by poking at her brain with needles, and see what happens when I stimulate that with electricity. And he kept it super low voltage at first and noticed some things, like, wow, when I poke here, her arm moves, right? He's like, does anyone have a question? But he ramped that electricity up to a higher voltage, looking for what he called a more decided reaction, and he, well, he argued afterwards that he did not cause her death, but she had a seizure, she went into a coma, and she died. Right. So kind of the sticking point here, and he was censured by the AMA, but nothing really happened, was that he was experimenting on a human being, but not with the aim of curing anything that was wrong with her. No.
He even said in the study that he produced that anyone who tried to replicate this would be, would be, um, conducting like a criminal experiment. It may be criminal to redo it. Yeah, I'm good, right? I'm all good, but just don't do this again. But what's, what was interesting to me is, like, it wasn't until after the Nazi atrocities of World War Two that, um, like, the scientific community started to enforce informed consent. Um, and this guy was carrying out this experiment in, I think, eighteen seventy-four. But even at the time, so, in his defense, people weren't about informed consent, and, like, the ethics of scientific experiments weren't nearly as pronounced and structured as they are today. And yet his experiment was still denounced. Like, everybody could see that, on some level that hadn't been, like, elucidated yet, he had violated something, which is actually, like, the life of a person. Like, something's bothering me, but I can't quite put my finger on it. Oh well, now he's hit me with the electric needles in my fingers, going exactly where he wanted to.
Oh boy. The AMA actually banned human experimentation if it was not for the purposes of saving a human life after this. Very good stuff. So what we figured out, though, from Galvani and, um, Barth, Bartholow, yeah, he's got a tough one, a tough last name, um, and others who showed that electricity is the currency that moves messages around the brain, um, is that you can actually stimulate the brain with electricity to go around its internal drives and externally make it do things, right? But the problem is, is, like, if you're using this to study the brain, it's really clumsy. It really, like, an electrical impulse is really tough to keep localized. So if you're trying to just kind of see what one particular type of neuron does, well, TS for you, because you're electrically going to stimulate a whole bunch of neurons in the neighborhood, and it's not a very, um, a fine-tuned way of studying how the brain works. And again, it's really important that we understand what regions of the brain are responsible for what.

So if we're just kind of trying to see what regions are responsible for raising your arm, we might hit those neurons with electric needles, but we might also, like, kick the leg out too. That just, kind of, it's not as precise as it needs to be. Do you want to, uh, use this repeated metaphor? It was a fine metaphor, but it was a mixed metaphor, and the first one really didn't work. Yeah, let's just go ahead and say it, because it does get a little bit more credible as the metaphor develops. But I agree, this first one was a little rough. But just, just take this metaphor, put it in your pocket, everybody, and smoke it with some salt. I don't even know what that means. So imagine a neighborhood or a city, if you will, with all the people. Let's say New York City, and people everywhere moving around. These are your neural, uh, this is your neural network. Everyone's going places. They're taking subways or riding busses or driving cars or walking.
Some of them that have no conscience are in a horse and buggy in Central Park, and it's not trade to the giant. He just stole it. And, uh, electrical stimulation, like deep brain stimulation, which we've talked about on the show, is something we currently are doing and are able to do, just very imprecisely. So, uh, that electrical stimulation is like trying to learn about people only driving Ferraris through New York City by setting a city block on fire. That's where it loses me, because it doesn't make any sense. What if I'd just said shocking an entire city block? Sure, I guess so. Yeah, that would have made more sense, right? Yeah. I saw another analogy on, you know how we always say, when you can't understand something, go to, like, the kids' science website? Sure. I found one called Frontiers for Young Minds, and they were explaining optogenetics, and they basically put it similarly, saying, if you wanted to study the movement of traffic in the city, um, but you wanted to see, like, like you were saying, how Ferrari, um, drivers drive.
Um, you want to be able to tell everybody when to drive. But the problem is, if you're using an electrical stimulation, that doesn't just tell Ferrari drivers when to drive, it tells everybody in the city to start driving, and everyone starts driving. So it doesn't tell you anything about just the Ferrari drivers. Yeah, that makes sense. And by the way, Ferrari, you owe Chuck and me a Ferrari each for all this buzz Ferrari marketing. Ferrari, I would just like to drive one once. That'd be fun. Don't, don't set your sights higher than that, Chuck. See if we can get a free one. A Ferrari would just stress me out. We'll sell it on Craigslist. I'm gonna park a Ferrari in my driveway backwards. You back it in? Oh, goodness. Uh, so should we take a break now that we have a nice little setup in hand? Sure. All right, let's take a break, and we're gonna talk about potassium and calcium and colored dye right after this. All right. So people figured out pretty quickly that, yes, electrical impulses will make parts of the brain work, but it's not very precise.
We need a more precise way to study the different parts of the brain, to see what's going on where at any given time. That's right. And enter Lawrence Cohen in the nineteen seventies. Leonard Cohen's brother? No. It could be, uh, no, no. I was so disappointed. I thought, wow, that's amazing, all the genius in one family. Yeah, it's a lot of genius. Uh, and in nineteen eighty it was further developed by a man named Roger Tsien. Leonard Cohen's one-time stage manager? Okay, I was waiting on that. Is it Seen? Is it Tseen, Deen? I don't know. T-S-I-E-N. Anytime your name starts with T-S-I, one of those is silent. Yeah, but I think together they make a D sound. Oh, really? I think so. In what language? Chinese. Mandarin, maybe Cantonese, one of those two. Okay. Oh god, I feel like I'm drowning. It's okay, grab hold of me. Thanks.

Uh, so what they did was they worked on, um, this synthetic dye, um, like I said, coming in the seventies, refined in the eighties by Roger Tsien. And you already, yeah, you already talked about in the intro about the action potential in a neuron that creates that little electrical charge. It's not like it's plugged into something. It's created by concentrations of potassium and calcium shifting around. Right, right. So what they figured out, what, um, Lawrence and Roger figured out, is that you can actually introduce the synthetic dye so that the dye is produced or triggered, or it becomes apparent, once calcium ion concentration reaches a certain point. And if you know that a calcium ion concentration will trigger this action potential, this electrical impulse in the neuron, if the neuron suddenly is glowing or has this colored dye that's showing up under a microscope, you know that that neuron has just fired, because the calcium concentration changed enough for that dye to become apparent.
Yeah, it's like, the very easy way to say this is, scientists basically said, you know when someone, uh, metaphorically turns on a light in that neuron? It'd be great if an actual light turned on. Yeah, this is very similar to that, for sure. And it still was a little, it's a little clunky, um, because, well, I'm not fully under, I don't fully understand why it's a little clunky. I think it's that maybe you can't control it, you can just witness it. I think that's the issue with it. Well, if we're going to further that metaphor, it's really escalated fast to this point, but go ahead. The next step would be, uh, you want to learn about these Ferrari drivers in New York City, so you just paint the inside of the entire city block instead of shocking it with electricity or setting it on fire. But any car that's driving on that city block is gonna, it's gonna glow, or it's gonna move through the paint, so you're gonna get car tracks. Sure, you're gonna get glow paint all over every car. You still are not just targeting the Ferraris. But it's a better metaphor.
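The dye-as-readout idea the hosts describe can be put in code as a loose sketch: the "glow" only shows up in cells whose calcium level has crossed the level associated with firing. The threshold value and the cell names here are invented for illustration:

```python
# Sketch of a calcium-indicator readout: the dye becomes apparent only
# in cells whose calcium concentration has crossed a firing threshold,
# so the glowing cells are the ones that just fired. Illustrative only.

def glowing_cells(calcium_levels, glow_threshold=1.0):
    """Given a {cell: calcium level} map, return the cells whose dye glows."""
    return sorted(cell for cell, level in calcium_levels.items()
                  if level >= glow_threshold)

readings = {"neuron_a": 0.2, "neuron_b": 1.4, "neuron_c": 0.9, "neuron_d": 2.1}
print(glowing_cells(readings))  # prints ['neuron_b', 'neuron_d']
```

Note that, as the hosts say, this is read-only: the sketch can tell you which cells fired, but nothing in it lets you make a cell fire, which is exactly the limitation optogenetics later addressed.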
Sure, that's right, much better, but it's still not precise enough. And I think where it's lacking is that, yes, you can see now what neuron has just gone off, but you can't make the neuron go off. But, but, um, Lawrence and Roger gave future researchers an idea. They're like, wait a minute, we're onto something here. Like, being able to see when a neuron has gone off, that is a great idea. Let's figure out how to do that, but also make neurons go off. And to do this, they turned to our friends in the sea for help. Yeah, this is really interesting, and this is where, um, genetics come into play. Because it is, it is important to point out that neurons are basically the same. Uh, they all contain basically the same genetic information, even. But it's that mystery of the differences, switching these genes on and off, and why one would be switched on when another's switched off.
That's sort of, like, what makes them unique among each other, right? Right. So, like, if you have a human cell, especially, like, say, a stem cell or whatever, but any cell, it has all of your genetic blueprint, and it's just depending on what genes are on or off that determines what kind of cell it is and what it's responsible for doing, you know. So maybe it's, like, a retinal cell and it detects light, or maybe it's a cardiac cell and it makes up heart muscle. All of them have the same DNA, the same genetic blueprint, but some of those genes are gonna be turned off, some are gonna be turned on. And the same is true for neural cells, too, right? You have neural cells that are responsible for releasing dopamine. You have neural cells that are responsible for sensing temperature. Um, you have all these different neural cells, and all of them are roughly the same kind of cell, but they have different genes turned on and off.
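The same-blueprint, different-switches point can be sketched in a few lines. The gene names and cell types below are invented for illustration, and real gene regulation is vastly more complicated, but the shape of the idea is just this:

```python
# Toy sketch: every cell carries the same gene list (the shared
# "blueprint"); which genes are switched on determines what kind of
# cell it is. Gene names and cell types are invented for illustration.

def cell_type(genes_on):
    """Classify a cell by which of its shared genes are switched on."""
    if "detect_light" in genes_on:
        return "retinal cell"
    if "release_dopamine" in genes_on:
        return "dopamine neuron"
    if "contract_muscle" in genes_on:
        return "cardiac cell"
    return "unspecialized cell"

# Same blueprint, different switches, different cells:
print(cell_type({"detect_light"}))      # prints retinal cell
print(cell_type({"release_dopamine"}))  # prints dopamine neuron
```

That per-gene difference is the handle the next step grabs onto: if you can tell one gene apart from another, you can target just the cells that have it switched on.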
And once you 328 00:19:39,640 --> 00:19:42,760 Speaker 1: know that, and once you can differentiate between one gene 329 00:19:42,760 --> 00:19:46,200 Speaker 1: and another, you've just taken your first step towards genetically 330 00:19:46,200 --> 00:19:51,040 Speaker 1: manipulating these different genes, you know, and understanding Ferrari drivers. Exactly. 331 00:19:51,440 --> 00:19:53,960 Speaker 1: So you brought us to the sea, and I jumped 332 00:19:53,960 --> 00:19:55,720 Speaker 1: us right back out again, and now we are back 333 00:19:55,760 --> 00:20:01,280 Speaker 1: at the sea like the manatee. That's right. But here 334 00:20:01,359 --> 00:20:04,040 Speaker 1: this is where it gets super super cool. Uh. And 335 00:20:04,320 --> 00:20:06,440 Speaker 1: it sounds like it's confusing, but it's really not. It's 336 00:20:06,480 --> 00:20:10,440 Speaker 1: really pretty simple still. Um. There are genes in Mother 337 00:20:10,560 --> 00:20:13,760 Speaker 1: Nature that respond to light, and then there are proteins 338 00:20:13,880 --> 00:20:18,080 Speaker 1: that emit light when they're triggered by something. Fluoresce? Yeah, 339 00:20:18,160 --> 00:20:20,280 Speaker 1: I like to say glow. In fact, if you'll look here, 340 00:20:20,359 --> 00:20:24,520 Speaker 1: I scratched out fluoresce every single time and wrote glow. 341 00:20:24,760 --> 00:20:26,439 Speaker 1: That's a lot of work you put into those, just 342 00:20:26,520 --> 00:20:28,560 Speaker 1: a lot easier to say glow. I think people get 343 00:20:28,560 --> 00:20:31,359 Speaker 1: it a lot better than fluoresce. I watched Coming to 344 00:20:31,400 --> 00:20:34,720 Speaker 1: America the other day. Oh man, Soul Glo is so hilarious. 345 00:20:34,960 --> 00:20:37,560 Speaker 1: It still holds up. That movie is even better than 346 00:20:37,600 --> 00:20:40,159 Speaker 1: I remember. Actually it's great. Yeah, it's one we go 347 00:20:40,240 --> 00:20:42,600 Speaker 1: back to a lot.
Like I knew Eddie Murphy was 348 00:20:42,720 --> 00:20:48,840 Speaker 1: a charmer, but dude, that guy is one charming human being. Yeah, 349 00:20:49,160 --> 00:20:51,520 Speaker 1: all the all the barbershop stuff is just so classic. 350 00:20:51,600 --> 00:20:53,480 Speaker 1: It's great, but all of it like it really, it's 351 00:20:53,520 --> 00:20:56,280 Speaker 1: just a great, great movie. You know, they're sequeling that thing. 352 00:20:57,040 --> 00:21:01,240 Speaker 1: They've been shooting it in Atlanta. Sequeling or rebooting? Sequel. 353 00:21:01,520 --> 00:21:04,639 Speaker 1: Oh good? So uh. I mean, I think the the 354 00:21:04,680 --> 00:21:07,400 Speaker 1: easiest way to go about a sequel is what they're doing, 355 00:21:07,440 --> 00:21:10,919 Speaker 1: which is now King of Zamunda. Eddie Murphy has a 356 00:21:11,000 --> 00:21:13,600 Speaker 1: son who wants to find his love. Yeah, but I 357 00:21:13,600 --> 00:21:18,560 Speaker 1: think everyone's back, like Arsenio's back. Sure they Oh, 358 00:21:18,640 --> 00:21:21,480 Speaker 1: I'm not gonna be mean. Good. They found him in 359 00:21:21,520 --> 00:21:24,200 Speaker 1: great spirits and he was eager to work. He said, yeah, 360 00:21:24,240 --> 00:21:26,520 Speaker 1: that's a great idea. I can fit it into my schedule. 361 00:21:26,720 --> 00:21:29,440 Speaker 1: Then he went, whoo, whoo. Did you see the Grammys 362 00:21:29,480 --> 00:21:32,560 Speaker 1: the other day? I don't usually watch those either, but 363 00:21:32,640 --> 00:21:36,320 Speaker 1: I happened to see the entire thing, and um, I 364 00:21:36,320 --> 00:21:39,560 Speaker 1: didn't know that you were kidnapped and it was it 365 00:21:39,680 --> 00:21:45,840 Speaker 1: was like um, Mel Gibson in um, oh, Conspiracy Theory. 366 00:21:45,920 --> 00:21:48,160 Speaker 1: I was tied to a wheelchair and my eyes were 367 00:21:48,200 --> 00:21:51,200 Speaker 1: taped open. So you watched all of it? Huh?
Yeah, 368 00:21:51,200 --> 00:21:52,800 Speaker 1: But I haven't watched all the Grammys since I was 369 00:21:52,800 --> 00:21:55,280 Speaker 1: like thirteen, Bro, I haven't either. It was really something. 370 00:21:55,480 --> 00:21:59,080 Speaker 1: It's like a marathon or an ultrathon really, but um, 371 00:21:59,080 --> 00:22:01,480 Speaker 1: Tyler the Creator did like a live thing 372 00:22:01,600 --> 00:22:04,480 Speaker 1: and it was amazing. Dude. I've never heard a single 373 00:22:04,560 --> 00:22:07,000 Speaker 1: second of any of his songs or really him perform 374 00:22:07,119 --> 00:22:09,600 Speaker 1: or anything, but um, I like that guy now. Yeah, 375 00:22:09,600 --> 00:22:11,879 Speaker 1: he's great. And you know I listened to that early. 376 00:22:12,880 --> 00:22:16,159 Speaker 1: I can't even remember the acronym, but his sort of 377 00:22:16,880 --> 00:22:20,280 Speaker 1: hip hop collective band that they all started out of that, 378 00:22:20,320 --> 00:22:22,480 Speaker 1: like Frank Ocean came out of that, Tyler the Creator 379 00:22:22,560 --> 00:22:24,880 Speaker 1: and a bunch of other guys. Whoa, what's it called? 380 00:22:25,560 --> 00:22:29,760 Speaker 1: Oh what was it? Do you know? Odd Future? And 381 00:22:29,800 --> 00:22:32,400 Speaker 1: then it had another like five or six words after 382 00:22:32,440 --> 00:22:37,120 Speaker 1: that. Odd Future was the shortened version. Good stuff, very 383 00:22:37,119 --> 00:22:39,960 Speaker 1: good stuff. Thanks to Josh T for swooping in. And 384 00:22:40,000 --> 00:22:43,960 Speaker 1: that uh, that first Frank Ocean album is amazing. 385 00:22:44,720 --> 00:22:46,640 Speaker 1: I've not heard that either. Oh man, it's so good. 386 00:22:46,720 --> 00:22:50,360 Speaker 1: Channel Orange. I think I'm always confused by rappers who 387 00:22:50,520 --> 00:22:55,320 Speaker 1: who just have normal names. Uh Frank Ocean.
Yeah, he's 388 00:22:55,400 --> 00:22:57,840 Speaker 1: kind of a singer, crooner type. I mean he does Okay, 389 00:22:57,880 --> 00:23:00,679 Speaker 1: all right, now that makes sense. Yeah, he's he's awesome. 390 00:23:01,560 --> 00:23:05,600 Speaker 1: So, jeez, that's how it happens. We're in the sea. 391 00:23:06,840 --> 00:23:09,840 Speaker 1: We're in the ocean. The Grammys are over right. We 392 00:23:09,880 --> 00:23:13,960 Speaker 1: have found genes that respond uh to light and also 393 00:23:14,040 --> 00:23:18,360 Speaker 1: proteins in other organisms that emit light when triggered. I'll 394 00:23:18,440 --> 00:23:21,200 Speaker 1: let you walk people through what those two things are. 395 00:23:21,320 --> 00:23:24,000 Speaker 1: But the point is they said, we've got the two 396 00:23:24,080 --> 00:23:28,360 Speaker 1: components to make this happen. We can build, we can 397 00:23:28,400 --> 00:23:31,760 Speaker 1: control genes by turning on a light, and we can also 398 00:23:32,720 --> 00:23:34,960 Speaker 1: see what happens when something responds to the light. We 399 00:23:35,040 --> 00:23:37,880 Speaker 1: just need to be able to get these two things 400 00:23:37,880 --> 00:23:40,080 Speaker 1: from two different organisms into one thing that we can 401 00:23:40,080 --> 00:23:45,440 Speaker 1: control exactly exactly, And that's the entire basis of optogenetics. Um. 402 00:23:45,480 --> 00:23:47,560 Speaker 1: I think you did a fine job of explaining it. Well, 403 00:23:47,600 --> 00:23:49,200 Speaker 1: I didn't know if you wanted to talk literally about 404 00:23:49,240 --> 00:23:53,480 Speaker 1: the jellyfish, and well sure so if you don't mind, no, 405 00:23:53,560 --> 00:23:56,639 Speaker 1: it's great.
So the algae, like green algae, has something 406 00:23:56,640 --> 00:23:59,520 Speaker 1: called an eyespot. So it's like a single cell organism, right, yeah, 407 00:24:00,160 --> 00:24:03,800 Speaker 1: And it has an eyespot which is light sensitive. It's 408 00:24:03,800 --> 00:24:07,359 Speaker 1: a light sensitive area on the cell. And when sunlight 409 00:24:07,960 --> 00:24:11,400 Speaker 1: hits that eyespot, it triggers the tail of the algae 410 00:24:11,400 --> 00:24:13,920 Speaker 1: to start moving towards the sunlight so that that single 411 00:24:13,960 --> 00:24:17,800 Speaker 1: cell algae can can maximize its exposure to sunlight as 412 00:24:17,880 --> 00:24:19,800 Speaker 1: much as possible. All right, So that's one half. You 413 00:24:19,880 --> 00:24:22,840 Speaker 1: got the thing that sees light and reacts to it, right, 414 00:24:23,080 --> 00:24:26,360 Speaker 1: And again, all this stuff has to do with ion channels. 415 00:24:26,520 --> 00:24:29,000 Speaker 1: That has to do with the concentration of minerals inside 416 00:24:29,000 --> 00:24:32,320 Speaker 1: and outside of the cell in these channels, right, 417 00:24:32,720 --> 00:24:35,119 Speaker 1: and that's what triggers this movement. That's what triggers the 418 00:24:35,160 --> 00:24:39,640 Speaker 1: electrical impulse. That's the basis of all life, apparently: the 419 00:24:39,680 --> 00:24:43,120 Speaker 1: movement of minerals inside and outside of cell membranes 420 00:24:43,160 --> 00:24:48,840 Speaker 1: triggering electrical impulses. That's life. Isn't that bizarre? So then 421 00:24:48,840 --> 00:24:51,439 Speaker 1: with jellyfish they have a similar thing too.
We're not 422 00:24:51,520 --> 00:24:55,960 Speaker 1: exactly sure why they fluoresce, but say like a predator 423 00:24:56,040 --> 00:24:58,560 Speaker 1: comes up and they sense like a predator's coming, it 424 00:24:58,680 --> 00:25:01,640 Speaker 1: might trigger a change in their ion concentration, 425 00:25:01,760 --> 00:25:05,240 Speaker 1: which triggers a protein that fluoresces to be produced. So 426 00:25:05,280 --> 00:25:09,320 Speaker 1: the jellyfish starts to glow. And these are two separate things, 427 00:25:09,520 --> 00:25:11,800 Speaker 1: but like you were saying, at some point, I think 428 00:25:11,800 --> 00:25:14,480 Speaker 1: in two thousand five, a team led by Karl Deisseroth 429 00:25:14,640 --> 00:25:18,800 Speaker 1: um published a paper that said, hey, man, we 430 00:25:18,840 --> 00:25:23,480 Speaker 1: could take this algae light sensitive gene, and we can 431 00:25:23,520 --> 00:25:29,119 Speaker 1: take this jellyfish fluorescent gene and put them together, and 432 00:25:29,160 --> 00:25:31,919 Speaker 1: then take that so that one triggers the other, in 433 00:25:31,920 --> 00:25:34,000 Speaker 1: like kind of this Rube Goldberg way, so that if 434 00:25:34,040 --> 00:25:37,240 Speaker 1: you shine light on this one gene, it will trigger 435 00:25:37,320 --> 00:25:40,520 Speaker 1: the production of this fluorescence. And we can if we 436 00:25:40,560 --> 00:25:42,960 Speaker 1: can just figure out how to take that gene combination 437 00:25:43,200 --> 00:25:45,680 Speaker 1: and put it into another organism that doesn't have either, 438 00:25:46,200 --> 00:25:48,680 Speaker 1: then we could shine a light on the organism and 439 00:25:48,720 --> 00:25:53,520 Speaker 1: make the cells in that organism glow. And now finally, finally, 440 00:25:54,080 --> 00:25:57,440 Speaker 1: just the Ferraris would start to move when we signaled 441 00:25:57,480 --> 00:25:59,399 Speaker 1: for them to move.
We don't have to set a 442 00:25:59,480 --> 00:26:01,840 Speaker 1: city block on fire. We don't have to coat everything 443 00:26:01,880 --> 00:26:05,080 Speaker 1: in glowing paint. We can just signal to the Ferraris. 444 00:26:05,520 --> 00:26:09,280 Speaker 1: Can you believe this? It's it's astounding that that they 445 00:26:09,400 --> 00:26:12,399 Speaker 1: figured out not only like in theory, how to do this, 446 00:26:12,440 --> 00:26:15,080 Speaker 1: but they have actually, over the last fifteen years, been 447 00:26:15,119 --> 00:26:17,600 Speaker 1: successful in doing it. It seems like something that if 448 00:26:17,640 --> 00:26:20,560 Speaker 1: someone was describing, they would just be laughed out of 449 00:26:20,560 --> 00:26:23,879 Speaker 1: a room and say, yeah, that's great. Take this jellyfish 450 00:26:23,920 --> 00:26:27,399 Speaker 1: thing and this algae thing, put them together, shove it in a 451 00:26:27,440 --> 00:26:30,960 Speaker 1: fruit fly, up a fruit fly's butt, and then shine 452 00:26:31,000 --> 00:26:32,800 Speaker 1: a light on his face and make him rob a bank. 453 00:26:33,200 --> 00:26:37,040 Speaker 1: I think that's I think that's how that's the ultimate goal. Really, 454 00:26:37,960 --> 00:26:40,680 Speaker 1: it's unbelievable. It really is, Chuck, all right. So the 455 00:26:40,720 --> 00:26:45,120 Speaker 1: fruit fly is a great little candidate because we've been 456 00:26:45,160 --> 00:26:47,800 Speaker 1: working with fruit flies for a long long time. When 457 00:26:47,800 --> 00:26:51,840 Speaker 1: it comes to genetics, they also um, we share like 458 00:26:52,040 --> 00:26:55,679 Speaker 1: genes and gene sequences that are so closely matched that 459 00:26:55,720 --> 00:26:59,040 Speaker 1: when we find a like a novel gene in a 460 00:26:59,080 --> 00:27:01,480 Speaker 1: fruit fly, we go look at the human genome and 461 00:27:01,520 --> 00:27:03,760 Speaker 1: just try to find its match, and it usually matches.
462 00:27:04,000 --> 00:27:08,080 Speaker 1: That's how closely related we are. A huge percentage of human genetic diseases 463 00:27:08,080 --> 00:27:10,679 Speaker 1: are also found in fruit flies. This all seems made up. 464 00:27:11,520 --> 00:27:16,040 Speaker 1: Am I being punked? Maybe this is gonna come out April first? Maybe? 465 00:27:16,600 --> 00:27:20,359 Speaker 1: Oh uh, So the fruit fly is a great little 466 00:27:20,359 --> 00:27:23,280 Speaker 1: candidate for all those reasons, and for one other reason: 467 00:27:23,960 --> 00:27:27,080 Speaker 1: we can actually, uh, we don't need to cut 468 00:27:27,080 --> 00:27:29,119 Speaker 1: a fruit fly's head open to see its brain. We 469 00:27:29,119 --> 00:27:32,320 Speaker 1: can see that little guy's brain through a microscope. That's 470 00:27:32,359 --> 00:27:34,440 Speaker 1: pretty great, Which is a pretty good way to analyze 471 00:27:34,440 --> 00:27:37,280 Speaker 1: something, just by letting it do its thing, 472 00:27:37,320 --> 00:27:39,960 Speaker 1: you know, especially as far as the fruit fly's concerned. 473 00:27:40,080 --> 00:27:42,080 Speaker 1: Oh sure, it's like, yeah, just just hold me down, 474 00:27:42,080 --> 00:27:43,760 Speaker 1: that's fine, Just don't cut my head off. Yeah, but 475 00:27:43,800 --> 00:27:45,520 Speaker 1: you're setting people up to think it's all wine 476 00:27:45,520 --> 00:27:48,720 Speaker 1: and roses. Yeah, it gets pretty bad. That's when you 477 00:27:48,760 --> 00:27:51,480 Speaker 1: pull the rug out from under him, Chuck. So what's 478 00:27:51,520 --> 00:27:53,480 Speaker 1: happening though, is they're putting that stuff in the fruit fly. 479 00:27:53,680 --> 00:27:55,960 Speaker 1: And then what you do is you have to breed 480 00:27:56,040 --> 00:27:57,800 Speaker 1: like the next generation. I think I don't think it 481 00:27:57,840 --> 00:28:01,560 Speaker 1: would work on that one, would it? Um?
No. But 482 00:28:01,560 --> 00:28:04,560 Speaker 1: you can very easily cultivate like a fruit fly 483 00:28:05,240 --> 00:28:09,000 Speaker 1: colony that is now genetically modified. Just throw some 484 00:28:09,000 --> 00:28:12,560 Speaker 1: in a cage with some martinis and a little bit 485 00:28:12,560 --> 00:28:17,280 Speaker 1: of Sinatra classics. So this is what they did, and 486 00:28:17,320 --> 00:28:21,040 Speaker 1: it was successful. And so this gave them the ability 487 00:28:21,520 --> 00:28:25,000 Speaker 1: to do two things: to map out where all these 488 00:28:25,040 --> 00:28:28,000 Speaker 1: neurons are, which was the first kind of big part 489 00:28:28,000 --> 00:28:30,560 Speaker 1: of this problem. And the second thing they could do 490 00:28:30,720 --> 00:28:36,320 Speaker 1: is actually activate these neurons with light. Right, So, now, 491 00:28:36,680 --> 00:28:39,160 Speaker 1: like one of the first things they experimented on, are 492 00:28:39,200 --> 00:28:41,760 Speaker 1: you ready to pull the rug out from people? Sure? 493 00:28:42,160 --> 00:28:45,440 Speaker 1: One of the first fruit fly experiments that they conducted, 494 00:28:45,520 --> 00:28:47,160 Speaker 1: and I shouldn't say one of the first, but one 495 00:28:47,200 --> 00:28:51,960 Speaker 1: of the big ones, was that, um, they they genetically 496 00:28:52,040 --> 00:28:57,160 Speaker 1: modified the fruit fly neurons responsible for their escape reflex, 497 00:28:57,800 --> 00:29:00,320 Speaker 1: which is when their legs tense up and their wings 498 00:29:00,360 --> 00:29:03,160 Speaker 1: tense up and they just fly away when they sense danger. 499 00:29:03,800 --> 00:29:08,000 Speaker 1: These were now genetically modified with an algae and jellyfish combination 500 00:29:08,560 --> 00:29:13,840 Speaker 1: uh gene sequence, that's right.
So they shine a light 501 00:29:13,880 --> 00:29:17,440 Speaker 1: on the fruit flies, and the fruit flies sprung away, 502 00:29:17,720 --> 00:29:20,440 Speaker 1: and they said, that's pretty great, but makes sense. It's 503 00:29:20,600 --> 00:29:23,240 Speaker 1: entirely possible that we just scared him with the light. 504 00:29:24,160 --> 00:29:27,680 Speaker 1: How could we possibly figure out if the actual neurons 505 00:29:27,680 --> 00:29:31,560 Speaker 1: are being activated optogenetically? Right. And in the movie scene, 506 00:29:32,240 --> 00:29:33,840 Speaker 1: you just hear a voice on the other side of 507 00:29:33,840 --> 00:29:37,560 Speaker 1: a desk of some scientist eating Chinese food out of 508 00:29:37,560 --> 00:29:40,520 Speaker 1: a box. He goes, no, you cut their heads off 509 00:29:40,560 --> 00:29:43,360 Speaker 1: and they still live for a little while. That's funny. 510 00:29:43,400 --> 00:29:47,240 Speaker 1: You should do that, I imagine instead um, Robert what's 511 00:29:47,280 --> 00:29:52,480 Speaker 1: his name, like scratching the chalkboard slowly with yeah, with 512 00:29:52,600 --> 00:29:55,840 Speaker 1: his idea about cutting their heads off. They could probably 513 00:29:55,920 --> 00:29:59,000 Speaker 1: check it both ways because, just like Mike the headless chicken 514 00:29:59,200 --> 00:30:01,640 Speaker 1: had a lot of brain left when they cut his 515 00:30:01,680 --> 00:30:04,480 Speaker 1: head off, so too with the fruit fly. There are 516 00:30:04,640 --> 00:30:08,480 Speaker 1: genes, um, or neurons I should say, associated with the 517 00:30:08,600 --> 00:30:12,320 Speaker 1: escape reflex that are not just located in the fruit 518 00:30:12,320 --> 00:30:16,160 Speaker 1: fly's head.
So they cut the fruit flies' heads off because, 519 00:30:16,200 --> 00:30:18,400 Speaker 1: like you said, or like the guy eating Chinese 520 00:30:18,400 --> 00:30:21,760 Speaker 1: food said, um, the fruit fly will still be able 521 00:30:21,760 --> 00:30:24,040 Speaker 1: to fly around and move around for a little while 522 00:30:24,080 --> 00:30:27,080 Speaker 1: without a head. So they cut the heads off, and 523 00:30:27,080 --> 00:30:29,320 Speaker 1: then they shine the light into the thorax where some 524 00:30:29,400 --> 00:30:32,920 Speaker 1: of these neurons are, and sure enough the fruit flies 525 00:30:33,360 --> 00:30:37,240 Speaker 1: sprung away and flew into the air headless, zombie like. 526 00:30:37,880 --> 00:30:41,920 Speaker 1: But they did it specifically because those neurons were reacting 527 00:30:41,960 --> 00:30:45,080 Speaker 1: to light. So they successfully showed that you can control 528 00:30:45,120 --> 00:30:49,400 Speaker 1: the behavior of a once living organism by shining a 529 00:30:49,480 --> 00:30:53,000 Speaker 1: light on it, once you genetically modified its neurons 530 00:30:53,040 --> 00:30:55,960 Speaker 1: with these proteins. Yeah, I wanted to know a little 531 00:30:56,000 --> 00:31:00,520 Speaker 1: bit more about that second part. I'm sure they did 532 00:31:00,560 --> 00:31:03,640 Speaker 1: a lot of other controls, but my first instinct was 533 00:31:04,520 --> 00:31:07,320 Speaker 1: how close was this light? Did it feel like the 534 00:31:07,440 --> 00:31:09,360 Speaker 1: air move when they put it in front of it, 535 00:31:09,480 --> 00:31:13,200 Speaker 1: or was it, you know, distant? But you know, they're scientists, 536 00:31:13,200 --> 00:31:17,040 Speaker 1: I'm sure Rodney and his Chinese food had a 537 00:31:17,080 --> 00:31:20,920 Speaker 1: lot of other great suggestions for everybody. Right.
The other 538 00:31:21,000 --> 00:31:23,600 Speaker 1: questions are did they mash the heads with their thumbs 539 00:31:23,640 --> 00:31:25,640 Speaker 1: to make sure there was no way that they were 540 00:31:25,640 --> 00:31:28,560 Speaker 1: getting any light info? All right, I feel like we 541 00:31:28,560 --> 00:31:33,400 Speaker 1: should take another break, because what we've described is almost 542 00:31:33,400 --> 00:31:36,120 Speaker 1: a miracle, but like, what good does that do us? 543 00:31:36,640 --> 00:31:38,720 Speaker 1: Great question, and well we'll talk about what good it 544 00:31:38,840 --> 00:32:01,680 Speaker 1: could do us right after this. Okay, So the fruit 545 00:32:01,720 --> 00:32:04,239 Speaker 1: fly experiment that was that was pretty huge, and it 546 00:32:04,280 --> 00:32:06,440 Speaker 1: wasn't It didn't just end with fruit flies, like we said, 547 00:32:06,440 --> 00:32:12,360 Speaker 1: They've successfully experimented with mice, with fish, um worms, worms yep, 548 00:32:12,600 --> 00:32:15,760 Speaker 1: and all of these are they they use these UM 549 00:32:15,920 --> 00:32:20,200 Speaker 1: these types of of um ion channels or ion pumps 550 00:32:20,240 --> 00:32:26,240 Speaker 1: called opsins, specifically rhodopsins. They respond to light, 551 00:32:26,320 --> 00:32:29,520 Speaker 1: they're stimulated by light. UM. But they've figured out how 552 00:32:29,520 --> 00:32:34,280 Speaker 1: to insert different ones in the different genes and UM.
Eventually, 553 00:32:34,760 --> 00:32:37,400 Speaker 1: what they're thinking is that if we can figure out 554 00:32:37,400 --> 00:32:40,080 Speaker 1: how to use these in humans, we will be able 555 00:32:40,120 --> 00:32:42,960 Speaker 1: to do all manner of things, some of which we've 556 00:32:42,960 --> 00:32:47,280 Speaker 1: already successfully demonstrated on on things like mice and fruit flies, 557 00:32:47,440 --> 00:32:49,680 Speaker 1: not just to get a human to jump using our 558 00:32:49,800 --> 00:32:53,240 Speaker 1: escape reflex, but things like UM. Treating depression is a 559 00:32:53,240 --> 00:32:56,440 Speaker 1: big one. Well, yeah, that's sort of one of the 560 00:32:56,440 --> 00:32:59,760 Speaker 1: the huge potential benefits here is what if we could 561 00:33:00,040 --> 00:33:02,840 Speaker 1: really control the release of dopamine in someone's brain 562 00:33:03,640 --> 00:33:06,719 Speaker 1: and when people suffer from depression and they're having a 563 00:33:06,720 --> 00:33:12,520 Speaker 1: hard time getting their dopamine reactions to occur naturally, instead 564 00:33:12,560 --> 00:33:15,400 Speaker 1: of putting them on pills, which you know, a pill 565 00:33:15,440 --> 00:33:19,240 Speaker 1: doesn't just affect the cells uh that it needs to. 566 00:33:19,480 --> 00:33:21,520 Speaker 1: That's why they have a whole list of side effects, 567 00:33:21,800 --> 00:33:24,840 Speaker 1: because they affect everything. UM. They're like, maybe we can 568 00:33:24,880 --> 00:33:27,880 Speaker 1: get so specific that we can literally turn on those 569 00:33:27,880 --> 00:33:31,800 Speaker 1: cells with light, give someone a dopamine hit that will 570 00:33:31,840 --> 00:33:34,560 Speaker 1: take seconds instead of weeks and weeks of being on 571 00:33:34,640 --> 00:33:36,720 Speaker 1: medication that may or may not work and may or 572 00:33:36,760 --> 00:33:39,600 Speaker 1: may not have devastating side effects.
Yeah, and you just 573 00:33:39,680 --> 00:33:41,480 Speaker 1: hit the nail on the head that the effect will 574 00:33:41,480 --> 00:33:45,520 Speaker 1: take seconds. Um. That's one of the really big um 575 00:33:45,680 --> 00:33:49,719 Speaker 1: advantages of optogenetics is it's light controlled, and we have 576 00:33:49,880 --> 00:33:53,000 Speaker 1: really great lights that can turn on and off very 577 00:33:53,080 --> 00:33:56,960 Speaker 1: very quickly, like um lasers connected to fiber optics is 578 00:33:57,000 --> 00:33:59,200 Speaker 1: one way that they have figured out how to deliver this. 579 00:33:59,520 --> 00:34:03,720 Speaker 1: I saw this cute, heartbreaking picture of a mouse with 580 00:34:03,880 --> 00:34:06,400 Speaker 1: like this kind of plastic helmet on the side of 581 00:34:06,400 --> 00:34:08,640 Speaker 1: its head, and coming out of it was a single 582 00:34:08,719 --> 00:34:11,960 Speaker 1: fiber optic cable. Remember those fiber optic kind of brushes 583 00:34:12,440 --> 00:34:14,120 Speaker 1: that had like a light source at the bottom, and 584 00:34:14,160 --> 00:34:16,920 Speaker 1: like the brush itself was just this beautiful, colorful thing. 585 00:34:18,800 --> 00:34:21,480 Speaker 1: I love those. I went and looked through like Google 586 00:34:21,520 --> 00:34:23,799 Speaker 1: Images pictures of those and it's just like, God, these 587 00:34:23,800 --> 00:34:28,000 Speaker 1: are so pretty.
So they had one of those fiber 588 00:34:28,040 --> 00:34:30,960 Speaker 1: optic little fibers coming out of the mouse's head and 589 00:34:31,000 --> 00:34:33,120 Speaker 1: the mouse is just this little dude looking at the 590 00:34:33,200 --> 00:34:36,520 Speaker 1: camera like what um, But they can they can connect 591 00:34:36,600 --> 00:34:39,640 Speaker 1: the end of that fiber optic cable to a laser 592 00:34:40,200 --> 00:34:43,160 Speaker 1: and it will deliver that light source to inside the 593 00:34:43,160 --> 00:34:47,040 Speaker 1: mouse's brain. The problem is that there's um all sorts 594 00:34:47,040 --> 00:34:51,160 Speaker 1: of brain damage that you can create by inserting even 595 00:34:51,200 --> 00:34:54,040 Speaker 1: like a really tiny fiber optic fiber into the brain 596 00:34:54,080 --> 00:34:57,759 Speaker 1: of something. But it is one way to do it now. UM. 597 00:34:57,840 --> 00:34:59,839 Speaker 1: What they're working on also is, like I said, those 598 00:35:00,200 --> 00:35:03,680 Speaker 1: rhodopsins. UM. One of the one of the ones they're 599 00:35:03,680 --> 00:35:07,120 Speaker 1: looking at is like shifted towards the red end 600 00:35:07,120 --> 00:35:09,720 Speaker 1: of the spectrum, which means that you can use something 601 00:35:09,760 --> 00:35:13,160 Speaker 1: like infrared light, which is absorbed more deeply into the body, 602 00:35:13,760 --> 00:35:15,920 Speaker 1: as an external light source. So you just shine like 603 00:35:15,960 --> 00:35:19,160 Speaker 1: an infrared light through the skull and then that will 604 00:35:19,400 --> 00:35:23,359 Speaker 1: um will activate the neurons in the brain too.
So 605 00:35:23,600 --> 00:35:25,920 Speaker 1: I don't remember exactly how we started on this, but 606 00:35:26,280 --> 00:35:29,319 Speaker 1: there's there's stuff that we're starting to figure out from 607 00:35:29,400 --> 00:35:33,840 Speaker 1: these mouse models, UM, including things like treating depression. Oh yeah, 608 00:35:33,840 --> 00:35:37,279 Speaker 1: how precise it is. How precise the delivery of light is, 609 00:35:38,160 --> 00:35:41,920 Speaker 1: which is really really important, because the timing of neurons, 610 00:35:42,360 --> 00:35:46,360 Speaker 1: um, and the triggering of them and the cascade 611 00:35:46,360 --> 00:35:49,560 Speaker 1: of events that it sets off is extremely precisely timed. 612 00:35:49,760 --> 00:35:52,080 Speaker 1: So you can't just use like a flashlight and expect 613 00:35:52,120 --> 00:35:54,360 Speaker 1: to treat depression. You would have to be able to 614 00:35:54,760 --> 00:35:56,560 Speaker 1: time it in the way that the brain is supposed 615 00:35:56,600 --> 00:35:58,759 Speaker 1: to be doing it in the first place. Yeah, what 616 00:35:58,800 --> 00:36:02,239 Speaker 1: I wonder is if in the future, and first of all, 617 00:36:02,280 --> 00:36:04,480 Speaker 1: you've got to get past all the ethical hurdles of 618 00:36:04,600 --> 00:36:09,520 Speaker 1: gene therapy to begin with, which are many um and complex. 619 00:36:10,560 --> 00:36:12,200 Speaker 1: So let's say we do get through all that, and 620 00:36:12,239 --> 00:36:15,000 Speaker 1: let's say we get FDA approval to start therapies like this. 621 00:36:16,440 --> 00:36:18,920 Speaker 1: What kind of what does that look like?
Because if 622 00:36:18,920 --> 00:36:21,520 Speaker 1: it happens in seconds, do you make an appointment 623 00:36:22,040 --> 00:36:25,040 Speaker 1: and go to a specialist who does this 624 00:36:25,200 --> 00:36:29,120 Speaker 1: light therapy, or is this something that you do? You 625 00:36:29,160 --> 00:36:31,600 Speaker 1: have a device that you're in control of, right, so, 626 00:36:31,800 --> 00:36:34,399 Speaker 1: like it would probably follow a model like deep brain 627 00:36:34,440 --> 00:36:37,960 Speaker 1: stimulation, what you mentioned earlier, where you have electrodes 628 00:36:38,040 --> 00:36:41,000 Speaker 1: implanted in your brain that are doing basically the same thing, 629 00:36:41,680 --> 00:36:45,200 Speaker 1: but a lot less precise and a lot more clumsy. 630 00:36:45,239 --> 00:36:49,239 Speaker 1: But they're electrically stimulating neurons, say, that release dopamine to 631 00:36:49,280 --> 00:36:51,120 Speaker 1: treat depression. I don't know if we're doing that yet, 632 00:36:51,120 --> 00:36:54,120 Speaker 1: but there's definitely deep brain stimulation. But do you go 633 00:36:54,239 --> 00:36:56,160 Speaker 1: to a place to have that done? You have like 634 00:36:56,160 --> 00:37:00,160 Speaker 1: a pacemaker like device connected via wire from your brain, 635 00:37:00,239 --> 00:37:02,720 Speaker 1: and then the device is like under your skin in your chest, 636 00:37:03,280 --> 00:37:08,560 Speaker 1: but it's being controlled by a computer, Like you have 637 00:37:08,600 --> 00:37:11,120 Speaker 1: an onboard computer on you on you right, But what 638 00:37:11,160 --> 00:37:15,000 Speaker 1: I'm saying is you don't like carry around a button, No, 639 00:37:15,120 --> 00:37:18,439 Speaker 1: it's under your skin. So how would this work? Then?
640 00:37:19,000 --> 00:37:21,160 Speaker 1: I would guess the same way that we would figure 641 00:37:21,160 --> 00:37:25,640 Speaker 1: out exactly from studying optogenetically these neurons that glow 642 00:37:25,719 --> 00:37:27,840 Speaker 1: when they go off. So we'll figure out the 643 00:37:27,880 --> 00:37:30,720 Speaker 1: brain pathways in the regions responsible for things like depression 644 00:37:30,760 --> 00:37:33,560 Speaker 1: and all that. We would figure out what this standard 645 00:37:34,520 --> 00:37:38,520 Speaker 1: normal pattern is and then recreate it, and then 646 00:37:38,520 --> 00:37:41,480 Speaker 1: the computer would regulate it when needed in the brain. 647 00:37:41,880 --> 00:37:43,480 Speaker 1: Well that makes a little more sense, But I mean 648 00:37:43,560 --> 00:37:47,280 Speaker 1: just that kind of stuff, like just that alone shows 649 00:37:47,320 --> 00:37:50,280 Speaker 1: you how far we are from actually doing this in humans, 650 00:37:50,280 --> 00:37:53,960 Speaker 1: Like we have no idea what the normal pattern in 651 00:37:54,000 --> 00:37:57,920 Speaker 1: the brain is for like um, the like normal serotonin 652 00:37:58,040 --> 00:38:01,200 Speaker 1: release for you know, a normal mood. But it also 653 00:38:01,280 --> 00:38:04,480 Speaker 1: raises these other questions too, Chuck, where it's like, Okay, 654 00:38:04,520 --> 00:38:06,799 Speaker 1: if we figure that out and we figure out how 655 00:38:06,880 --> 00:38:10,680 Speaker 1: to um, how to how to replicate that, why stop there? 656 00:38:10,680 --> 00:38:14,640 Speaker 1: Like why not just make everybody happier than we are normally?
Yeah, 657 00:38:14,680 --> 00:38:18,080 Speaker 1: which brings in the whole free will debate, which has 658 00:38:18,160 --> 00:38:20,960 Speaker 1: been around since the dawn of time. And it also, 659 00:38:21,239 --> 00:38:23,000 Speaker 1: um, and Ed does a great job of kind of 660 00:38:23,000 --> 00:38:25,279 Speaker 1: wrapping it up and pointing out that it kind of makes 661 00:38:25,280 --> 00:38:28,840 Speaker 1: you think about things like, are we just 662 00:38:28,920 --> 00:38:32,799 Speaker 1: a bag of cells that can be, uh, manipulated by 663 00:38:32,800 --> 00:38:36,399 Speaker 1: a flashing light? Like, uh, is that what you're saying? 664 00:38:36,480 --> 00:38:40,239 Speaker 1: Yes, we are. Like, is that what happiness is? Like, 665 00:38:40,360 --> 00:38:43,200 Speaker 1: you think happiness is seeing your dog when you get 666 00:38:43,200 --> 00:38:46,040 Speaker 1: home from work and getting those licks. But if those 667 00:38:46,040 --> 00:38:49,120 Speaker 1: are just synapses firing, that's a very, I mean, that's 668 00:38:49,200 --> 00:38:51,360 Speaker 1: scientifically what's going on, but it is a very cold 669 00:38:51,600 --> 00:38:54,080 Speaker 1: and inhumane way to look at things. I think I 670 00:38:54,760 --> 00:38:56,799 Speaker 1: disagree with that. I think it's just a, that's a 671 00:38:56,840 --> 00:38:59,040 Speaker 1: better understanding of what's going on. But I don't think 672 00:38:59,040 --> 00:39:03,040 Speaker 1: it undermines the happiness you're experiencing. I think for a 673 00:39:03,040 --> 00:39:05,719 Speaker 1: lot of people it might. Well, yeah, I mean, 674 00:39:05,760 --> 00:39:07,600 Speaker 1: it's not like I can't see how it wouldn't. But 675 00:39:07,680 --> 00:39:10,880 Speaker 1: to me, it's like, no, I mean, you're still experiencing happiness. 676 00:39:11,080 --> 00:39:13,960 Speaker 1: The happiness is still important to you.
Happiness is still 677 00:39:14,000 --> 00:39:17,160 Speaker 1: the point of life. This is just understanding the mechanism 678 00:39:17,239 --> 00:39:20,200 Speaker 1: that we experience happiness by. That's true. And I have 679 00:39:20,280 --> 00:39:23,160 Speaker 1: seen you around dogs, and you constantly are just saying, 680 00:39:23,760 --> 00:39:26,839 Speaker 1: I'm a bag of neurons firing at once. It's like 681 00:39:27,200 --> 00:39:30,160 Speaker 1: Francis Crick, the guy who co-discovered DNA. He had 682 00:39:30,400 --> 00:39:33,680 Speaker 1: a book in the nineties called The Astonishing Hypothesis. I 683 00:39:33,680 --> 00:39:35,200 Speaker 1: know we've talked about it before, but he had this 684 00:39:35,239 --> 00:39:37,960 Speaker 1: famous quote where he said, you're nothing but a pack 685 00:39:38,000 --> 00:39:41,919 Speaker 1: of neurons. And I mean, like, to me, that's 686 00:39:41,920 --> 00:39:46,120 Speaker 1: a really good way of maintaining a positive outlook on things. 687 00:39:46,120 --> 00:39:48,560 Speaker 1: It's like, no matter how bad things get, it's just 688 00:39:48,760 --> 00:39:52,759 Speaker 1: neurotransmitters in your head that are going haywire or that 689 00:39:52,800 --> 00:39:56,840 Speaker 1: are doing right. Well, that's, that's the 690 00:39:56,920 --> 00:39:59,839 Speaker 1: reason to do all this, is to regain control over 691 00:39:59,840 --> 00:40:03,319 Speaker 1: it when it's not functioning correctly, and then making things 692 00:40:03,320 --> 00:40:06,680 Speaker 1: even better than they are normally, naturally. There's no written 693 00:40:06,800 --> 00:40:09,799 Speaker 1: law that says if we figure out how to make 694 00:40:09,840 --> 00:40:13,919 Speaker 1: ourselves happier, that we shouldn't do that. As a matter 695 00:40:13,920 --> 00:40:16,600 Speaker 1: of fact, basically every moral code there is says we 696 00:40:16,640 --> 00:40:21,080 Speaker 1: should do that.
If we can be happier, let's figure 697 00:40:21,080 --> 00:40:23,680 Speaker 1: out how to be happier. Yeah. I think the other 698 00:40:23,719 --> 00:40:28,640 Speaker 1: thing it makes me think about, slippery slope wise, is, um, 699 00:40:28,920 --> 00:40:33,239 Speaker 1: will people cease to do the things that they do 700 00:40:33,360 --> 00:40:36,759 Speaker 1: to make them happy if they can simply touch a 701 00:40:36,800 --> 00:40:39,600 Speaker 1: button to do so? Yeah, that's called wireheading. And 702 00:40:39,640 --> 00:40:43,000 Speaker 1: that's actually a big problem with artificial intelligence, is, um, 703 00:40:43,040 --> 00:40:45,840 Speaker 1: they're saying, like, Okay, if we train artificial intelligence to 704 00:40:45,840 --> 00:40:48,840 Speaker 1: do something based on a reward, the artificial intelligence is just 705 00:40:48,880 --> 00:40:51,040 Speaker 1: gonna go figure out how to go right to the reward. 706 00:40:51,120 --> 00:40:53,680 Speaker 1: But it's not, it's going to circumvent that, um. And 707 00:40:53,719 --> 00:40:56,080 Speaker 1: that's, that's a great question too, where if we start 708 00:40:56,160 --> 00:40:59,560 Speaker 1: to become like digital consciousnesses, right, where we migrate online 709 00:40:59,600 --> 00:41:02,080 Speaker 1: and we shed our bodies and our consciousness just 710 00:41:02,160 --> 00:41:04,799 Speaker 1: exists in digital form, then all that stuff will be 711 00:41:04,840 --> 00:41:08,040 Speaker 1: available to us. And it does make you think, like, okay, 712 00:41:08,080 --> 00:41:11,880 Speaker 1: if our existence is just digital, there's no purpose 713 00:41:11,920 --> 00:41:14,680 Speaker 1: to it except to experience pleasure. Is there anything wrong 714 00:41:14,719 --> 00:41:17,239 Speaker 1: with just sitting around experiencing pleasure all the time? Or 715 00:41:17,239 --> 00:41:19,520 Speaker 1: do we need more than that? I don't know.
That's 716 00:41:19,520 --> 00:41:23,359 Speaker 1: a, that's a next, next-level question, if you ask me. Yeah, 717 00:41:23,400 --> 00:41:24,839 Speaker 1: I mean, it kind of, did you ever see WALL-E? 718 00:41:24,960 --> 00:41:28,680 Speaker 1: Yes. Sort of like that, that could be the future. 719 00:41:28,760 --> 00:41:30,400 Speaker 1: Like why go out and take a walk if you're 720 00:41:30,400 --> 00:41:34,000 Speaker 1: feeling down, to get some sunshine on your face, if 721 00:41:34,040 --> 00:41:37,439 Speaker 1: you can just press a button to do the same thing? Yeah, 722 00:41:37,480 --> 00:41:40,279 Speaker 1: and like in that movie, it's, it's like there's, well, 723 00:41:40,320 --> 00:41:43,759 Speaker 1: there's something inherently wrong with that. But I don't know, man, 724 00:41:43,880 --> 00:41:45,400 Speaker 1: because like if you think about it, when you go 725 00:41:45,440 --> 00:41:47,759 Speaker 1: outside and you take a walk, you feel better, you 726 00:41:47,960 --> 00:41:51,000 Speaker 1: feel like more positive. If you can get that without 727 00:41:51,080 --> 00:41:53,440 Speaker 1: doing the walk, too, if you can get everything 728 00:41:53,560 --> 00:41:56,440 Speaker 1: from a walk without having to go on a walk, 729 00:41:56,600 --> 00:41:58,640 Speaker 1: do you still need to go on a walk? Well, 730 00:41:58,719 --> 00:42:01,520 Speaker 1: including like the benefits to your health and body? Yes. 731 00:42:01,680 --> 00:42:04,239 Speaker 1: If you could get every single scrap of benefit that 732 00:42:04,280 --> 00:42:07,360 Speaker 1: you can get from a walk digitally or somehow without 733 00:42:07,400 --> 00:42:10,319 Speaker 1: actually going on a walk, do you need to go 734 00:42:10,360 --> 00:42:12,640 Speaker 1: on a walk? I say yes, but you and I 735 00:42:12,680 --> 00:42:14,960 Speaker 1: are different. No, no, I'm with you. I still say 736 00:42:15,040 --> 00:42:18,319 Speaker 1: yes as well, but I can't, I can't explain why.
Okay, yeah, yeah, 737 00:42:18,320 --> 00:42:21,360 Speaker 1: I'm not just like this full transhumanist guy. I definitely 738 00:42:21,400 --> 00:42:23,279 Speaker 1: have questions about the whole thing too. I think you 739 00:42:23,360 --> 00:42:26,799 Speaker 1: just spilled some bong water on the carpet that's never 740 00:42:26,840 --> 00:42:29,000 Speaker 1: gonna come out. I'm gonna stop it up with a 741 00:42:29,160 --> 00:42:35,160 Speaker 1: Febreze dryer sheet. Remember that? Remember when, when kids, 742 00:42:35,239 --> 00:42:37,200 Speaker 1: you saw kids do that at the dorms? Yeah, I 743 00:42:37,440 --> 00:42:39,600 Speaker 1: don't know if kids had Febreze dryer sheets when I 744 00:42:39,640 --> 00:42:42,120 Speaker 1: was in college, they didn't exist yet. Okay. Or you 745 00:42:42,120 --> 00:42:46,200 Speaker 1: mean just like the Bounce sheets? Yes, oh sure. Yeah, 746 00:42:46,320 --> 00:42:51,919 Speaker 1: yeah, I've seen, I've seen those old tricks. It's hilarious. Okay. Well, 747 00:42:51,960 --> 00:42:54,200 Speaker 1: you got anything else about optogenetics? No, it's 748 00:42:54,239 --> 00:42:58,400 Speaker 1: pretty, pretty fascinating stuff. Yeah, we'll see where it goes. Agreed. Actually, 749 00:42:58,480 --> 00:43:00,239 Speaker 1: we probably won't see where it goes in our lifetime. 750 00:43:00,280 --> 00:43:03,120 Speaker 1: But I don't know, man. I suspect that while we're alive, 751 00:43:03,160 --> 00:43:05,520 Speaker 1: things are going to change quite a bit. We'll live 752 00:43:05,560 --> 00:43:07,560 Speaker 1: to see a lot of this stuff. I'm gonna check 753 00:43:07,560 --> 00:43:11,279 Speaker 1: in with you in thirty-five years. You'll be 754 00:43:11,320 --> 00:43:13,480 Speaker 1: sitting across the desk from me still, when we get 755 00:43:13,480 --> 00:43:17,040 Speaker 1: inducted into the Podcasting Hall of Fame. You're on that, 756 00:43:17,080 --> 00:43:19,919 Speaker 1: aren't you?
You're gonna stroll into that room wearing your, 757 00:43:20,560 --> 00:43:24,919 Speaker 1: your VR headset, pressing your little dopamine button, right, talking 758 00:43:24,920 --> 00:43:28,239 Speaker 1: about how great life is, right, just wireheaded to 759 00:43:28,280 --> 00:43:31,759 Speaker 1: the gills, right. So if you want to know more 760 00:43:31,760 --> 00:43:35,800 Speaker 1: about optogenetics, well, go start reading about it. It's pretty interesting stuff. 761 00:43:36,280 --> 00:43:39,359 Speaker 1: And since I said that, it's time for listener mail. 762 00:43:42,000 --> 00:43:44,680 Speaker 1: Listener mail. I think it was me who goofed up 763 00:43:44,719 --> 00:43:49,400 Speaker 1: on the postal, the Going Postal ep, when, I think offhandedly, 764 00:43:49,400 --> 00:43:53,680 Speaker 1: when they were talking about the Califano Commission, about how 765 00:43:53,719 --> 00:43:57,600 Speaker 1: much money we spent, I think I said tax dollar money. Yeah, 766 00:43:57,640 --> 00:44:00,760 Speaker 1: like a dope, because we've covered the U.S. Postal 767 00:44:00,840 --> 00:44:04,040 Speaker 1: Service and we know that that is not the case. 768 00:44:04,200 --> 00:44:07,279 Speaker 1: And this is from Peter, among many others. Hey guys, 769 00:44:07,320 --> 00:44:08,759 Speaker 1: want to start off by saying how much I love 770 00:44:08,840 --> 00:44:11,000 Speaker 1: the show. You always do a great job researching the 771 00:44:11,000 --> 00:44:14,160 Speaker 1: subjects you talk about. However, I've got 772 00:44:14,160 --> 00:44:16,680 Speaker 1: a small bone to pick. In your recent episode Why 773 00:44:16,760 --> 00:44:19,160 Speaker 1: Postal Employees Go Postal, you talked about how the US 774 00:44:19,200 --> 00:44:22,240 Speaker 1: Postal Service spent four million tax dollars on the Joseph 775 00:44:22,280 --> 00:44:26,000 Speaker 1: Califano Commission.
While Congress does still control the USPS budget, 776 00:44:26,360 --> 00:44:28,839 Speaker 1: it receives no funding from them at all, and has 777 00:44:28,880 --> 00:44:32,640 Speaker 1: not since the early nineteen eighties. The USPS operates solely 778 00:44:32,680 --> 00:44:36,600 Speaker 1: on the money they make from stamps and packages. Zero 779 00:44:36,719 --> 00:44:40,160 Speaker 1: tax dollars. Anyway, thanks for the amazing content. May you 780 00:44:40,239 --> 00:44:42,200 Speaker 1: keep doing so for many years to come. That is 781 00:44:42,239 --> 00:44:46,160 Speaker 1: from Peter and many, many others. And by the way, 782 00:44:46,200 --> 00:44:49,960 Speaker 1: we heard from a lot of people, uh, postal employees 783 00:44:50,000 --> 00:44:52,800 Speaker 1: or people whose family members are or were in the 784 00:44:52,840 --> 00:44:56,240 Speaker 1: Postal Service, and we got a range of things from, 785 00:44:56,400 --> 00:44:58,520 Speaker 1: you guys are crazy, my post office is great, there's 786 00:44:58,520 --> 00:45:02,680 Speaker 1: no toxic environment, to people saying, oh, there absolutely is 787 00:45:02,719 --> 00:45:05,440 Speaker 1: a very toxic environment. Yeah, like it's even worse than 788 00:45:05,480 --> 00:45:08,240 Speaker 1: you guys said. Yeah. So I think for the people 789 00:45:08,239 --> 00:45:10,160 Speaker 1: that wrote in that said that was not the case, 790 00:45:10,239 --> 00:45:13,000 Speaker 1: then I am very happy that you work in a 791 00:45:13,080 --> 00:45:15,840 Speaker 1: great place that has a great environment. But it seems 792 00:45:15,880 --> 00:45:19,480 Speaker 1: like there is a range there. Right, that's the nicest 793 00:45:19,480 --> 00:45:23,200 Speaker 1: way to say it. Yeah. Uh, well, that was Peter, right? Peter. 794 00:45:23,719 --> 00:45:25,960 Speaker 1: Thanks a lot, Peter. That was a very nice way 795 00:45:26,000 --> 00:45:27,319 Speaker 1: to put it.
And if you want to get in 796 00:45:27,320 --> 00:45:30,799 Speaker 1: touch with us like Peter did, you can go and 797 00:45:31,000 --> 00:45:33,759 Speaker 1: send us an email. Send it off to stuff podcast 798 00:45:33,960 --> 00:45:39,200 Speaker 1: at iHeartRadio dot com. Stuff You Should Know is 799 00:45:39,239 --> 00:45:41,839 Speaker 1: a production of iHeartRadio's How Stuff Works. For more 800 00:45:41,880 --> 00:45:44,319 Speaker 1: podcasts from iHeartRadio, check out the iHeartRadio app, 801 00:45:44,400 --> 00:45:47,040 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.