Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? Yes, I am excited. That is why I had the crazy high energy introduction for this episode. That's because I have two phenomenal guests on today's episode. They are the hosts of Daniel and Kelly's Extraordinary Universe. There are two brilliant scientists here to talk science and tech and depictions of science in popular media and how they always get it one hundred percent right all the time. It's Daniel and it's Kelly. Welcome to Tech Stuff.

Speaker 2: Thanks very much. Thank you.

Speaker 3: I wish you could call me every day to cheer me up with, you know, a nice introduction just like that.

Speaker 1: Oh, I'd be happy to, I'd be happy to. My rates are very reasonable, and pretty soon I'm going to be looking for a gig. I don't know if you have heard this, but I'm actually stepping away from hosting Tech Stuff in the new year.

Speaker 3: Oh, what are you going to do instead?

Speaker 1: There's a giant, cloudy question mark in my near future. Actually, I will be... my listeners know that I'm stepping away. I did an episode talking about it. But yeah, I've been hosting Tech Stuff for sixteen and a half years. I've got more than twenty three hundred episodes under my belt, and I thought it was time for someone else to talk about tech for a while. And so I'm stepping back in early January, and I will be focusing on executive producer duties moving forward, potentially hosting the occasional episode and/or show in the future. Nothing is set in stone, but I'm excited to be able to kind of shift some gears.

Speaker 3: Happy retirement.

Speaker 1: Thank you. I mean, I'll still be working like a crazy person. It just won't be... I just won't be creating three episodes a week. But I'm so glad to have you here. Daniel, welcome back to the show. I don't know if you remember, but years ago you did a little guest appearance on Tech Stuff. And I'm sure you don't remember this, but I do, because it's an important moment in my life. You named me an honorary engineer in that episode, and I've been holding onto that title like with a death grip ever since.

Speaker 3: He hasn't given me that title.

Speaker 2: Yeah, I'm not sure that title is something you should be proud of. You know, scientists and engineers are usually sort of, you know, tooth and nail against each other, so I'm not sure I meant that in a positive way.

Speaker 1: Oh, that's fair, that's fair. I think we were chatting about how I think of engineers as people who view the world as a series of problems that need to be solved, and by the end of it you said I was an honorary engineer, which made sense because I do have an Android phone, and I feel like that's a device made by engineers for engineers.

Speaker 3: That's like sixty percent of it.

Speaker 1: Yeah, yeah, yeah, yeah. Like, the iPhone is incredibly intuitive, and the Android... it does stuff.

Speaker 3: So yeah, well, my engineer dad has an Android, so that's totally consistent.

Speaker 1: It tracks, right? Well, there we go. That's not confirmation bias or anything. We're just going to go with that. Yes. So here on Tech Stuff, we have this kind of underlying philosophy which does not directly relate to tech. It relates to the direction of the show, which is, I say, it's a combination of a creative thinking approach, it's compassion, and it's critical thinking. And I'm curious if you think of a similar philosophy for your show.

Speaker 2: I think our show is all about curiosity and excitement, because we view the whole universe as like a big mystery to unravel, and every new little piece of truth that we pry away at the truth mines, you know, where we go to work every day, is an exciting discovery. It's always fantastic. Whatever the universe chooses to reveal to us, it's always amazing.

Speaker 3: Yeah, I think enthusiasm is huge. We're really excited about the information we've managed to extract from the Earth and all the cool stuff that's left to figure out.

Speaker 1: Yeah. I've been listening to your show, and I just want to say, I also want to congratulate you on your incredible sense of timing, because we're heading up to Thanksgiving, and the most recent episode I listened to was all about cannibalism. And I just felt that you... you really... you planned those thematic shows really well.

Speaker 3: Yeah, you know, we put a lot of thought into each episode, and we do have, like, an actual Thanksgiving episode coming out on Thanksgiving. Where we... Daniel... so Daniel talks about cannibalism a lot, even though he's not the biologist. Did you manage to get a reference to cannibalism into that episode a bit?

Speaker 2: Absolutely, yes. Cannibalism and aliens. I always got to work those two angles into every episode somehow.

Speaker 3: Yeah. So it's enthusiasm, creativity, cannibalism, aliens. That's what our podcast is.

Speaker 1: It's kind of amazing that you didn't name the show What's Eating You.

Speaker 3: I had a paper published called What's Gotten Into You, and it was all about how parasites change behavior. But I guess it's not What's Eating You, it was What's Gotten Into You. Anyway, I felt very good about that title.

Speaker 1: Well, and that was kind of like one of the things that sort of was the genesis of my idea for this episode: to talk about, you know, your work in science and then how you view the media depiction of science. You know, I'm sure there's, like, part of you, correct me if I'm wrong, but I'm sure there's part of you that just appreciates: oh well, obviously they had to do this for the narrative to work. Like, this is a storytelling convention, it's required. They're not presenting this as if it's hard science. And so therefore, if they do the Arthur C. Clarke thing, where they just use science as a way to stand in for magic, that's okay. And then there's probably part of you that's like, gosh darn it, that's not what particle accelerators do, and I wish they would stop, because people are getting the wrong idea. So, Kelly, one of the ones I mentioned to you was the show The Last of Us, which actually... they did a fairly decent job in talking about how the fungal parasites that can affect insects and control them and their behavior to some extent, how that really wouldn't translate to more complex life forms. They actually go so far as to say that in the intro, so they lampshaded it quite a bit, right? They said, this is not really possible, and yet here it happened in the show, which of course dates back to the actual video game that inspired that series. So when you see things like that, what's your reaction, having actually written about these kinds of parasites that do alter behaviors?

Speaker 3: My philosophy in general is that, you know, fiction is... it's just a totally different thing. And as long as they create worlds that are consistent, and then they stick with the rules that they've made for their world, I'm cool with that. The only thing that gets me grumpy is when people, like, change their rules halfway through. Like, I hated Lost, because it was like, I thought we were in the real world, and now there's, like, smoke monsters, and, like, yeah. Anyway, so I thought The Last of Us was great.

Speaker 1: Oh gosh, I have a feeling that you and I would have some of the best post-viewing coffee sessions for various shows and movies, because you have hit upon one of my most passionate topics, which is consistency and rules. As much as I enjoyed the Buffy the Vampire Slayer television show when it first aired, I got consistently more upset with that show as it went on, as I felt that it was violating the rules that it itself had set up in previous seasons. And meanwhile, all my friends who were Buffy fans were like, why are you so upset about this? I'm like, it's the easiest thing in the world to be consistent. You have to go out of your way to be inconsistent.

Speaker 3: We should get together for coffee, because, you know, I feel like often the "you need to be consistent about the rules" people, like, aren't the first ones to invite to the parties, right? So we have to stick together.

Speaker 1: Yes, we're the ones that somehow get overlooked when the invitations go out. I don't understand it either.

Speaker 3: It's not a fair world we live in.

Speaker 1: It is not.

Speaker 2: And you know, if you're watching, like, a murder mystery, everybody expects you to follow the rules, right? Like, if there was a clue laid in act one, you expect that to still be relevant in act three. But somehow if it's science fiction, it's like, hey, we can just throw those rules out the window. You know, let's be fair about the standards we apply.

Speaker 1: Yes, absolutely. Same thing's true for horror movies, by the way. If your horror movie is predicated upon information purposefully withheld from the audience, you can't expect me to be wowed when you have your big reveal. You need to set that stuff up in subtle ways that I don't pick up on, but then can appreciate at the end. Yes, amen. I say that as someone who just watched a horror movie this past weekend, and I was like, oh, that's... that's not Chekhov's gun, but that is Chekhov's wooden plank with nails sticking out of it. People who have seen it now know what I'm talking about. Daniel, like, same sort of thing for you. I mean, you've obviously worked on some of the most famous particle physics projects in the world, and not only have we seen interesting depictions of that in fiction media, but obviously the news media had all sorts of speculation as to what was going to happen once the Large Hadron Collider was operating at full power. People may not remember that, but there were so many stories about everything from time traveling conspiracy theories involving birds dropping twigs down ventilation shafts, to some sort of message from the future being sent back saying, for the love of all that's holy, don't turn it on. What was your reaction as all this was unfolding?

Speaker 2: I thought it was sort of amazing and hilarious, but also terrifying. I mean, it's always fun to see your own work depicted in the popular media. And you know, Stephen Colbert, when he was still on The Daily Show, did a whole bit about whether we should turn on the LHC, and understanding and probability. And you know, it's not safe for work, but people should definitely go check that out, because it's really fun. But it's also a little terrifying to see these concepts that you've worked on, these projects that you're passionate about, sort of leave your control and enter the broader cultural conversation, where you don't really necessarily have a voice to combat the misinformation. And there's a lot of well meaning misinformation out there, people who try to cover this stuff and just don't get the details right. And I don't have, or at least didn't at the time have, a platform to be like, hey, actually, this is how it works. And I don't want to throw cold water on your ideas, because the reality is much more interesting and fascinating, right? That's the thing that frustrates me, is when they get the story wrong, but the real story is even more exciting. It's even more clickbaity than the misconceptions.

Speaker 1: When you reach a certain level with science, obviously, it gets to a point where there is no easy way to communicate the topic to a general audience. You're either going to have to spend an inordinate amount of time setting it up and building upon a foundation and hoping that your audience stays with you for the entirety of that, or you run the risk of oversimplifying, hoping that they get the gist of it, knowing that you're going to be leaving out some important nuance that really is needed to have a full understanding, or even a working understanding, of the topic. That's something that I struggle with all the time on Tech Stuff, right? Like, there are tech topics that are really complex, and how do you approach discussing this in a way that is, I think, responsible, so that your audience has an understanding? And of course I always advocate that people look into things, read up more about them if they're interested, do more exploration, because there's only so much I can say in an hour long episode. But you know, that still weighs on me. That being said, sometimes there are layups out there, where you'll see something portrayed and you'll just say, well, that's just wrong. That's just wrong. It's not even remotely in the realm of possibility. I'm reminded... I wish I could remember which procedural it was. It was one of those cop procedural shows where there's always something about hacking. So there was a scene where there are two characters simultaneously typing on the same keyboard while trying to counteract a hacker who's trying to get into a system. And I thought, in what world can a keyboard accept two separate forms of input at the same time and have it mean anything? It would just be like me smashing my hands randomly on keys. This makes no sense. I'm like, it couldn't have made sense when they shot it. Did they do this on purpose, as, like, a wind up? So there are moments like that that stand out.

Speaker 3: As a parasitologist, a person who studies parasites: there was an episode of Grey's Anatomy where someone had neurocysticercosis. So it's this big fluid-filled sac filled with tapeworms in your brain that causes nervous system problems. Yeah, and it's a very delicate procedure to remove that sac, because if it breaks, you have this crazy, like, neuroimmune response to the sac, and it's really bad. But they pulled it out, and in the reveal it was a nematode. And all right, so I get upset, no one else would care: that's a totally different phylum than the actual parasite that causes the problem. But it was, like, a squiggly worm that fell back in, and they were, like, reaching around trying to get it, and I'm like, you're screwing up the brain! And, like, actually it's a sac, it's not, like, a little nematode. Anyway, everything was wrong about that. That kept me up at night.

Speaker 1: See, now that makes me think of The Adventures of Buckaroo Banzai, where there's a moment early in that film where Buckaroo Banzai, who is a rocket scientist neurosurgeon rock star... wow. Peter Weller played him. Fantastic. Very slow, nineteen eighties science fiction movie. John Lithgow plays a crazy alien in it. If you haven't seen it, it's worth watching. But it is slow paced and convoluted.

Speaker 3: You're not selling it well.

Speaker 1: I have to warn you, because I introduced a friend of mine, a millennial, to this film, and she was like, that was the slowest, most confusing science fiction film I've ever seen. I'm like, there's a rock star neuroscientist rocket scientist race car driver as the main character. How could you not love it? But there's a moment where they're doing brain surgery, where that character says to his surgical assistant, no, no, no, don't pull on that, you don't know what it's connected to. And I'm like, okay, well, that seems realistic to me. That seems like a realistic depiction of brain surgery. I think I'm okay with that. Daniel, are there any, like, depictions that you have seen that kind of stand out in your mind as: this might have been a very creative way to tell a story, but it was a terrible way to leverage actual science that you are familiar with?

Speaker 2: Well, I have a lot of respect for science fiction authors, and I get what they're going for. Often they read some new idea or they hear an explanation, and then it inspires some story they imagine that they could tell only in some alternative universe. And I agree with Kelly: it doesn't have to follow the rules of our universe, as long as they are consistent about it. But the thing that does frustrate me is that I think it is possible for a lot of folks out there to understand the subtle nuances of the science, as long as it's explained well. And on our show, at least, we really believe that the science can be communicated clearly and effectively, that it's possible for people to really get these things. I think there's a lot of folks out there who really love science and maybe wanted to do it, but something happened in their life, they didn't end up being a scientist, and so they want to stay up on the details. And they want to hear more than just, like, here's the pop sci story you've heard a lot about quantum mechanics and entanglement. They want to really have it connect for them, and that's what we try to do on the show. But to directly answer your question, one of the things that sort of grinds my gears a lot is when people use, like, two entangled particles to communicate faster than light. And you know, you hear this in pop science all the time, and there are subtle nuances there. Like, there is something happening faster than light when two entangled particles collapse, but you can't use it to send messages across the stars, as much as I wish you could.

Speaker 1: Right, right. That whole concept of the entangled particles... usually we're talking about something like spin, particle spin. So if one's spinning up, the other one's spinning down, and they're entangled together. And no matter how far apart you separate these two particles, until the system collapses, until that entanglement breaks, then you're still going to have one spinning up, one spinning down. They'll always be in that connection. But then once you observe it, it all breaks down, and that connection is severed, and you can't do anything useful with it. Yeah, we've used that discussion when talking about things like quantum computers, when talking about entanglement and superposition and all these really interesting ideas. I'm fascinated with the methods people have created to try and take advantage of these quantum effects. And to me, like, that's one of those areas where I definitely run up against the barriers of my ignorance, right? I understand up to a point, and then after that I'm like, whoa, okay, there was a leap made here that I can't follow. And from a kind of, like, high level concept I can understand, but it's so counterintuitive, based upon just my day to day experience with physics on the macro level, that you might as well be describing a magic spell to me, because it's so... it's so unusual. I mean, I can see why Einstein would call it spooky action at a distance, right? Like, it seems like it doesn't make any real sense.

Speaker 2: And that's the incredible thing about the universe, right? It is this bizarre mystery. It seems like it doesn't make sense. And yet over the years and the decades and the centuries, we have slowly chipped away at the truth and always found some explanation. And that explanation is often not what we expected, not what we thought it was going to be, and mind blowing when we understand it. But amazingly, it is possible to understand this universe. Like, we have no guarantee of that. It could be that eventually the universe runs on some systems, some mathematics, that are beyond our comprehension, but so far we've always been able, with our puny little ape brains, to figure it out.
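To make the no-signaling point above concrete, here is a minimal Python sketch (an editorial illustration, not from the episode). It models only the same-axis "one up, one down" anticorrelation, which a classical shared coin flip can already reproduce; real entanglement requires genuinely quantum statistics, but the conclusion is the same: each side's own record is indistinguishable from noise, so the correlation by itself carries no message.

```python
import random

def entangled_pair():
    """One shared pair: if one side comes out 'up', the other is 'down'."""
    a = random.choice(["up", "down"])
    b = "down" if a == "up" else "up"
    return a, b

trials = [entangled_pair() for _ in range(100_000)]

# Alice's record alone looks like pure coin flips: roughly 50% up, 50% down.
alice_up = sum(1 for a, _ in trials if a == "up") / len(trials)

# The perfect anticorrelation only shows up when the two records are
# brought back together and compared, which happens at light speed or slower.
disagree = sum(1 for a, b in trials if a != b) / len(trials)

print(f"Alice measures 'up' {alice_up:.1%} of the time")  # ~50.0%
print(f"outcomes disagree {disagree:.1%} of the time")    # 100.0%
```

Nothing Bob does changes the statistics of Alice's column on its own, which is why the collapse can't be used as a faster-than-light telegraph.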
Speaker 3: But for some of this stuff, we're still in the process of trying to understand it. And so, you know, you were saying that you get to a point where you're like, oh, my brain just can't make that leap. While talking to Daniel, I've learned that a lot of the places where I was like, oh, my brain can't make that leap... when I talk to Daniel, he's like, oh, we just... we don't know why that happens yet. And I'm like, oh, we just hit up against the limit of everyone's understanding there. And so, you know, I think that's a great thing about science: if it's explained well, you can let people know that, like, no, we all hit a wall eventually.

Speaker 1: We'll be back in just a moment to talk more about science, depictions of science in media, and who knows what else, with Daniel and Kelly, after these messages.

Speaker 1: When I used to write for How Stuff Works, there was a point where I was given the assignment of "How String Theory Works." So I was diving into everything string theory. And I'm an English lit major, y'all. My major was in Shakespeare, and I can rattle Shakespeare off like nobody's business. But you ask me to describe string theory, and we got ourselves a problem.

Speaker 2: I want to hear a sonnet about string theory.

Speaker 1: You know what? Give me a week and I'll have it for you. I'm assuming you mean Shakespearean, not Edwardian or Spenserian. Okay, good, I can do that. I'm an expert at Shakespearean sonnets. Excellent. I remember distinctly, I watched, I think it was a Nova special, about string theory, and I'm taking notes. And I mean, I've got, like, seven, eight pages of handwritten notes, and I'm just sweating bullets watching this, and everyone's really a great communicator on that special. And it gets to one point where one of the experts is being interviewed, and he says, yeah, I think this is the point where I just have to admit I don't understand how it works. I just know that this is what the math says. And I'm like, oh my gosh, if you don't know, what hope is there for me? But that, to me, is fascinating. It's one of the reasons why, by the way, I love doing a tech podcast: technology is science made manifest in devices and gadgets, and it's proof that science works, right? Because if science didn't work, the tech wouldn't work. So I've always pointed at, like, we are constantly surrounded by proof that the scientific approach works, because otherwise the things that we come in contact with on a minute by minute basis wouldn't function.

Speaker 3: Absolutely. What gets me about a lot of tech, though, is that it works, but the science... we don't always even know why it works. I was talking to someone who does deep brain stimulation, so, like, they essentially, you know, electrocute the center of your brain to try to stop the electrical storms that cause seizures. And I asked her, I was like, why does, like, putting electrical circuits through someone's brain stop it? And they're like, we don't know, but it's great that it works. And I was like, wow. So, I mean, sometimes just because engineering works doesn't mean we understand it at the level that we'd like to. But it's great that it works.

Speaker 1: Yeah, that's true. Yeah, there are lots of examples of that. Like, to me, it's interesting that we have great stories going in both directions, right? We have the stories of the people who were real innovators, really phenomenal. They thought outside the box. You know, they're probably building upon work that previous generations had done, but they've done something themselves that has transformed thinking in some way and has really pushed science forward. So the person I love to talk about, and I was actually just chatting about this with my producer the other day: Ada Lovelace. You know, Lord Byron's daughter. Ada Lovelace worked with Charles Babbage, who built the Difference Engine and designed the Analytical Engine. And she was one of the first people, probably the first person that I know of, who came up with the concept of: what if we took things like pictures or music, and we converted that into mathematical statements, and then we used a machine like this to process that? We could create art, we could create music, we could alter existing pieces. And so she was essentially talking about computer programming, you know, generations before anyone else would even be able to do this. And the fact that she was able to ideate around that without there being any lead up... like, as far as I can tell, she just sort of had a eureka moment. And those are so rare. Like, it's easy for us to point at things and say, this is the inventor of such and such, but when you look into it, you're like, oh, actually they were iterating on previous generations of stuff. But as far as I can tell, this is Ada Lovelace coming up with a truly novel idea. And that's one of the things I love most about podcasting: finding those stories, as well as the ones where someone's like, yeah, we didn't know if it would work, and we tried it and it worked. We don't know why it works, but it does seem to work. So we'll figure out the why someday, maybe, but for now, the fact that it works is important enough.
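A tiny sketch of the idea the host attributes to Lovelace: once music is encoded as numbers, a machine can transform it with pure arithmetic. The MIDI-style pitch numbers below are a modern convention chosen only for illustration; Lovelace, of course, had no such encoding.

```python
# A C major scale, C4..C5, written as MIDI-style pitch numbers.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]

def transpose(melody, semitones):
    """A purely arithmetic 'musical' operation: shift every pitch."""
    return [pitch + semitones for pitch in melody]

def retrograde(melody):
    """Another mechanical transformation: play the melody backwards."""
    return melody[::-1]

print(transpose(C_MAJOR, 7))  # the same scale, a fifth higher
print(retrograde(C_MAJOR))    # the scale descending
```

Once the content is numbers, "create music" and "alter existing pieces" reduce to operations a machine can carry out, which is exactly the leap Lovelace made about the Analytical Engine.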
Speaker 2: Yeah, and it really highlights the human side of science. You know, I feel like there's a sandwich there. People feel like, oh, science, maybe it's sort of, like, sterile, or it's intellectual, or it's bigger than people. But you know, it really exists only in a human context, both because of the philosophical side of it, like you're talking about: oh, we understand the math, but what does this really mean, right? We always want to understand science in terms of a story, like, what is this telling us about how the universe works, about what is real? But then also on the other side of it, like, what does it mean for humans to live? How does this technology change our lives? How does it change what it's like to be a human in the world? What does it let us do? What kind of lives can we lead that we couldn't lead before? I mean, there's human stories everywhere on both sides of it, from the philosophy to the engineering.

Speaker 1: Yeah, we're seeing a lot of that too right now. Just a lot of overt discussion about that with the explosion of generative AI, which some people are now saying, like, we might be hitting, like, peak AI, for at least the time being. Not that it won't go any higher, but the level of evolution may be slowing down a bit as we're kind of hitting the limits of what large language models can do.

Speaker 2: Oh my gosh, can I rant about that for a minute? Because I see people saying, like, let's spend a trillion dollars and build a huge, large language model, because it will solve physics. And I'm like, you know what, let's spend a trillion dollars on human physics first and try that. Because trust me, if we spend that much money on physics, we could probably figure the thing out. Like, why do we need to do it with an AI?

Speaker 1: I've got very strong... my listeners know. I don't need to go on another rant for my listeners, but I have very strong feelings about generative AI. I don't think it's an inherently bad technology. I mean, it's, again, going back to Shakespeare: there's nothing either good or bad, but thinking makes it so. This is what AI is: you've got artificial thinking. But I have a lot of strong opinions on it. I even did an episode of Tech Stuff where I titled it "This Episode Was Written by AI," open bracket, "sort of," close bracket. And what I did was, I had... I had it do an episode. Like, I gave it a prompt. ChatGPT is what I used. I gave it a prompt, I told it to make an episode of Tech Stuff, and so it had me inserted in there.

Speaker 2: Wow.

Speaker 1: And the thing that I found the most distressing... first, it got stuff wrong, which is bad, right? But I'm like, well, I get stuff wrong too sometimes. I'm human. It does happen. But the thing that upset me was it invented experts to deliver points of data. So it was as if I had interviewed, I think it was, three or four different people in the transcript that it printed. But none of those people even exist, let alone... I never spoke to anyone, obviously, but they don't even exist. There were these names, and they were given titles and given a position at places. So I tried to verify it. Not a single person existed. And I thought, you're inventing quotes, and then you're attributing those invented quotes to invented people, and you're presenting it without any indication that these are inventions. I mean, this is where we get into, you know, confabulations or hallucinations with AI. And to me, that immediately sets off enormous alarms, which is why I would not want to entrust using large language model based generative AI to tackle the greatest problems in physics. Because ultimately, you don't know at what point it's just creating the statistically likely sentence, right? Like, it's just statistically choosing words to create grammatically correct sentences, and there's no regard as to whether or not there's truth in there.

Speaker 2: Yeah.
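A toy sketch of the mechanism the host is describing: a model that only tracks which word statistically tends to follow which can emit fluent sentences, including confident attributions that never appeared in its training text. This bigram model is purely illustrative; production LLMs are vastly larger, but the "statistically likely next word" objective is the same.

```python
import random
from collections import defaultdict

# Tiny training corpus. Note: dr garcia only ever *wrote* a paper,
# and dr jones only ever *reviewed* a study.
corpus = (
    "the study was written by dr smith . "
    "the study was reviewed by dr jones . "
    "the paper was written by dr garcia . "
).split()

# Learn only which word follows which -- no model of truth at all.
follows = defaultdict(list)
for w1, w2 in zip(corpus, corpus[1:]):
    follows[w1].append(w2)

def generate(start, max_len=8):
    word, out = start, [start]
    for _ in range(max_len):
        word = random.choice(follows[word])  # statistically plausible next word
        out.append(word)
        if word == ".":
            break
    return " ".join(out)

# Can produce e.g. "the study was written by dr jones ." -- grammatical,
# statistically likely, and attributing work to someone who never did it.
print(generate("the"))
```

Every output is locally plausible given the training text, which is exactly why a fabricated citation reads just as confidently as a real one.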
We wrote a book about living in space and 524 00:26:26,080 --> 00:26:27,840 Speaker 3: we had a chapter on food and space, and we 525 00:26:27,840 --> 00:26:29,919 Speaker 3: thought we had found all the books on food and space. 526 00:26:30,359 --> 00:26:32,440 Speaker 3: And after the book came out, we were on chat 527 00:26:32,480 --> 00:26:35,119 Speaker 3: GPT and we were like, tell us the ten books 528 00:26:35,119 --> 00:26:37,840 Speaker 3: about food and space, and it popped out ten because 529 00:26:37,840 --> 00:26:40,199 Speaker 3: we had asked for ten and some of them were 530 00:26:40,240 --> 00:26:41,880 Speaker 3: books we had, and the rest we were like, oh, 531 00:26:41,920 --> 00:26:43,760 Speaker 3: how did we miss these books? And we looked them 532 00:26:43,840 --> 00:26:46,080 Speaker 3: up and they didn't exist. And just like you said, 533 00:26:46,440 --> 00:26:48,760 Speaker 3: we asked, like, well, tell us more about the person 534 00:26:48,800 --> 00:26:50,879 Speaker 3: who wrote this book that doesn't seem to exist, and 535 00:26:50,920 --> 00:26:54,199 Speaker 3: they gave them a bio, they gave them affiliations. And 536 00:26:54,480 --> 00:26:57,040 Speaker 3: these hallucinations are really scary if you're going to rely 537 00:26:57,160 --> 00:26:58,000 Speaker 3: on it for research. 538 00:26:58,080 --> 00:27:00,920 Speaker 2: Oh yeah, yeah, nobody I ever should rely on it 539 00:27:01,000 --> 00:27:03,520 Speaker 2: for research. It's not a research tool, right, Yeah. It 540 00:27:03,640 --> 00:27:06,840 Speaker 2: generates the facsimile of research. It doesn't do any research, 541 00:27:06,840 --> 00:27:10,040 Speaker 2: doesn't do any thinking. There's no reasoning there, and sometimes 542 00:27:10,040 --> 00:27:12,639 Speaker 2: that's very useful for research. I use Genai in my 543 00:27:12,720 --> 00:27:16,320 Speaker 2: research all the time because it's very good at generating examples, 544 00:27:16,320 --> 00:27:19,760 Speaker 2: and sometimes you need something which can rapidly generate examples. 545 00:27:20,040 --> 00:27:22,719 Speaker 2: But you have to know exactly what you're asking to do, yeah, 546 00:27:22,760 --> 00:27:23,680 Speaker 2: because it can't think. 547 00:27:23,880 --> 00:27:27,080 Speaker 1: Yeah. I've used it to create like a bullet point 548 00:27:27,240 --> 00:27:30,240 Speaker 1: of a bullet list of summary points from like really 549 00:27:30,320 --> 00:27:33,159 Speaker 1: long articles, and then I went back and checked to 550 00:27:33,200 --> 00:27:35,600 Speaker 1: make sure that what it was creating was reflected in 551 00:27:35,600 --> 00:27:38,359 Speaker 1: the articles, and in that case, at least that specific case, 552 00:27:38,640 --> 00:27:41,840 Speaker 1: it was whether or not that's ninety nine times out 553 00:27:41,840 --> 00:27:43,560 Speaker 1: of one hundred or one hundred times out of one hundred. 554 00:27:43,600 --> 00:27:46,800 Speaker 1: I can't say. In my one use case, which is 555 00:27:46,840 --> 00:27:50,600 Speaker 1: a terribly small sample size, it did work out. But 556 00:27:50,760 --> 00:27:53,080 Speaker 1: I am not naive enough to suggest that that means 557 00:27:53,080 --> 00:27:55,159 Speaker 1: it's going to work every single time you use it. 558 00:27:55,440 --> 00:27:57,159 Speaker 1: But yeah, same sort of thing. 
I've used it a 559 00:27:57,160 --> 00:28:00,480 Speaker 1: couple of times for the purposes of like organizing thoughts, 560 00:28:00,600 --> 00:28:02,960 Speaker 1: that kind of thing, because otherwise I get, as you 561 00:28:02,960 --> 00:28:06,320 Speaker 1: can probably tell from this episode, real Lucy Goosey with 562 00:28:06,520 --> 00:28:09,719 Speaker 1: my approach, my approach to podcasting. 563 00:28:09,280 --> 00:28:11,600 Speaker 2: Well, fundamentally, I think that humans have to play a 564 00:28:11,680 --> 00:28:14,640 Speaker 2: role in science because science is a human thing. It's 565 00:28:14,720 --> 00:28:17,800 Speaker 2: like by people and for people. Right, we have questions 566 00:28:17,840 --> 00:28:21,000 Speaker 2: we want answers to and then we want those answers 567 00:28:21,040 --> 00:28:23,600 Speaker 2: to make sense to us. So it doesn't make any 568 00:28:23,600 --> 00:28:25,840 Speaker 2: sense to me to take humans out of that equation, 569 00:28:26,400 --> 00:28:27,960 Speaker 2: you know, to say it to an AI like go 570 00:28:28,000 --> 00:28:31,640 Speaker 2: figure out the universe, Like I want to understand the universe, 571 00:28:32,080 --> 00:28:33,919 Speaker 2: and it has to make sense to me and it 572 00:28:33,960 --> 00:28:37,840 Speaker 2: has to answer my questions. So humanity is an important 573 00:28:37,840 --> 00:28:38,520 Speaker 2: part of science. 574 00:28:38,840 --> 00:28:40,880 Speaker 3: What if it answered the question and explained it to 575 00:28:40,920 --> 00:28:42,960 Speaker 3: you clearly, why would it not be okay that AI 576 00:28:43,040 --> 00:28:43,680 Speaker 3: got the answer. 577 00:28:44,560 --> 00:28:46,720 Speaker 2: It would be fine if AI found the answer, but 578 00:28:46,960 --> 00:28:49,080 Speaker 2: it's I got to ask the questions and I have 579 00:28:49,160 --> 00:28:50,520 Speaker 2: to be satisfied with the answer. 580 00:28:50,680 --> 00:28:53,800 Speaker 1: I would like to refer both of you to the 581 00:28:54,240 --> 00:28:57,320 Speaker 1: great nonfiction work of The Hitchhiker's Guide to the Galaxy, 582 00:28:58,120 --> 00:29:01,160 Speaker 1: in which we create an incredibly colligent computer that tells 583 00:29:01,200 --> 00:29:03,120 Speaker 1: us the answer to life, the universe, and everything is 584 00:29:03,160 --> 00:29:06,400 Speaker 1: forty two, and then spends the next eternity trying to 585 00:29:06,400 --> 00:29:07,880 Speaker 1: figure out what the question is. 586 00:29:09,400 --> 00:29:09,920 Speaker 2: Exactly. 587 00:29:09,920 --> 00:29:12,280 Speaker 1: But that seems exactly like what you were talking about there, 588 00:29:12,360 --> 00:29:14,479 Speaker 1: Daniel is like, we don't want to create a world 589 00:29:15,280 --> 00:29:18,280 Speaker 1: that is ideal for AI. We want to make sure 590 00:29:18,360 --> 00:29:20,640 Speaker 1: that we continue to work on a world that's ideal 591 00:29:20,680 --> 00:29:23,760 Speaker 1: for humans, which involves making some tough choices like maybe 592 00:29:23,760 --> 00:29:26,960 Speaker 1: not using so much AI that's incredibly hungry for all 593 00:29:27,000 --> 00:29:31,560 Speaker 1: the electricity, and maybe maybe going down that pathway is 594 00:29:31,600 --> 00:29:35,520 Speaker 1: not the most productive. So, Kelly, I'm curious. I'm sure 595 00:29:35,680 --> 00:29:39,360 Speaker 1: you must have read and or seen The Martian. Oh yeah, 596 00:29:39,400 --> 00:29:39,960 Speaker 1: what's your take? 
597 00:29:40,000 --> 00:29:40,240 Speaker 1: On that. 598 00:29:40,360 --> 00:29:42,520 Speaker 1: I'm curious because I remember when I read it, I thought, 599 00:29:42,720 --> 00:29:48,280 Speaker 1: this feels like it's fairly realistic, maybe with a couple 600 00:29:48,320 --> 00:29:49,520 Speaker 1: of caveats. 601 00:29:50,080 --> 00:29:52,959 Speaker 3: Yeah, I think it's fairly realistic with a couple of caveats. 602 00:29:53,160 --> 00:29:55,360 Speaker 3: And again, like I think, the rules are what matters, 603 00:29:55,400 --> 00:29:57,760 Speaker 3: and Andy Weir was very consistent with 604 00:29:57,800 --> 00:29:59,920 Speaker 3: the rules of the world he created. So Mars's 605 00:30:00,000 --> 00:30:02,920 Speaker 3: atmosphere is one percent of Earth's. That's enough to support 606 00:30:03,000 --> 00:30:05,920 Speaker 3: dust storms that engulf the entire planet, but it's probably 607 00:30:05,920 --> 00:30:08,800 Speaker 3: not enough to knock over their giant rocket, because one 608 00:30:08,840 --> 00:30:11,160 Speaker 3: percent atmosphere is just not enough. And that's the premise 609 00:30:11,200 --> 00:30:13,960 Speaker 3: that sets up the whole movie. Yep. And additionally, Mark 610 00:30:14,000 --> 00:30:19,560 Speaker 3: Watney is growing potatoes in Martian soil and his own feces. Yes, Martian soil, 611 00:30:19,680 --> 00:30:20,600 Speaker 3: it's called regolith. 612 00:30:20,760 --> 00:30:21,160 Speaker 1: Yeah. 613 00:30:21,280 --> 00:30:24,040 Speaker 3: That dirt on the surface is laced with perchlorates, which 614 00:30:24,040 --> 00:30:27,520 Speaker 3: are endocrine-disrupting chemicals that mess up your metabolism. So 615 00:30:27,960 --> 00:30:31,320 Speaker 3: if he had been growing his potatoes directly in there, 616 00:30:31,360 --> 00:30:34,160 Speaker 3: he would have been massively poisoning himself. In addition, all 617 00:30:34,200 --> 00:30:36,560 Speaker 3: the like fecal material he had in there, he'd 618 00:30:36,560 --> 00:30:37,800 Speaker 3: have to be careful to clean that up. 619 00:30:37,880 --> 00:30:40,240 Speaker 1: Wow. Yeah, I didn't know that part. I know that 620 00:30:40,680 --> 00:30:44,880 Speaker 1: NASA has been using simulants of regolith in order to 621 00:30:45,600 --> 00:30:48,560 Speaker 1: do experimentation about what could or could not be grown 622 00:30:48,640 --> 00:30:50,720 Speaker 1: on the surface of Mars. That part I knew, but 623 00:30:50,760 --> 00:30:55,120 Speaker 1: I didn't know about the potential for really messing up 624 00:30:55,120 --> 00:30:58,480 Speaker 1: your system. Yeah. I agree that, like, the thin atmosphere 625 00:30:58,520 --> 00:31:00,239 Speaker 1: was one of those things that kind of tripped me up. 626 00:31:00,240 --> 00:31:02,760 Speaker 1: Another was, and I'm sure it was explained in the book, 627 00:31:02,840 --> 00:31:04,680 Speaker 1: and it's been a long time since I've read the book.
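Kelly's one-percent point is easy to sanity-check with the dynamic-pressure formula q = ½ρv²: wind load scales with air density, so the same storm-speed wind pushes roughly sixty times harder on Earth than on Mars. A minimal sketch, using rough textbook densities rather than any figures quoted in the episode:

```python
# Back-of-the-envelope check: dynamic pressure q = 0.5 * rho * v^2.
# Densities are rough textbook values, not figures from the episode.
RHO_EARTH = 1.2    # kg/m^3, sea-level air on Earth
RHO_MARS = 0.020   # kg/m^3, Mars surface air (roughly 1-2% of Earth's)

def dynamic_pressure(rho: float, wind_speed: float) -> float:
    """Pressure (Pa) a wind of the given speed (m/s) exerts on a flat surface."""
    return 0.5 * rho * wind_speed ** 2

v = 33.0  # m/s, about 120 km/h -- a severe storm wind
print(f"Earth: {dynamic_pressure(RHO_EARTH, v):.0f} Pa")  # ~650 Pa
print(f"Mars:  {dynamic_pressure(RHO_MARS, v):.0f} Pa")   # ~11 Pa, a light push
```

So a Martian storm can loft fine dust planet-wide while exerting only a gentle shove on a rocket, which is exactly the liberty the story takes.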
628 00:31:04,840 --> 00:31:07,640 Speaker 1: I've seen the movie more recently. But the concept of 629 00:31:07,680 --> 00:31:11,520 Speaker 1: a short term Martian expedition made no sense to me, 630 00:31:11,600 --> 00:31:15,200 Speaker 1: simply because to have the planets line up properly so 631 00:31:15,240 --> 00:31:17,440 Speaker 1: that you can make the journey with the least amount 632 00:31:17,560 --> 00:31:21,760 Speaker 1: of fuel needed to do it, you would, in my understanding, 633 00:31:21,800 --> 00:31:24,320 Speaker 1: need to kind of be ready for like a two 634 00:31:24,480 --> 00:31:28,040 Speaker 1: year stay on the planet before you could try and 635 00:31:28,040 --> 00:31:30,760 Speaker 1: make your trip back home. At least in the film version, 636 00:31:30,840 --> 00:31:32,280 Speaker 1: it felt like it was supposed to be like, oh, 637 00:31:32,320 --> 00:31:34,160 Speaker 1: this is a mission that lasts maybe a month and 638 00:31:34,240 --> 00:31:37,400 Speaker 1: a half and then we head back. And I'm like, well, 639 00:31:37,440 --> 00:31:40,240 Speaker 1: how does that work? How much fuel are you bringing 640 00:31:40,280 --> 00:31:42,640 Speaker 1: with you, or is this a nuclear powered rocket? What's 641 00:31:42,720 --> 00:31:44,280 Speaker 1: going on? So that was one of those things that 642 00:31:44,320 --> 00:31:46,960 Speaker 1: tripped me up, too. But yeah, the windstorm, that was 643 00:31:47,000 --> 00:31:50,040 Speaker 1: another one, where the wind would be enough that you'd need to 644 00:31:50,080 --> 00:31:52,760 Speaker 1: go out and clean the solar panels, clear them of 645 00:31:52,840 --> 00:31:55,040 Speaker 1: dust and all that, but it wouldn't be enough to 646 00:31:55,360 --> 00:32:00,520 Speaker 1: drive a big metal shard through someone's suit into 647 00:32:00,560 --> 00:32:03,520 Speaker 1: their body the way it does near the beginning of 648 00:32:03,560 --> 00:32:04,120 Speaker 1: the story. 649 00:32:04,760 --> 00:32:07,400 Speaker 3: Yeah, or knock over that rocket, which, yeah, the shard was 650 00:32:07,960 --> 00:32:10,840 Speaker 3: part of the rocket. Yeah. And Daniel would be very 651 00:32:10,840 --> 00:32:14,600 Speaker 3: happy that they included cannibalism in there. That was really important. 652 00:32:16,120 --> 00:32:19,840 Speaker 1: Yeah, I'm glad that we can stick on topic. I 653 00:32:19,880 --> 00:32:23,160 Speaker 1: didn't think of any Italian cannibal horror movies that I 654 00:32:23,160 --> 00:32:26,680 Speaker 1: could add into the discussion here. Also, it would be 655 00:32:26,760 --> 00:32:28,880 Speaker 1: terrifying to find out that you had watched those. But 656 00:32:32,360 --> 00:32:34,560 Speaker 1: Daniel and Kelly have a lot more to say as 657 00:32:34,560 --> 00:32:39,360 Speaker 1: we talk all things science and geekiness. And I'm absolutely 658 00:32:39,360 --> 00:32:41,440 Speaker 1: loving every second of this. But first let's take a 659 00:32:41,480 --> 00:32:53,160 Speaker 1: quick break to thank our sponsors. Let me ask you this: 660 00:32:53,320 --> 00:32:55,440 Speaker 1: out of all the topics you've covered so far on 661 00:32:55,480 --> 00:32:58,160 Speaker 1: your show, are there any standouts, things that were 662 00:32:58,320 --> 00:33:02,760 Speaker 1: just either delightful or surprising to you, or that you 663 00:33:02,800 --> 00:33:05,800 Speaker 1: were really eager to sink your teeth into when you 664 00:33:05,840 --> 00:33:07,960 Speaker 1: were going to communicate those to your audience.
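For the launch-window point in the exchange before the break: Earth and Mars return to the same relative alignment once per synodic period, 1/T_syn = 1/T_Earth − 1/T_Mars, which is why minimum-fuel missions face roughly 26-month gaps between windows. A quick sketch with standard orbital periods, not numbers from the episode:

```python
# How often do Earth-Mars transfer windows recur? The synodic period:
# 1/T_syn = 1/T_earth - 1/T_mars (orbital periods in days).
T_EARTH = 365.25  # days, Earth's orbital period
T_MARS = 686.98   # days, Mars's orbital period

t_syn = 1.0 / (1.0 / T_EARTH - 1.0 / T_MARS)
print(f"Windows repeat every ~{t_syn:.0f} days (~{t_syn / 30.44:.0f} months)")
# -> ~780 days, about 26 months: miss the cheap return window and you wait
#    a long time, which is why a weeks-long surface stay implies a far more
#    fuel-hungry trajectory than the minimum-energy one.
```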
665 00:33:08,320 --> 00:33:10,160 Speaker 2: From my point of view, I like to drill down 666 00:33:10,200 --> 00:33:12,680 Speaker 2: into the basics, because I think that people are 667 00:33:12,680 --> 00:33:15,920 Speaker 2: curious about the universe and have pretty simple questions that 668 00:33:15,960 --> 00:33:20,320 Speaker 2: they want answers to, questions like what is space? Or 669 00:33:20,360 --> 00:33:23,680 Speaker 2: how does time work? Or like what is everything made 670 00:33:23,720 --> 00:33:26,160 Speaker 2: out of? And the answer in the end, 671 00:33:26,480 --> 00:33:29,120 Speaker 2: to Kelly's disappointment, is almost always like, well, we know 672 00:33:29,240 --> 00:33:31,840 Speaker 2: up to here, and then we have questions. I think 673 00:33:31,880 --> 00:33:33,920 Speaker 2: it's worthwhile to bring people up to that point of 674 00:33:34,040 --> 00:33:37,320 Speaker 2: understanding of these really basic questions that remain, because I 675 00:33:37,360 --> 00:33:39,520 Speaker 2: think a lot of times people get the impression that 676 00:33:39,560 --> 00:33:42,160 Speaker 2: science has progressed so far and we've basically figured it out. 677 00:33:42,240 --> 00:33:44,640 Speaker 2: It's just like a couple of details remaining. But there are 678 00:33:44,720 --> 00:33:48,560 Speaker 2: very basic, vast questions about the universe we don't 679 00:33:48,640 --> 00:33:51,960 Speaker 2: have simple answers to, and I think it's totally fun 680 00:33:52,040 --> 00:33:54,480 Speaker 2: and very worthwhile to explain those to people in a 681 00:33:54,480 --> 00:33:55,720 Speaker 2: way they can really get. 682 00:33:55,960 --> 00:33:57,520 Speaker 3: I think it's hard to pick a favorite, you know, 683 00:33:57,560 --> 00:34:01,240 Speaker 3: like it's hard to pick a favorite child. We've interviewed, 684 00:34:01,320 --> 00:34:04,479 Speaker 3: these episodes haven't come out yet, but we interviewed James 685 00:34:04,520 --> 00:34:06,880 Speaker 3: S. A. Corey, so the guys who wrote The Expanse, and 686 00:34:07,200 --> 00:34:09,839 Speaker 3: Mary Roach, and I love all of those people, and 687 00:34:09,960 --> 00:34:13,120 Speaker 3: podcasting is a great opportunity to reach out to amazing 688 00:34:13,160 --> 00:34:15,279 Speaker 3: people and ask them to talk to you for an hour. 689 00:34:15,520 --> 00:34:17,960 Speaker 3: So that was great. But I also have really 690 00:34:18,120 --> 00:34:21,120 Speaker 3: enjoyed the episodes on space and time, because Daniel is 691 00:34:21,160 --> 00:34:23,840 Speaker 3: really good at explaining things and I really love learning 692 00:34:23,880 --> 00:34:26,640 Speaker 3: about stuff that I don't already know. And physics. I 693 00:34:26,680 --> 00:34:29,680 Speaker 3: took physics in college. I did fine, but I definitely 694 00:34:29,719 --> 00:34:31,120 Speaker 3: studied to get an A on the test and then 695 00:34:31,160 --> 00:34:34,920 Speaker 3: forgot it immediately afterwards. Yeah, and I know, sorry, there's 696 00:34:34,920 --> 00:34:38,000 Speaker 3: a lot of stuff I didn't understand, and it feels 697 00:34:38,000 --> 00:34:40,759 Speaker 3: good to have like professionals say, okay, we've reached the 698 00:34:40,880 --> 00:34:44,040 Speaker 3: end of what we understand, and be like, oh okay, 699 00:34:44,080 --> 00:34:46,640 Speaker 3: I can follow it up until the edge. And like, anyway, 700 00:34:46,640 --> 00:34:47,959 Speaker 3: I've learned a lot and it's been fun.
701 00:34:48,040 --> 00:34:51,800 Speaker 1: That's cool. Yeah. So both my parents are science fiction authors, 702 00:34:52,120 --> 00:34:55,239 Speaker 1: so yeah, so I grew up going to science fiction 703 00:34:55,320 --> 00:34:59,839 Speaker 1: conventions, being around other authors as well as science 704 00:34:59,840 --> 00:35:02,080 Speaker 1: communicators and stuff. I had a really fortunate childhood 705 00:35:02,080 --> 00:35:04,000 Speaker 1: in that way, and I keep thinking one of my 706 00:35:04,040 --> 00:35:07,240 Speaker 1: missed opportunities is I never had my dad on the podcast. 707 00:35:07,239 --> 00:35:08,960 Speaker 1: I could still ask him if he'd want to do it, 708 00:35:09,800 --> 00:35:13,280 Speaker 1: because I would love to have a conversation about things 709 00:35:13,280 --> 00:35:17,720 Speaker 1: that science fiction authors predicted that ultimately came to pass. 710 00:35:17,960 --> 00:35:20,560 Speaker 1: So Arthur C. Clarke is obviously a big example with 711 00:35:20,640 --> 00:35:24,360 Speaker 1: geostationary satellites, and you know, you have Isaac Asimov and 712 00:35:24,400 --> 00:35:28,359 Speaker 1: the laws of robotics and things of that nature. My dad, famously, 713 00:35:28,480 --> 00:35:32,200 Speaker 1: in my mind anyway, made an incredible prediction. He predicted 714 00:35:32,200 --> 00:35:33,600 Speaker 1: the existence of Teddy Ruxpin. 715 00:35:34,280 --> 00:35:37,320 Speaker 3: Oh nice. Yeah, I loved Teddy Ruxpin. 716 00:35:37,480 --> 00:35:42,800 Speaker 1: And he also predicted commercials that do not, upon first glance, 717 00:35:42,880 --> 00:35:46,480 Speaker 1: tell you what the heck they're advertising. He called it 718 00:35:46,560 --> 00:35:52,080 Speaker 1: the oblique sell, but that was in his fantasy novel. Anyway, 719 00:35:52,400 --> 00:35:55,040 Speaker 1: your discussion there just made me think 720 00:35:55,080 --> 00:35:58,399 Speaker 1: about that as well. I love how people can take 721 00:35:58,440 --> 00:36:02,359 Speaker 1: inspiration from not just the observations they make, but from 722 00:36:02,360 --> 00:36:05,160 Speaker 1: fiction as well, and say, like, well, there's no 723 00:36:05,239 --> 00:36:07,799 Speaker 1: reason why that shouldn't work if we just figure some 724 00:36:07,880 --> 00:36:10,920 Speaker 1: things out and then work toward that. Now, obviously it 725 00:36:10,920 --> 00:36:13,719 Speaker 1: doesn't work for everything. Like, my own personal opinion with 726 00:36:13,760 --> 00:36:15,800 Speaker 1: Star Trek is that if you were to use a transporter, 727 00:36:15,920 --> 00:36:18,400 Speaker 1: you're committing suicide and you're just creating a copy of 728 00:36:18,400 --> 00:36:21,359 Speaker 1: yourself somewhere else. And that's not really you. It's someone 729 00:36:21,400 --> 00:36:23,960 Speaker 1: else who has all your thoughts and beliefs and everything, 730 00:36:24,000 --> 00:36:28,239 Speaker 1: but you, you're dead, and that's a copy. And 731 00:36:28,600 --> 00:36:30,480 Speaker 1: so I'm a Bones McCoy kind of guy when it 732 00:36:30,520 --> 00:36:32,239 Speaker 1: comes to the transporter. 733 00:36:32,800 --> 00:36:35,080 Speaker 3: I do ask that question every time we interview a 734 00:36:35,120 --> 00:36:36,399 Speaker 3: sci fi person on our show. 735 00:36:36,960 --> 00:36:40,480 Speaker 1: Oh, is the transporter just killing you? Yeah? Yeah, I'm 736 00:36:40,520 --> 00:36:42,560 Speaker 1: fully on board with the transporters killing you.
737 00:36:42,719 --> 00:36:45,440 Speaker 2: But would you step into one knowing that it's 738 00:36:45,440 --> 00:36:46,680 Speaker 2: going to kill you and then it's going to be 739 00:36:46,680 --> 00:36:48,240 Speaker 2: another version of you on the other side? 740 00:36:48,440 --> 00:36:53,400 Speaker 1: No, I'm too egotistical. I require this version of my 741 00:36:53,560 --> 00:36:57,000 Speaker 1: ego to continue, and not a perfect copy. First of all, 742 00:36:57,120 --> 00:36:58,759 Speaker 1: I don't want there to be a perfect copy of 743 00:36:58,760 --> 00:37:02,080 Speaker 1: me out there. I've got enough competition out there as 744 00:37:02,120 --> 00:37:06,360 Speaker 1: it is. No, I don't think I could. Now, maybe 745 00:37:06,400 --> 00:37:10,160 Speaker 1: if we were talking about distances that are beyond what 746 00:37:10,160 --> 00:37:14,080 Speaker 1: we can conveniently travel today, like, yeah, you 747 00:37:14,239 --> 00:37:17,720 Speaker 1: can hop on over, we've colonized a planet that's, 748 00:37:18,080 --> 00:37:21,279 Speaker 1: you know, twenty light years away, there's no way you're 749 00:37:21,280 --> 00:37:25,279 Speaker 1: going to get there otherwise, then, like, well, I had a good 750 00:37:25,360 --> 00:37:28,799 Speaker 1: run here, Jonathan two point zero can take it from 751 00:37:28,840 --> 00:37:33,240 Speaker 1: here and make a new life on Earth Two or whatever. 752 00:37:34,360 --> 00:37:38,200 Speaker 1: Are there any particular, you know, sources 753 00:37:38,239 --> 00:37:40,920 Speaker 1: of inspiration that got you both, or either of you, 754 00:37:42,120 --> 00:37:45,080 Speaker 1: interested in pursuing a career in science? Like, was there 755 00:37:45,360 --> 00:37:48,560 Speaker 1: someone or something where, you know, you thought, well, 756 00:37:48,600 --> 00:37:51,279 Speaker 1: this is what I want to do, 757 00:37:51,719 --> 00:37:54,600 Speaker 1: whether it was directly in your field or maybe even 758 00:37:54,640 --> 00:37:56,080 Speaker 1: an unrelated field of science? 759 00:37:56,400 --> 00:37:59,440 Speaker 2: Well, like 760 00:37:59,480 --> 00:38:02,120 Speaker 2: many people, I was inspired by my high school physics teacher. 761 00:38:02,520 --> 00:38:04,359 Speaker 2: I grew up in Los Alamos, New Mexico, so there's 762 00:38:04,400 --> 00:38:07,840 Speaker 2: lots of physicists around, and one of them came to 763 00:38:07,880 --> 00:38:10,160 Speaker 2: teach our AP physics class in the morning before he 764 00:38:10,239 --> 00:38:12,520 Speaker 2: went to work at the lab, and he seemed to 765 00:38:12,560 --> 00:38:13,960 Speaker 2: be having a lot of fun and he was a 766 00:38:14,000 --> 00:38:16,600 Speaker 2: great teacher. And you know, I just want to say 767 00:38:16,640 --> 00:38:19,319 Speaker 2: thank you to all the science teachers out there in 768 00:38:19,360 --> 00:38:22,520 Speaker 2: high school working on the front lines, because they're literally 769 00:38:22,600 --> 00:38:26,480 Speaker 2: responsible for creating the next generation of scientists. A huge fraction 770 00:38:26,560 --> 00:38:29,040 Speaker 2: of the scientists I know were inspired to enter their 771 00:38:29,040 --> 00:38:32,239 Speaker 2: field, chemistry or biology or whatever, because of their high 772 00:38:32,280 --> 00:38:35,480 Speaker 2: school science teacher. It's such an important job.
773 00:38:35,920 --> 00:38:40,480 Speaker 3: I really liked the show ER, and so I really 774 00:38:40,480 --> 00:38:42,560 Speaker 3: wanted to be an ER surgeon. And then I talked 775 00:38:42,560 --> 00:38:44,600 Speaker 3: to an ER surgeon who was like, actually, most of 776 00:38:44,640 --> 00:38:47,160 Speaker 3: your day is really boring. It's nowhere near as exciting. 777 00:38:47,200 --> 00:38:48,800 Speaker 3: You're like waiting for someone to come in so you 778 00:38:48,840 --> 00:38:50,880 Speaker 3: can like stitch their hand because they like cut it 779 00:38:50,920 --> 00:38:52,520 Speaker 3: on the side of their car or something. And then 780 00:38:52,560 --> 00:38:55,040 Speaker 3: I thought maybe I'd be a veterinarian, but then I'd 781 00:38:55,080 --> 00:38:59,839 Speaker 3: still have to deal with people, and, it turns out, 782 00:39:00,680 --> 00:39:04,440 Speaker 3: I feel okay about people. But then, like, 783 00:39:04,480 --> 00:39:06,640 Speaker 3: I had an overwhelming course load, and so I was like, 784 00:39:06,640 --> 00:39:08,879 Speaker 3: I'm gonna take ecology as an easy A, because it's 785 00:39:08,920 --> 00:39:12,160 Speaker 3: like hippies playing in the woods. And it turned out 786 00:39:12,400 --> 00:39:15,520 Speaker 3: that they were using all of these like elegant equations 787 00:39:15,560 --> 00:39:19,160 Speaker 3: to describe the behavior of the animals interacting with each other, 788 00:39:19,200 --> 00:39:21,799 Speaker 3: and I was like, this is actually rigorous, and these 789 00:39:21,840 --> 00:39:24,920 Speaker 3: people are spending like their whole lives outside collecting these 790 00:39:25,000 --> 00:39:27,440 Speaker 3: data and then they get to think about what's happening outside. 791 00:39:27,680 --> 00:39:30,040 Speaker 3: And like, I fell in love with ecology in that class, 792 00:39:30,040 --> 00:39:34,799 Speaker 3: and so that easy A totally like derailed my financial 793 00:39:35,080 --> 00:39:37,520 Speaker 3: future and I went into science instead. 794 00:39:37,880 --> 00:39:40,560 Speaker 2: But you became a rigorous hippie in the woods. 795 00:39:40,800 --> 00:39:42,600 Speaker 3: I did. Man, I love it out here. But 796 00:39:42,640 --> 00:39:44,000 Speaker 3: I thought that I was going to be 797 00:39:44,040 --> 00:39:47,040 Speaker 3: studying like lions in the Serengeti, and I applied to 798 00:39:47,080 --> 00:39:49,600 Speaker 3: all of those labs for grad school, and then I 799 00:39:49,680 --> 00:39:52,960 Speaker 3: ended up studying like their parasites, whatever's happening in their guts. 800 00:39:53,080 --> 00:39:55,800 Speaker 3: Turns out that's what I'm most interested in. So I 801 00:39:56,320 --> 00:39:57,120 Speaker 3: like the creepy stuff. 802 00:39:57,120 --> 00:39:59,080 Speaker 2: I guess that's the edge of all knowledge, right? That's 803 00:39:59,080 --> 00:40:00,720 Speaker 2: the biological dark matter, for sure. 804 00:40:00,960 --> 00:40:01,600 Speaker 3: Yeah, there you go. 805 00:40:01,760 --> 00:40:04,719 Speaker 1: Yeah, I love it. I love these stories. 806 00:40:05,160 --> 00:40:07,680 Speaker 3: What about you? How did you go from Shakespeare to 807 00:40:08,000 --> 00:40:09,000 Speaker 3: podcasting about tech? 808 00:40:09,040 --> 00:40:12,200 Speaker 1: Okay. Yeah, that's a great question.
For the first several 809 00:40:12,280 --> 00:40:15,319 Speaker 1: years of my professional career, I actually worked at a 810 00:40:15,440 --> 00:40:18,799 Speaker 1: human resources management consulting firm. So I like to say 811 00:40:18,840 --> 00:40:23,720 Speaker 1: I worked for the Bobs in Office Space, the two Bobs. Yeah. 812 00:40:23,760 --> 00:40:26,920 Speaker 1: More often than not, I was writing up reports about 813 00:40:27,080 --> 00:40:32,120 Speaker 1: why our consultants were recommending that a company lay off ten 814 00:40:32,160 --> 00:40:35,160 Speaker 1: percent of their employees. It was very demoralizing. I was 815 00:40:35,200 --> 00:40:37,200 Speaker 1: good at what I did. I hated my job. I 816 00:40:37,200 --> 00:40:40,200 Speaker 1: loved the people I worked with, but I hated my job. 817 00:40:40,400 --> 00:40:43,200 Speaker 1: But I was good at it, and so I didn't 818 00:40:43,239 --> 00:40:45,359 Speaker 1: want to leave it. One of the consultants I worked for, 819 00:40:45,560 --> 00:40:48,399 Speaker 1: his name is Frank Casagrande, or Frankie Big House 820 00:40:48,440 --> 00:40:51,359 Speaker 1: as I used to call him, he's 821 00:40:51,360 --> 00:40:54,000 Speaker 1: from New Jersey, came to me. He's like, John, John, 822 00:40:54,600 --> 00:40:56,719 Speaker 1: this isn't your passion, John. You've got to get out 823 00:40:56,760 --> 00:41:01,600 Speaker 1: of here. You're killing yourself. You're meant for bigger things. And 824 00:41:01,680 --> 00:41:05,239 Speaker 1: so it really sunk in. But I was so comfortable 825 00:41:05,520 --> 00:41:08,799 Speaker 1: that I did not leave that job until they got 826 00:41:08,840 --> 00:41:11,000 Speaker 1: rid of the position I was in. So 827 00:41:11,040 --> 00:41:13,600 Speaker 1: I wasn't fired. I just didn't have a job anymore. 828 00:41:14,480 --> 00:41:16,800 Speaker 1: The distinction was lost on me, I will be honest 829 00:41:16,840 --> 00:41:20,600 Speaker 1: with you. So then for six months, I'm out 830 00:41:20,640 --> 00:41:22,680 Speaker 1: of work, trying to get a job, and I try 831 00:41:23,040 --> 00:41:26,720 Speaker 1: applying for writing gigs at different places around Atlanta, including 832 00:41:26,760 --> 00:41:30,719 Speaker 1: places like Cartoon Network and Turner and CNN, 833 00:41:30,800 --> 00:41:34,000 Speaker 1: which are all the same company, but whatever. And ultimately 834 00:41:34,080 --> 00:41:36,040 Speaker 1: none of those pan out. I took another job at 835 00:41:36,040 --> 00:41:39,600 Speaker 1: a different consulting firm, which was soul crushing, and six 836 00:41:39,640 --> 00:41:42,399 Speaker 1: months after that, one of the jobs I 837 00:41:42,520 --> 00:41:45,120 Speaker 1: had interviewed for but had not heard back from 838 00:41:45,440 --> 00:41:48,680 Speaker 1: gets back to me, and it was HowStuffWorks dot com. 839 00:41:49,080 --> 00:41:51,440 Speaker 1: So I come in, I interview with HowStuffWorks dot com, 840 00:41:52,239 --> 00:41:54,560 Speaker 1: I get the job. I become one of two staff 841 00:41:54,600 --> 00:41:57,520 Speaker 1: writers at HowStuffWorks dot com, the other one Tracy Wilson, 842 00:41:57,520 --> 00:41:59,920 Speaker 1: who is currently a host of Stuff You Missed in 843 00:42:00,080 --> 00:42:02,880 Speaker 1: History Class. That's where the Stuff in Tech Stuff comes from. 844 00:42:03,120 --> 00:42:08,440 Speaker 1: So I start writing articles.
And my background in English 845 00:42:08,480 --> 00:42:12,319 Speaker 1: literature involved lots of research and lots of writing, and 846 00:42:12,400 --> 00:42:14,480 Speaker 1: so I was just applying that same skill set. But 847 00:42:14,560 --> 00:42:18,560 Speaker 1: now I got a chance to explore topics that I 848 00:42:18,600 --> 00:42:21,080 Speaker 1: often knew very little about. So I got to learn, 849 00:42:21,200 --> 00:42:23,840 Speaker 1: and it was so exciting. It was like being in college, 850 00:42:23,960 --> 00:42:26,239 Speaker 1: but without all the pressures of college and all the 851 00:42:26,320 --> 00:42:30,640 Speaker 1: distractions of college. And I learned all about different things. 852 00:42:31,120 --> 00:42:33,880 Speaker 1: And then it was my job to explain what I 853 00:42:33,920 --> 00:42:37,000 Speaker 1: had learned to an audience and to synthesize that information 854 00:42:37,080 --> 00:42:40,480 Speaker 1: and communicate it outward. And I loved that too. And 855 00:42:40,760 --> 00:42:44,960 Speaker 1: it soon became clear that, unlike the other English majors 856 00:42:45,000 --> 00:42:48,560 Speaker 1: who joined the editorial department of HowStuffWorks dot com, I 857 00:42:48,680 --> 00:42:52,759 Speaker 1: was not afraid of technology. Almost all 858 00:42:52,760 --> 00:42:55,319 Speaker 1: of us had attended the University of Georgia, almost all 859 00:42:55,360 --> 00:42:57,880 Speaker 1: of us had a degree in some form of English, 860 00:42:58,040 --> 00:43:01,040 Speaker 1: whether it was English Lit or English whatever. But they 861 00:43:01,040 --> 00:43:03,720 Speaker 1: were all scared of tackling tech, and I wasn't, because, 862 00:43:03,960 --> 00:43:06,520 Speaker 1: as I described it to them, tech either works or 863 00:43:06,560 --> 00:43:09,759 Speaker 1: it doesn't. It gets pretty simple when you get 864 00:43:09,800 --> 00:43:12,120 Speaker 1: to the macro level. It's when you dive into the 865 00:43:12,480 --> 00:43:14,880 Speaker 1: particulars that it can get a little intimidating. But I 866 00:43:14,960 --> 00:43:18,800 Speaker 1: found that those skills of being able to research a topic, 867 00:43:19,200 --> 00:43:23,799 Speaker 1: to find an understanding, and then to communicate that understanding 868 00:43:23,840 --> 00:43:27,480 Speaker 1: to an audience, that tapped into a passion I 869 00:43:27,520 --> 00:43:31,480 Speaker 1: didn't even know I had. And so the podcast started 870 00:43:32,000 --> 00:43:35,279 Speaker 1: as an extension of the website and gave me even 871 00:43:35,320 --> 00:43:41,560 Speaker 1: more opportunity to not only communicate these ideas about technology, 872 00:43:41,800 --> 00:43:44,120 Speaker 1: not just how it works, but how it impacts us 873 00:43:44,160 --> 00:43:47,760 Speaker 1: and how we change the tech. It also, most importantly, 874 00:43:47,760 --> 00:43:53,239 Speaker 1: gave me an opportunity to make terrible puns. That's all 875 00:43:53,280 --> 00:43:56,680 Speaker 1: we really want, yeah: getting paid to make terrible 876 00:43:56,760 --> 00:43:59,560 Speaker 1: puns. And if you position them 877 00:43:59,680 --> 00:44:02,840 Speaker 1: just right, you know, they had to hear it, and that's 878 00:44:02,880 --> 00:44:03,719 Speaker 1: the best. 879 00:44:05,120 --> 00:44:05,400 Speaker 3: Feeling. 880 00:44:05,560 --> 00:44:08,080 Speaker 1: Yeah, it is. It's great.
The only 881 00:44:08,120 --> 00:44:10,759 Speaker 1: thing I regret is that podcasting is a one way 882 00:44:10,840 --> 00:44:13,759 Speaker 1: communication medium and I can't hear the groans on the 883 00:44:13,800 --> 00:44:17,360 Speaker 1: other side, which is, you know, what really nourishes 884 00:44:17,400 --> 00:44:20,080 Speaker 1: the soul, in my opinion. But yeah, that's how it 885 00:44:20,080 --> 00:44:22,439 Speaker 1: all happened for me, and I have always had great 886 00:44:22,440 --> 00:44:26,719 Speaker 1: admiration for not just scientists, but science communicators who take 887 00:44:26,760 --> 00:44:31,560 Speaker 1: it very seriously, you know, learning and communicating and sort 888 00:44:31,560 --> 00:44:35,080 Speaker 1: of imbuing that communication with the passion they have for 889 00:44:35,360 --> 00:44:37,880 Speaker 1: the process of learning. Like, I think that's really what 890 00:44:37,920 --> 00:44:39,960 Speaker 1: it boils down to for me, is that I love 891 00:44:40,040 --> 00:44:42,680 Speaker 1: to learn. And I feel like for a lot of scientists 892 00:44:42,719 --> 00:44:46,160 Speaker 1: that's kind of their core principle too. They just have 893 00:44:46,280 --> 00:44:48,160 Speaker 1: this love of learning. As you said at the beginning, 894 00:44:48,480 --> 00:44:52,920 Speaker 1: curiosity is one of those guiding principles, and I 895 00:44:53,560 --> 00:44:57,040 Speaker 1: genuinely think that in order to live a fulfilling life, 896 00:44:57,120 --> 00:44:59,440 Speaker 1: you do have to maintain a sense of curiosity throughout. 897 00:44:59,480 --> 00:45:01,560 Speaker 3: It sure helps. 898 00:45:01,719 --> 00:45:03,520 Speaker 2: And it's what powers all of science. 899 00:45:03,560 --> 00:45:03,759 Speaker 3: You know. 900 00:45:03,840 --> 00:45:06,680 Speaker 2: The reason that we get millions or billions of dollars 901 00:45:06,719 --> 00:45:08,759 Speaker 2: to build a collider is because people want to know 902 00:45:08,800 --> 00:45:11,279 Speaker 2: the answer to these questions, you know. And so I 903 00:45:11,280 --> 00:45:13,799 Speaker 2: think we have an obligation not just to do the 904 00:45:13,840 --> 00:45:15,960 Speaker 2: science and to try to understand the universe, but to 905 00:45:16,040 --> 00:45:19,440 Speaker 2: share the answers with everybody, because everybody deserves to know 906 00:45:19,480 --> 00:45:27,240 Speaker 2: what we do and don't know about particles and cannibals. 907 00:45:25,040 --> 00:45:30,359 Speaker 1: Yeah. I think 908 00:45:30,400 --> 00:45:32,680 Speaker 1: one of the most distressing things I encounter is when 909 00:45:32,719 --> 00:45:35,640 Speaker 1: someone says, well, what good is that going to do me 910 00:45:35,800 --> 00:45:38,080 Speaker 1: if you learn that? I'm like, oh my gosh, it's 911 00:45:38,120 --> 00:45:41,160 Speaker 1: like I'm talking to a different kind of creature, because 912 00:45:41,239 --> 00:45:45,560 Speaker 1: I can't understand why the act of knowing isn't exciting 913 00:45:45,719 --> 00:45:48,920 Speaker 1: enough all on its own. Why you have to be like, okay, 914 00:45:48,920 --> 00:45:50,880 Speaker 1: but is that going to make my cell phone faster? Like, 915 00:45:51,200 --> 00:45:53,000 Speaker 1: first of all, we don't know what it'll do. 916 00:45:53,560 --> 00:45:54,160 Speaker 3: That's yeah.
917 00:45:54,360 --> 00:45:57,359 Speaker 1: Like, if you look back at all the technological advancements 918 00:45:57,360 --> 00:46:01,200 Speaker 1: that have been possible over the years, we didn't anticipate 919 00:46:01,640 --> 00:46:03,239 Speaker 1: most of those, because we had no way of knowing. 920 00:46:03,280 --> 00:46:06,600 Speaker 1: I mean, certainly miniaturization alone was something no one had 921 00:46:06,640 --> 00:46:09,520 Speaker 1: thought about. Back in the twenties and thirties, everyone thought 922 00:46:09,760 --> 00:46:13,240 Speaker 1: computers were going to be the size of skyscrapers, because 923 00:46:13,280 --> 00:46:18,000 Speaker 1: they had to be. So, like, don't worry about, you know, 924 00:46:18,120 --> 00:46:23,040 Speaker 1: a specific benefit that you get beyond knowing. Knowing itself 925 00:46:23,080 --> 00:46:25,600 Speaker 1: is a benefit. And then on top of that, who 926 00:46:25,760 --> 00:46:28,360 Speaker 1: knows what else we'll be able to leverage from the 927 00:46:28,440 --> 00:46:33,120 Speaker 1: knowledge we uncover. So, amen. Yeah, yeah. So, gosh darn it, 928 00:46:33,280 --> 00:46:39,080 Speaker 1: fund science. Yes, please. It's an easy and obvious investment 929 00:46:39,239 --> 00:46:40,880 Speaker 1: in humanity and in our future. 930 00:46:40,960 --> 00:46:43,520 Speaker 2: I don't understand why it's not a bipartisan thing, you know. 931 00:46:43,600 --> 00:46:48,120 Speaker 1: Yeah, neither do I. I 932 00:46:48,160 --> 00:46:51,400 Speaker 1: don't get why people will portray science as somehow 933 00:46:51,440 --> 00:46:56,880 Speaker 1: having a political bias. That's not the way science works. 934 00:46:57,520 --> 00:46:57,640 Speaker 3: Uh. 935 00:46:58,760 --> 00:47:02,120 Speaker 1: Daniel and Kelly, thank you so much for joining Tech Stuff. 936 00:47:02,120 --> 00:47:03,640 Speaker 1: This has been a lot of fun. I know it's 937 00:47:03,800 --> 00:47:07,640 Speaker 1: kind of an unhinged whirlwind episode, but I mean, I'm 938 00:47:07,640 --> 00:47:09,920 Speaker 1: coming up to the end of my run, kind of 939 00:47:09,960 --> 00:47:13,240 Speaker 1: like all the safety measures are off as far as I'm concerned. 940 00:47:13,080 --> 00:47:16,600 Speaker 3: The rails are off. That's the most fun. Thank you 941 00:47:16,680 --> 00:47:18,239 Speaker 3: so much for having us on the show. This was 942 00:47:18,239 --> 00:47:18,760 Speaker 3: a blast. 943 00:47:18,960 --> 00:47:19,960 Speaker 2: Yeah, thank you very much. 944 00:47:20,040 --> 00:47:22,920 Speaker 1: I recommend everyone go check out the podcast. You have 945 00:47:23,000 --> 00:47:25,279 Speaker 1: to listen, especially if you know you're sitting there and 946 00:47:25,320 --> 00:47:28,000 Speaker 1: curious about which animals will eat their own kind, because 947 00:47:28,040 --> 00:47:31,719 Speaker 1: you'll learn. You will learn some things you cannot unlearn. 948 00:47:32,840 --> 00:47:36,799 Speaker 1: You're welcome. Thanks again, and I hope you have a 949 00:47:36,840 --> 00:47:37,359 Speaker 1: great day. 950 00:47:37,560 --> 00:47:40,080 Speaker 3: Thanks, you too. Thanks very much. 951 00:47:40,120 --> 00:47:42,719 Speaker 1: Well, I hope all of you enjoyed that conversation I had 952 00:47:42,719 --> 00:47:45,680 Speaker 1: with Daniel and Kelly of Daniel and Kelly's Extraordinary Universe. 953 00:47:45,680 --> 00:47:47,839 Speaker 1: You should definitely check out that podcast.
Like I said, 954 00:47:47,880 --> 00:47:51,480 Speaker 1: it's fascinating stuff. They tackle all things science. They've had 955 00:47:51,520 --> 00:47:53,960 Speaker 1: some great guests on as well to talk about stuff 956 00:47:54,000 --> 00:47:58,560 Speaker 1: like quantum computing. They really take these complicated topics and 957 00:47:58,600 --> 00:48:01,399 Speaker 1: break them down in ways that are really accessible and 958 00:48:01,480 --> 00:48:04,440 Speaker 1: fun and exciting. And it's the sort of stuff I 959 00:48:04,480 --> 00:48:07,680 Speaker 1: always aspire to do. I hope that on occasion I 960 00:48:07,880 --> 00:48:11,200 Speaker 1: achieve that. Based upon some of the wonderful messages I've 961 00:48:11,200 --> 00:48:14,760 Speaker 1: been getting from listeners who have reached out to express 962 00:48:14,800 --> 00:48:17,480 Speaker 1: how much they appreciate the show, it feels like 963 00:48:17,560 --> 00:48:19,960 Speaker 1: once in a while I hit that target, which is 964 00:48:20,000 --> 00:48:22,719 Speaker 1: a great feeling. I hope all of you out there 965 00:48:22,840 --> 00:48:26,120 Speaker 1: are doing well. If you are here in the United States, 966 00:48:26,640 --> 00:48:30,320 Speaker 1: Happy Thanksgiving. I hope you have a wonderful, safe time 967 00:48:30,400 --> 00:48:34,040 Speaker 1: with your loved ones nearby. For everyone else around the world, 968 00:48:34,320 --> 00:48:37,560 Speaker 1: have a great week. You know, if you want to 969 00:48:37,600 --> 00:48:41,440 Speaker 1: participate in Thanksgiving on kind of a metaphorical level, just 970 00:48:41,719 --> 00:48:44,439 Speaker 1: pour some gravy down your throat and be thankful. That's 971 00:48:44,480 --> 00:48:48,359 Speaker 1: what I do every year. Tari can attest; she says 972 00:48:48,400 --> 00:48:50,480 Speaker 1: it's disturbing and that if I do it again in 973 00:48:50,520 --> 00:48:53,960 Speaker 1: front of her, she's going to HR. Well, have a 974 00:48:54,000 --> 00:48:57,319 Speaker 1: great week, and I'll talk to you again really soon. 975 00:49:03,440 --> 00:49:08,080 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 976 00:49:08,400 --> 00:49:12,120 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 977 00:49:12,160 --> 00:49:13,240 Speaker 1: to your favorite shows.