[00:00:00] Speaker 1: Fascinating. It's One More Thing. I'm... one more thing. Two notes from the world of science fiction. Jack, I know you love that genre of entertainment slash literature.

[00:00:17] Speaker 2: I am not a science fiction person.

[00:00:18] Speaker 1: I don't know why. I just don't know. I don't know why either. It's great. It's some of the wisest, best stuff. You see, you change the setting... Katie, what did... did your dog break wind? Why do you have that look on your face?

[00:00:32] Speaker 2: It feels the same way I do about science fiction.

[00:00:36] Speaker 1: I work with philistines. No, it changed... all right, you know what, I'm not going to try to explain Beethoven to a couple of basset hounds. No offense. Funny. Now, you either like it or you don't, but somebody smarter... I'll bet Tim Sandefur could wax incredibly eloquent on why science fiction is so good: because it changes enough frames of reference that you can isolate various aspects of human behavior in a way that highlights them, in a way that's difficult with other fiction. Anyway, fascinating, exactly. I have no blanking idea.
[00:01:18] Speaker 1: This is note from science fiction number one. Why this popped into my head: I'm wandering off to the bathroom at the radio ranch, it must have been. I was thinking about something that was really interesting, and I heard Spock's voice in my head saying, fascinating, fascinating. And you know what occurred to me? Tell me about Mister Spock, Michael. I'm putting you on the spot here. What is the most distinctive thing about the character Mister Spock?

[00:01:47] Speaker 1: It was his ears?

[00:01:48] Speaker 2: No, no. You know what, that's not a bad assessment. That's a perfect answer if you're gonna look at him, but not physically. How about his personality? What was most notable? Class? Anyone?

[00:02:01] Speaker 1: That's right.

[00:02:02] Speaker 2: Well, I'm interested too. Michael, you don't have to answer. Have you never seen Star Trek? Here and there, but not really? He was emotionless, right? Exactly. Have you seen Star Trek, Katie?

[00:02:13] Speaker 1: When I was a kid, the original Trek, my dad watched a bit of it and I was forced... oh, I was. I was a big fan.
[00:02:22] Speaker 1: And then I got to college, and a bunch of buddies of mine and I would get baked every Sunday night after our, like, homework was done and stuff, like, the studying, and we would get baked and watch Star Trek.

[00:02:36] Speaker 2: Guys, one episode or multiple? No, just one, just one, over and over again.

[00:02:40] Speaker 1: There was one of the TV stations that aired Star Trek at nine o'clock Sunday nights.

[00:02:43] Speaker 2: What a different era, in that you would now, you know, be able to binge the whole series in one weekend.

[00:02:48] Speaker 1: Right, exactly. But oh, we had ground rules: there's no saving seats. If you got up to hit the john, the next guy is going to jump into that good seat, and you come back and, you know, shuffling your feet, lose your seat. There was etiquette. It was very formalized. Anyway, it occurred to me: Spock saying "fascinating," that's an emotional reaction. His catchphrase contradicted his very essence. If you are robot-like and feel no emotion, you're not gonna be fascinated by a problem, no, or an issue.

[00:03:26] Speaker 2: It's just more information, but it has no emotional effect on you.

[00:03:29] Speaker 1: Right, exactly.
[00:03:34] Speaker 1: His very catchphrase contradicted his character. I hope Tim Sandefur hears this. I would love to hear his comments someday.

[00:03:38] Speaker 2: Or the episode where he thought Jim was dead, and he was so excited to see Jim that he has a big smile: Jim, you're alive! Briefly breaking character and having emotions.

[00:03:47] Speaker 1: Of course, he was half human, Jack. Oh, that's right, he was half human. Tim Sandefur has written a piece on the evolving point of view and plots of Star Trek and how it parallels American foreign policy that is knockout good.

[00:04:06] Speaker 2: Oh really? Oh yeah, that's interesting. It's not about how William Shatner in some seasons would be fat and in some seasons would be fit?

[00:04:12] Speaker 1: No, although that is worthy of study itself. All right... science fiction note number two. And there are a couple of people who emailed about this. We were talking about gene splicing, gene editing, designer babies.

[00:04:33] Speaker 2: Yeah, and how it's a real thing.
[00:04:35] Speaker 2: This New York Times columnist writing about how she and her husband, with IVF, were given the choice of whether they wanted a boy or a girl, and they decided they didn't like having the power to choose that, but then found out that there are a whole bunch of other things you can choose. And a poll came out that forty percent of Americans said they would make choices in embryos if it would make it more likely that their kid could get into a good school, "whatever." I hate that phrase. I don't even know what that means.

[00:05:02] Speaker 1: Oh yeah, I kind of hate that attitude.

[00:05:05] Speaker 2: Yeah, so, like, make them taller, better looking, smarter. You can choose that, or...

[00:05:09] Speaker 1: Just, quote unquote, happy. Happier.

[00:05:12] Speaker 3: Yeah. You guys ever see that show Black Mirror?

[00:05:14] Speaker 1: No, I've got a note to watch that and I haven't yet.

[00:05:18] Speaker 2: What's the premise?

[00:05:19] Speaker 3: The premise? It's basically way in the future, and it's very tech... like, technology based.
[00:05:28] Speaker 3: So at one point you have people walking around and they have, like, a social status above their heads that you can see, you know, and if you do something that's socially unacceptable, your status goes down. It's just, like, a bunch of really terrible technology scenarios. This sounds like something out of Black Mirror.

[00:05:42] Speaker 1: But did you enjoy that show? I love that. Which is science fiction, you hypocrite.

[00:05:48] Speaker 2: I don't do the whole space aliens thing.

[00:05:50] Speaker 1: I don't know. That's where you draw a line? That's where I draw a line. Yeah, okay. I will watch it, because it sounds really interesting. Yeah, it really is. So anyway, we were talking about this, and I just did that... Mike Judge, who was behind the... we thought it was a comedy, turns out it was a documentary... Idiocracy, one of our mutual favorite movies of all time. It's dumb, but it's smart. Anyway, he needs to make an Idiocracy for genetically engineered humans who are all, you know, good looking, tall, blue-eyed, whatever. Kind of a Stepford Wives baby.

[00:06:21] Speaker 2: And pleasant, or happy.
[00:06:28] Speaker 2: Because, as we mentioned on the show, every writer, musician, artist was unhappy. Has there ever been a happy great guitar player? I don't know if there has been.

[00:06:34] Speaker 1: Right. Whether alienated or lonely, or a misfit who is looking for some purpose to their life. Yeah, yeah. Well, discontent in general drives achievement. Anyway. So a couple of people, including Christy, wrote notes that said, hey, you're talking about the movie Gattaca, which came out, I think somebody said, in nineteen ninety-seven. It's not a comedy, but it's a beautiful film about the not-too-distant future, where people choose traits and those who are born naturally are considered second-class citizens.

[00:07:10] Speaker 2: Oh wow.

[00:07:11] Speaker 1: I'm a genetic counselor and former genetic research scientist, and it is literally referenced in ethical discussions all the time. It's normal to have someone ask the question, isn't that a little too Gattaca? Wow.

[00:07:23] Speaker 2: What would cause the naturally born to be second-class citizens?

[00:07:30] Speaker 1: Well, because they're inferior. They're not all tall and beautiful and perfect. And, well, yeah.
[00:07:35] Speaker 2: Just naturally you would, because you wouldn't be as good looking or as smart.

[00:07:38] Speaker 1: And by the way, Christine in Rancho Cordova, I will, I'll read her note, but she says, uh, I won't bore you with all the details, blah blah blah. No, go ahead, bore us with as many details as you want. We find this all fascinating. And she makes the point that, yeah, we talk about the ethics all the time, but I do think it's just a matter of time, especially because other countries have been dabbling in it anyway. Uh, and I thought, Gattaca, that sounds really familiar. Gattaca, why does that sound so familiar? And I think, I think that's the movie... I've got to retell a story that's been told more than once on the show through the last, well, nineteen ninety-seven, twenty-five years, whatever. I took my kids to see Elf, a good movie, in the theater, and they had a couple of previews. A young Zooey Deschanel. Oh, so cute. Anyway.

[00:08:46] Speaker 2: Katie rolls her eyes. I'm a straight... maybe you were there for an older Ed Asner.

[00:08:53] Speaker 1: Oh, she was so lovable in that movie.
[00:08:55] Speaker 1: She's darling. Anyway. So, a couple of previews, and then another preview, we thought, of a very dark and serious nature. A very dark and serious scene unfolds, and it is, as I recall... oh, what's her name, a beautiful African American actress, yeah, who was sitting in a chair talking to some sort of counselor or detective or something like that. And get the bleepers ready. And, what's the ages on your kids again? This is a theater full of parents and children, ages five through thirteen.

[00:09:37] Speaker 2: Oh yeah, you're seeing Elf.

[00:09:39] Speaker 1: In the theater. And the scene unfolded as follows. Blah blah blah, the detective or counselor or whatever the person was says to the Halle Berry character, so are you telling me you talked to Satan? And the Halle Berry character, I think it was Halle Berry, says, talk to Satan? I fucked him.

[00:10:06] Speaker 4: And the whole theater goes, whoa, what is happening here? And there's a huge uproar, including me: what are you doing? Everybody shouting at the projection booth, and everything stops and the lights go on.
[00:10:26] Speaker 4: In about two or three minutes there's a great deal of buzz in the theater.

[00:10:28] Speaker 1: You can imagine the parents and the kids going, oh my god, what does that mean? What does it mean to fuck Satan? Oh jeez, oh my gosh.

[00:10:39] Speaker 2: Elves come out crying.

[00:10:42] Speaker 1: They come on and say, we'd like to apologize, the projection booth ran the wrong film. We'll have Elf cued up for you in just a moment.

[00:10:51] Speaker 2: If you'd like to sign some paperwork on the way out saying you will not hold us responsible for any future trauma, we will give you a free small popcorn.

[00:10:59] Speaker 1: And I think that movie was Gothika.

[00:11:05] Speaker 4: Yeah.

[00:11:05] Speaker 3: I'm wondering if you guys actually enjoyed treating your customers like a piece of... yeah.

[00:11:10] Speaker 1: Huh? Oh.

[00:11:12] Speaker 3: I can only imagine the parents in there. Oh my god.

[00:11:16] Speaker 2: It is something. You know, since we got an F-bomb in the show already, out to the store, because this happened two days ago.
[00:11:26] Speaker 2: I go to the little local market by my house, a little grocery store, the good-enough grocery store unless you need a whole bunch of stuff, but it's a nice grocery store. It just came under new ownership, and the guy's working really hard to, uh, to please everybody, saying hello to everybody, introducing himself, and everything like that. That's really nice and everything like that. I'm in there the other day and I'm going through, and I said, it looks like it's not gonna rain today, and he said, yeah, it's pretty fucking nice outside. All right.

[00:11:56] Speaker 1: I love that. I thought, what? No, it's not the decline of society. He's just a nice, cheerful guy. He's like sixty-two or something like that.

[00:12:03] Speaker 2: Well, he owns the store, kind of a fat guy, and just very super pleasant, like helping the old ladies to their car. It's pretty fucking nice. Sam told me, he said, you gotta start talking to him that way. It's like, where's the fucking milk?

[00:12:18] Speaker 1: No, no, no. Oh, that's awesome. Well, I guess that's it.
[00:12:28] Speaker 1: I was really hoping you were gonna say, I guess that's fucking it. Me too. Great minds. I thought of it.