1 00:00:01,360 --> 00:00:07,680 Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. 2 00:00:11,119 --> 00:00:13,800 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's Chuck. 3 00:00:13,880 --> 00:00:17,120 Speaker 2: I'm Chucky already. 4 00:00:18,280 --> 00:00:21,119 Speaker 1: You know, you know I thought it most apropos. 5 00:00:25,160 --> 00:00:30,280 Speaker 2: Oh it's my turn to talk. Yeah, oh, well played. 6 00:00:30,600 --> 00:00:32,800 Speaker 2: I'm not very good at this. As you know, if 7 00:00:32,840 --> 00:00:35,479 Speaker 2: you have ever listened to the podcast, and I know 8 00:00:35,560 --> 00:00:38,199 Speaker 2: you have because you're one of the co-hosts, you 9 00:00:38,280 --> 00:00:42,200 Speaker 2: know that I step on you a lot. Now, yep, 10 00:00:42,520 --> 00:00:46,440 Speaker 2: I'll keep going with my turn constructional unit. Then how 11 00:00:46,520 --> 00:00:48,400 Speaker 2: confused are people, do you think, right now? 12 00:00:49,320 --> 00:00:52,440 Speaker 1: I don't know, probably very. I mean we should say 13 00:00:52,440 --> 00:00:55,120 Speaker 1: this is all just a bit to sort of demonstrate 14 00:00:55,880 --> 00:01:01,360 Speaker 1: a conversational analysis. Yeah, well that would demonstrate the analysis. If 15 00:01:01,360 --> 00:01:04,000 Speaker 1: someone was sitting in another room making notes about how 16 00:01:04,080 --> 00:01:06,800 Speaker 1: we were talking like a creep, Yeah, that would be 17 00:01:06,800 --> 00:01:12,600 Speaker 1: a conversational analysis. We were just demonstrating poor communication pretty much. 18 00:01:13,000 --> 00:01:15,759 Speaker 2: But it would be a bonanza for a conversation analyst, 19 00:01:15,959 --> 00:01:19,360 Speaker 2: a CA, as they like to call themselves. This is 20 00:01:19,360 --> 00:01:23,520 Speaker 2: a super super niche field of science.
I guess it 21 00:01:23,520 --> 00:01:26,600 Speaker 2: would be a social science because it branched off from sociology, 22 00:01:27,480 --> 00:01:30,199 Speaker 2: but one of the things that I've noticed about it 23 00:01:30,240 --> 00:01:33,520 Speaker 2: is that people like to try to push it into 24 00:01:34,200 --> 00:01:37,120 Speaker 2: a typical social science, right, like come up with some 25 00:01:37,200 --> 00:01:39,640 Speaker 2: theories like why do people do these things that you 26 00:01:39,680 --> 00:01:44,200 Speaker 2: guys are studying, and conversation analysis says, no, we're not 27 00:01:44,280 --> 00:01:47,960 Speaker 2: going to do that. Instead, we are purely about observation, 28 00:01:48,640 --> 00:01:52,200 Speaker 2: noticing patterns and then figuring out how those patterns predict 29 00:01:52,280 --> 00:01:55,280 Speaker 2: other patterns, and how all these different patterns fit together 30 00:01:55,600 --> 00:02:00,360 Speaker 2: in this grand way to make up conversation. And you 31 00:02:00,440 --> 00:02:02,600 Speaker 2: might say, well, that's pretty boring, wouldn't you, Chuck. 32 00:02:04,320 --> 00:02:05,800 Speaker 1: I mean, I'll let you finish and then I'll give 33 00:02:05,840 --> 00:02:06,240 Speaker 1: you my take. 34 00:02:06,440 --> 00:02:10,320 Speaker 2: Okay, you might say it's pretty boring if you were Chuck. 35 00:02:11,040 --> 00:02:14,040 Speaker 2: The reason why it's interesting is because it reveals something 36 00:02:14,080 --> 00:02:16,480 Speaker 2: about us that conversation is one of those things that 37 00:02:16,480 --> 00:02:21,200 Speaker 2: we're really really good at without realizing what we're doing. 
38 00:02:21,760 --> 00:02:25,440 Speaker 2: That conversation is an amazing interaction between two or more 39 00:02:25,480 --> 00:02:29,400 Speaker 2: people that gets stuff done, that shares information, that you 40 00:02:29,440 --> 00:02:33,800 Speaker 2: can make a case basically our entire human civilization is 41 00:02:33,840 --> 00:02:36,160 Speaker 2: based on the fact that we're able to converse pretty 42 00:02:36,200 --> 00:02:38,760 Speaker 2: much effortlessly, even though a lot of times it 43 00:02:38,919 --> 00:02:40,040 Speaker 2: just does not make sense. 44 00:02:40,960 --> 00:02:44,680 Speaker 1: Yeah, my deal with this is I'll get to this 45 00:02:44,760 --> 00:02:47,360 Speaker 1: episode and then I want to wipe it from my 46 00:02:47,440 --> 00:02:51,280 Speaker 1: memory bank because I'm one of those people that the 47 00:02:51,400 --> 00:02:55,799 Speaker 1: last thing I want to think about is how I'm 48 00:02:55,800 --> 00:02:58,960 Speaker 1: conversing with somebody. And it reminds me of that scene 49 00:02:58,960 --> 00:03:02,600 Speaker 1: in Better Off Dead, when early on John Cusack is 50 00:03:02,639 --> 00:03:05,640 Speaker 1: having like this early flashback, I think to his 51 00:03:05,919 --> 00:03:09,120 Speaker 1: first meeting with his girlfriend, and we're in their 52 00:03:09,240 --> 00:03:11,080 Speaker 1: heads and he's like, Oh, she just touched her nose. 53 00:03:11,639 --> 00:03:13,600 Speaker 1: Does that mean I have something on my face? And 54 00:03:13,639 --> 00:03:15,680 Speaker 1: then she's like, oh, he just touched his face, Do 55 00:03:15,800 --> 00:03:18,040 Speaker 1: I have mustard on my face or something like that? And 56 00:03:18,080 --> 00:03:20,320 Speaker 1: then before you know it, they're just going crazy.
And 57 00:03:20,320 --> 00:03:22,880 Speaker 1: that's kind of what this does to me, is I 58 00:03:22,919 --> 00:03:27,320 Speaker 1: don't want to think about it. Like I'm very much organic 59 00:03:27,320 --> 00:03:29,480 Speaker 1: when it comes to stuff like this, and the last 60 00:03:29,520 --> 00:03:32,839 Speaker 1: thing I want to think about is did I say 61 00:03:32,880 --> 00:03:35,200 Speaker 1: that right? Or did I interrupt somebody? Or was I, 62 00:03:35,200 --> 00:03:37,880 Speaker 1: did I act interested enough? Like that kind of thing? 63 00:03:38,200 --> 00:03:40,280 Speaker 1: Just I have no place for that in my life. 64 00:03:40,840 --> 00:03:43,920 Speaker 2: That's funny because that is almost one hundred percent of 65 00:03:43,960 --> 00:03:46,800 Speaker 2: what goes on in my mind when I'm talking or 66 00:03:46,840 --> 00:03:49,080 Speaker 2: when someone else is talking. Like, I can't help but 67 00:03:49,160 --> 00:03:49,480 Speaker 2: do that. 68 00:03:50,000 --> 00:03:53,200 Speaker 1: I know, and I know that, and I feel for 69 00:03:53,240 --> 00:03:55,680 Speaker 1: you for that because that can't be fun. 70 00:03:55,920 --> 00:04:01,600 Speaker 2: It's really tiring. So Okay, this is like potentially a 71 00:04:01,760 --> 00:04:04,920 Speaker 2: career-ending topic pick that I made. 72 00:04:05,000 --> 00:04:08,520 Speaker 1: Huh? No, no, no, no, no. I just, you know, it's interesting, 73 00:04:08,600 --> 00:04:10,240 Speaker 1: and then I just don't want to ever think about it. 74 00:04:10,320 --> 00:04:12,200 Speaker 2: Okay, we'll do a good job doing that, and I'm 75 00:04:12,200 --> 00:04:15,120 Speaker 2: sorry for even picking it. Well, let's dive into all 76 00:04:15,160 --> 00:04:17,520 Speaker 2: of this because it is interesting in and of itself, 77 00:04:17,560 --> 00:04:20,200 Speaker 2: even though it is a really strange discipline in the 78 00:04:20,200 --> 00:04:21,240 Speaker 2: way that it's set up.
79 00:04:21,839 --> 00:04:25,560 Speaker 1: Yeah, it draws from a couple of fields. Primarily, Livia 80 00:04:25,640 --> 00:04:28,159 Speaker 1: helped us with this one, and I can tell because 81 00:04:28,160 --> 00:04:34,520 Speaker 1: it's awesome. Ethnomethodology, and that is studying 82 00:04:34,520 --> 00:04:36,960 Speaker 1: how people, not just how they make sense of the world, 83 00:04:37,040 --> 00:04:39,240 Speaker 1: but how they do it in relation to others and 84 00:04:39,240 --> 00:04:40,440 Speaker 1: how they collaborate with others. 85 00:04:40,680 --> 00:04:40,920 Speaker 2: Right. 86 00:04:41,520 --> 00:04:46,400 Speaker 1: And then sociolinguistics, which is language but not just language, 87 00:04:46,520 --> 00:04:50,799 Speaker 1: language specifically with like how it relates in specific cultures, 88 00:04:50,800 --> 00:04:53,720 Speaker 1: in the context of different cultures. And there were three 89 00:04:54,040 --> 00:04:59,560 Speaker 1: key players and researchers doing this in the nineteen seventies, 90 00:04:59,560 --> 00:05:04,760 Speaker 1: mainly, yeah, at UCLA, go Bruins. The first one is a 91 00:05:04,800 --> 00:05:08,680 Speaker 1: sociologist named Harvey Sacks, and he seems to have kind 92 00:05:08,680 --> 00:05:12,280 Speaker 1: of been the ringleader here. He's the grandpappy of 93 00:05:12,400 --> 00:05:16,520 Speaker 1: conversation analysis. He started at UCLA in the sixties, then 94 00:05:16,600 --> 00:05:18,640 Speaker 1: really got into this in the mid seventies. 95 00:05:19,760 --> 00:05:22,040 Speaker 2: Yeah, and he stopped in the seventies because the poor 96 00:05:22,040 --> 00:05:24,240 Speaker 2: guy died in a car crash at forty years old 97 00:05:24,279 --> 00:05:27,520 Speaker 2: in nineteen seventy five. And he really only worked on 98 00:05:27,560 --> 00:05:30,760 Speaker 2: this for just over a decade, but he figured it out.
99 00:05:30,800 --> 00:05:34,240 Speaker 2: He laid this down, and part of it was that 100 00:05:34,320 --> 00:05:38,800 Speaker 2: he benefited from working closely with some other sociologists, including 101 00:05:38,800 --> 00:05:43,400 Speaker 2: Erving Goffman, who was the star of our Impression Management episode. 102 00:05:43,960 --> 00:05:45,640 Speaker 2: And I think that might have been where I first 103 00:05:45,680 --> 00:05:49,560 Speaker 2: heard of conversation analysis. And then so he was 104 00:05:49,600 --> 00:05:52,960 Speaker 2: working with Erving Goffman. They weren't doing the same thing, 105 00:05:53,000 --> 00:05:56,120 Speaker 2: but they were both coming from that same strain of sociology, 106 00:05:56,160 --> 00:05:59,560 Speaker 2: which was really transitional at the time, from 107 00:05:59,600 --> 00:06:04,440 Speaker 2: studying huge institutions like religion or government and zooming into 108 00:06:04,480 --> 00:06:08,400 Speaker 2: a much more granular, almost micro interaction level. So that's 109 00:06:08,400 --> 00:06:11,520 Speaker 2: what Goffman was into with impression management. Harvey Sacks was 110 00:06:11,520 --> 00:06:12,680 Speaker 2: into that with conversation. 111 00:06:13,440 --> 00:06:16,320 Speaker 1: Yeah, he didn't publish a lot of stuff. This wasn't 112 00:06:16,400 --> 00:06:19,240 Speaker 1: like white paper, peer reviewed kind of stuff. That was 113 00:06:19,279 --> 00:06:22,160 Speaker 1: mainly like, you know, sort of pre-TED Talk kind 114 00:06:22,200 --> 00:06:26,040 Speaker 1: of thing like, Hey, isn't this interesting. Here are my lectures. 115 00:06:26,080 --> 00:06:27,960 Speaker 1: I'm going to make them available. You can take a 116 00:06:28,000 --> 00:06:31,159 Speaker 1: gander if you want.
Right then, there was a student 117 00:06:31,360 --> 00:06:34,400 Speaker 1: of Sacks named Gail Jefferson there at UCLA who was 118 00:06:34,440 --> 00:06:37,800 Speaker 1: a dance major but had this interesting job. She worked 119 00:06:37,800 --> 00:06:39,880 Speaker 1: as a typist for the Department of Public Health, and 120 00:06:40,120 --> 00:06:45,120 Speaker 1: part of that included transcribing sensitivity training sessions with prison guards, 121 00:06:46,000 --> 00:06:49,920 Speaker 1: and so she got really interested in a very kind 122 00:06:49,920 --> 00:06:52,560 Speaker 1: of key part of conversation with something called turn-taking, when 123 00:06:52,600 --> 00:06:55,200 Speaker 1: you, you know, take turns talking and a lot of 124 00:06:55,200 --> 00:06:58,080 Speaker 1: this will be about like the cues that people give 125 00:06:58,200 --> 00:07:00,200 Speaker 1: to let the other person know, hey, now it's your 126 00:07:00,200 --> 00:07:05,200 Speaker 1: turn to speak, or how to interject constructively or interrupt constructively, 127 00:07:05,240 --> 00:07:07,760 Speaker 1: things like that. But that seemed to kind of fascinate 128 00:07:07,800 --> 00:07:11,360 Speaker 1: her when she was transcribing these training sessions with the 129 00:07:11,360 --> 00:07:15,800 Speaker 1: prison guards, and so she got interested in that. She 130 00:07:16,120 --> 00:07:20,880 Speaker 1: ended up developing a whole system called the Jefferson transcription System, 131 00:07:21,160 --> 00:07:23,000 Speaker 1: which you're going to talk about a little bit, basically, 132 00:07:23,080 --> 00:07:25,280 Speaker 1: how to kind of make sense of all this stuff 133 00:07:26,120 --> 00:07:29,880 Speaker 1: when you're writing down how people are speaking to one 134 00:07:29,880 --> 00:07:32,800 Speaker 1: another and then later on it's kind of interesting. I 135 00:07:32,800 --> 00:07:36,920 Speaker 1: think she worked with laughter.
She was fascinated by laughter 136 00:07:37,200 --> 00:07:40,200 Speaker 1: and how that works its way into a conversation and 137 00:07:40,200 --> 00:07:44,000 Speaker 1: how someone may cue someone to laugh at something they've 138 00:07:44,000 --> 00:07:47,280 Speaker 1: even said themselves by giving a small laugh after they've 139 00:07:47,360 --> 00:07:47,920 Speaker 1: said that thing. 140 00:07:48,120 --> 00:07:55,480 Speaker 2: Yeah. The third guy is Emanuel Schegloff, and he seems 141 00:07:55,520 --> 00:07:58,520 Speaker 2: to have kind of taken the reins after Harvey Sacks 142 00:07:58,560 --> 00:08:01,840 Speaker 2: passed away, that's my take. He became the chair of 143 00:08:01,880 --> 00:08:06,080 Speaker 2: the Conversation Analysis Department at UCLA, which seems to be 144 00:08:06,480 --> 00:08:09,880 Speaker 2: the center of conversation analysis from what I can tell. 145 00:08:10,760 --> 00:08:13,600 Speaker 2: He also received a Lifetime Achievement Award from the American 146 00:08:13,680 --> 00:08:17,320 Speaker 2: Sociological Association in twenty ten. So he was a big 147 00:08:17,360 --> 00:08:18,720 Speaker 2: man on campus essentially. 148 00:08:19,760 --> 00:08:21,920 Speaker 1: Yeah, he was a big Bruin on campus. But 149 00:08:22,520 --> 00:08:25,760 Speaker 1: they kind of started out, you know, with 150 00:08:26,080 --> 00:08:28,120 Speaker 1: kind of the simpler side of things, which is like, hey, 151 00:08:28,200 --> 00:08:32,640 Speaker 1: let's look at telephone calls and just sort of everyday 152 00:08:32,679 --> 00:08:36,439 Speaker 1: interactions with people, like what do people say at the 153 00:08:36,480 --> 00:08:38,360 Speaker 1: beginning of a phone call? What do people say at 154 00:08:38,400 --> 00:08:40,840 Speaker 1: the end of a phone call?
And this is sort 155 00:08:40,840 --> 00:08:44,520 Speaker 1: of the bird's eye view of like just very basic 156 00:08:44,559 --> 00:08:49,360 Speaker 1: interactions before they got more specific with their observations, I guess. 157 00:08:49,720 --> 00:08:52,520 Speaker 2: Yeah. But also one of the genius things about studying 158 00:08:52,559 --> 00:08:56,160 Speaker 2: phone calls is how do two people who aren't looking 159 00:08:56,200 --> 00:08:58,560 Speaker 2: at each other know when it's their turn to talk? 160 00:08:58,920 --> 00:09:01,560 Speaker 2: And they don't just, exactly, they don't just sit 161 00:09:01,600 --> 00:09:03,960 Speaker 2: there and talk over one another constantly and it's just 162 00:09:04,000 --> 00:09:06,959 Speaker 2: one big jumbled mass. That's what I was talking about earlier. 163 00:09:07,000 --> 00:09:10,080 Speaker 2: We're really good at conversation. We don't even realize it. Yeah, 164 00:09:10,120 --> 00:09:12,040 Speaker 2: but one of the first things you have to do 165 00:09:12,120 --> 00:09:15,480 Speaker 2: then is not just record the phone call. You have 166 00:09:15,559 --> 00:09:18,120 Speaker 2: to transcribe it. And that's what Gail Jefferson came up with, 167 00:09:18,280 --> 00:09:21,280 Speaker 2: was that method of transcribing, and it's pretty clever. If 168 00:09:21,360 --> 00:09:23,560 Speaker 2: you know what you're looking at, you can get a 169 00:09:23,720 --> 00:09:27,199 Speaker 2: lot of information from this transcription if it used the 170 00:09:27,800 --> 00:09:28,800 Speaker 2: Jefferson method.
This is when linguist Noam Chomsky was 173 00:09:35,920 --> 00:09:38,720 Speaker 1: kind of out there in the public sphere with his 174 00:09:38,840 --> 00:09:42,720 Speaker 1: idea that there's a universal grammar and this doesn't you know, 175 00:09:43,000 --> 00:09:45,720 Speaker 1: this didn't set out to disprove that or anything like that. 176 00:09:46,080 --> 00:09:48,600 Speaker 1: It was really more of let's look at different cultures 177 00:09:48,600 --> 00:09:52,480 Speaker 1: and dynamics within a conversation. Because Chomsky, and you know 178 00:09:52,559 --> 00:09:57,120 Speaker 1: some of his cohorts, was like, you know, conversations are 179 00:09:57,200 --> 00:10:00,720 Speaker 1: just, you can't analyze this kind of stuff systematically. The 180 00:10:00,800 --> 00:10:04,240 Speaker 1: conversations are too irregular and too different between people. And 181 00:10:04,280 --> 00:10:06,520 Speaker 1: they were like, nah, I think we can actually come 182 00:10:06,600 --> 00:10:09,280 Speaker 1: up with some principles that are consistent enough to do it. 183 00:10:09,320 --> 00:10:11,520 Speaker 1: And I think they did, totally. 184 00:10:11,720 --> 00:10:14,760 Speaker 2: And one of the first projects that they started, one 185 00:10:14,760 --> 00:10:18,440 Speaker 2: of Harvey Sacks's first projects was he worked with a 186 00:10:18,440 --> 00:10:23,439 Speaker 2: psychiatric hospital, an emergency psychiatric hospital. So their work's pretty urgent, 187 00:10:24,240 --> 00:10:26,480 Speaker 2: you can imagine. And one of the things that they 188 00:10:26,480 --> 00:10:30,000 Speaker 2: wanted to figure out was how to get patients to 189 00:10:30,040 --> 00:10:33,520 Speaker 2: give them their name when they called in, because there 190 00:10:33,600 --> 00:10:35,760 Speaker 2: was a certain amount of reluctance, as you can imagine, 191 00:10:35,840 --> 00:10:38,280 Speaker 2: especially back in the mid to late sixties.
192 00:10:38,679 --> 00:10:42,840 Speaker 1: Yeah, they found out when a call was answered at 193 00:10:42,840 --> 00:10:44,839 Speaker 1: one of these places, they would say, if they said 194 00:10:44,920 --> 00:10:49,320 Speaker 1: just hello, the person might just say hello. But if 195 00:10:49,360 --> 00:10:53,000 Speaker 1: they said, well, hi, this is doctor Charles Bryant. What 196 00:10:53,000 --> 00:10:54,360 Speaker 1: can I do for you today? May I help you? 197 00:10:55,040 --> 00:10:58,800 Speaker 1: The people were much more inclined to then respond by saying, oh, well, 198 00:10:58,840 --> 00:11:02,840 Speaker 1: this is also Charles, and I'm calling because I'm having 199 00:11:02,920 --> 00:11:05,559 Speaker 1: some intrusive thoughts or something like that. 200 00:11:05,440 --> 00:11:08,319 Speaker 2: Right, And then the receptionists would go, ah, I got you, 201 00:11:08,360 --> 00:11:10,480 Speaker 2: I got your name, we know your name now. 202 00:11:13,000 --> 00:11:16,439 Speaker 1: Yeah. Sometimes they found that people would not respond in 203 00:11:16,520 --> 00:11:20,319 Speaker 1: kind with their name, and in those cases, it's pretty 204 00:11:20,360 --> 00:11:22,600 Speaker 1: interesting and this, you know, kind of provided another little 205 00:11:22,920 --> 00:11:26,120 Speaker 1: nugget of information for how these things go. Yeah, when 206 00:11:26,120 --> 00:11:28,720 Speaker 1: they did not say, oh, yeah, my name's Chuck and 207 00:11:28,760 --> 00:11:31,560 Speaker 1: I'm having, you know, intrusive thoughts, they would sort of 208 00:11:31,960 --> 00:11:35,160 Speaker 1: introduce like an like a disruptor. 
They would say something 209 00:11:35,200 --> 00:11:38,600 Speaker 1: like huh or what and just a small little bump 210 00:11:38,600 --> 00:11:42,480 Speaker 1: in the road to change the conversational flow, right, subtly 211 00:11:42,559 --> 00:11:44,400 Speaker 1: kind of saying like I don't want to give you 212 00:11:44,440 --> 00:11:46,000 Speaker 1: my name without saying I don't want to give you 213 00:11:46,000 --> 00:11:46,559 Speaker 1: my name. 214 00:11:46,600 --> 00:11:50,920 Speaker 2: Right, Because what they found, conversation analysts found was that 215 00:11:51,120 --> 00:11:56,520 Speaker 2: we follow set patterns, these kind of prescribed rhythms of conversations. 216 00:11:56,760 --> 00:12:01,480 Speaker 2: So if you interrupt this the flow of one type 217 00:12:01,480 --> 00:12:04,600 Speaker 2: of conversation would say a huh, a new set of 218 00:12:04,679 --> 00:12:08,200 Speaker 2: rules comes up that takes the conversation from there that 219 00:12:08,320 --> 00:12:10,920 Speaker 2: both people are aware of but don't realize they're aware of, 220 00:12:11,640 --> 00:12:14,079 Speaker 2: which to me, I hadn't thought about it. But if 221 00:12:14,080 --> 00:12:17,200 Speaker 2: you've ever said huh to somebody when you knew full 222 00:12:17,240 --> 00:12:20,040 Speaker 2: well what they had just said, you were just reflexively 223 00:12:20,040 --> 00:12:23,679 Speaker 2: trying to derail or disrupt that type of script in 224 00:12:23,679 --> 00:12:26,200 Speaker 2: favor of a different one. Never realized that before, but 225 00:12:26,480 --> 00:12:27,439 Speaker 2: it makes sense. 226 00:12:27,320 --> 00:12:30,720 Speaker 1: Or maybe by time even, but it's just some sort 227 00:12:30,760 --> 00:12:34,680 Speaker 1: of a disruptor to divert something for some reason. 
Sacks 228 00:12:34,720 --> 00:12:40,000 Speaker 1: identified another thing called composites, and they're phrases that are 229 00:12:40,320 --> 00:12:43,680 Speaker 1: kind of combined as a unit, and it usually is 230 00:12:43,720 --> 00:12:46,280 Speaker 1: a prompt for some kind of response, Like if someone 231 00:12:46,280 --> 00:12:48,359 Speaker 1: says may I help you, like that on the telephone, 232 00:12:49,200 --> 00:12:52,880 Speaker 1: what they're doing then obviously is asking you for a response. 233 00:12:52,920 --> 00:12:54,880 Speaker 1: They'll let you know what's going on, and in the 234 00:12:54,880 --> 00:12:58,080 Speaker 1: case of like an emergency call center, they might literally 235 00:12:58,320 --> 00:13:00,480 Speaker 1: respond to may I help you by saying I don't know. 236 00:13:01,160 --> 00:13:04,120 Speaker 1: And what they found was that 237 00:13:04,120 --> 00:13:07,280 Speaker 1: that was a reasonable response, like somebody might literally say like, 238 00:13:07,640 --> 00:13:09,760 Speaker 1: I don't know if you can help me, and it 239 00:13:09,840 --> 00:13:12,000 Speaker 1: needed to be sort of taken at face value, like. 240 00:13:11,960 --> 00:13:16,839 Speaker 2: That, right. What that suggested, that Sacks discovered, was that 241 00:13:16,880 --> 00:13:19,240 Speaker 2: there were these composites where if you're saying, may I 242 00:13:19,240 --> 00:13:22,200 Speaker 2: help you, you don't mean it at face value.
It's 243 00:13:22,640 --> 00:13:24,920 Speaker 2: a part of, I think, what would be called 244 00:13:25,000 --> 00:13:29,240 Speaker 2: later an adjacency pair, where you prompt, you say something 245 00:13:29,720 --> 00:13:33,240 Speaker 2: that's a prompt and there's an expected like range of 246 00:13:33,480 --> 00:13:37,920 Speaker 2: responses to it, and anything outside of that is like Okay, 247 00:13:38,160 --> 00:13:41,559 Speaker 2: that makes sense on paper, but it doesn't make sense conversationally. 248 00:13:41,960 --> 00:13:45,160 Speaker 2: And he kind of supported this with the idea he 249 00:13:45,200 --> 00:13:47,840 Speaker 2: wrote in a nineteen seventy five paper that everybody has 250 00:13:47,880 --> 00:13:51,880 Speaker 2: to lie, yeah. And he used the example of like 251 00:13:52,120 --> 00:13:56,400 Speaker 2: a greeting among people meeting on the street, where you say, like, 252 00:13:56,520 --> 00:13:59,400 Speaker 2: how are you doing, and if the person says anything 253 00:13:59,480 --> 00:14:02,760 Speaker 2: other than fine or great or good, they have just 254 00:14:02,920 --> 00:14:07,319 Speaker 2: violated this type of composite prompt. You're not supposed to 255 00:14:07,360 --> 00:14:10,200 Speaker 2: say anything else. And even more interesting than this, Chuck, 256 00:14:10,280 --> 00:14:13,760 Speaker 2: is that they seem to have found that this is 257 00:14:13,800 --> 00:14:17,280 Speaker 2: actually universal. It's not just like among Americans or English 258 00:14:17,280 --> 00:14:21,360 Speaker 2: speakers or Germans or anything like that. Everybody essentially does 259 00:14:21,400 --> 00:14:23,680 Speaker 2: not want to know how you're actually doing.
260 00:14:24,320 --> 00:14:27,880 Speaker 1: Yeah, And I found that that's a very good indicator 261 00:14:27,920 --> 00:14:30,840 Speaker 1: of closeness and how you know that you've developed a 262 00:14:30,880 --> 00:14:34,760 Speaker 1: true like closeness with someone else, like a friendship or whatever, 263 00:14:35,480 --> 00:14:38,360 Speaker 1: because that's much more of a formal thing. Even if 264 00:14:38,400 --> 00:14:40,640 Speaker 1: you know somebody but don't know them that well, you'll 265 00:14:40,640 --> 00:14:42,440 Speaker 1: say oh, like yeah, yeah, I'm doing fine. They're like, oh, 266 00:14:42,480 --> 00:14:44,800 Speaker 1: pretty good. But if it's somebody you really know and 267 00:14:44,800 --> 00:14:48,760 Speaker 1: you're close to, you don't have to lie. You can 268 00:14:48,880 --> 00:14:51,920 Speaker 1: very easily say I'm super tired or I'm not doing 269 00:14:51,960 --> 00:14:53,040 Speaker 1: great because xyz. 270 00:14:53,360 --> 00:14:55,720 Speaker 2: Yeah. If you find somebody who actually does want to 271 00:14:55,760 --> 00:14:57,360 Speaker 2: know how you're doing, you hang on to that 272 00:14:57,280 --> 00:15:00,480 Speaker 1: person. Right. Or if you meet someone out of the blue and 273 00:15:00,520 --> 00:15:02,640 Speaker 1: say how you doing and they start in with the truth, 274 00:15:02,720 --> 00:15:03,440 Speaker 1: then just walk away. 275 00:15:03,320 --> 00:15:05,840 Speaker 2: Right, maybe even jog away. 276 00:15:06,280 --> 00:15:07,120 Speaker 1: Yeah, red flag. 277 00:15:08,360 --> 00:15:11,520 Speaker 2: Uh So. One of the other big breakthroughs came 278 00:15:11,560 --> 00:15:16,000 Speaker 2: along when you could rent VCRs and they had giant 279 00:15:16,800 --> 00:15:19,840 Speaker 2: recording equipment like the kind they used in Poltergeist.
280 00:15:20,280 --> 00:15:24,160 Speaker 2: That changed conversation analysis, where now all of a sudden 281 00:15:24,160 --> 00:15:26,320 Speaker 2: you could see all the stuff that goes along with it. 282 00:15:26,320 --> 00:15:29,720 Speaker 2: It wasn't just telephone calls from disembodied voices. You could 283 00:15:29,720 --> 00:15:32,360 Speaker 2: see how people interacted and it opened up this whole 284 00:15:32,400 --> 00:15:34,040 Speaker 2: new world for sure. 285 00:15:34,560 --> 00:15:35,480 Speaker 1: Should we take a break? 286 00:15:35,840 --> 00:15:37,120 Speaker 2: I knew you were gonna say that. 287 00:15:38,040 --> 00:15:39,440 Speaker 1: All right, we'll be right back. 288 00:15:39,640 --> 00:15:46,120 Speaker 2: Stucks, It sucks, you know it stucks. It's a great name. 289 00:15:46,200 --> 00:15:49,280 Speaker 1: Yeah, that's the name of it. 290 00:15:49,280 --> 00:15:51,600 Speaker 2: It's a great name. All right, Stucks met. 291 00:15:51,520 --> 00:16:07,000 Speaker 1: With an X. You know, before we broke you talked 292 00:16:07,000 --> 00:16:10,880 Speaker 1: about the huge cameras, and I'm reading Matthew Modine's diary 293 00:16:11,560 --> 00:16:15,120 Speaker 1: of making Full Metal Jacket, you know, the actor 294 00:16:15,160 --> 00:16:15,800 Speaker 1: Matthew Modine. 295 00:16:15,920 --> 00:16:17,960 Speaker 2: Of course I know exactly what you're talking about. But 296 00:16:18,280 --> 00:16:20,280 Speaker 2: number one, I can't believe there is such a thing. 297 00:16:20,320 --> 00:16:22,000 Speaker 2: And number two, I can't believe you're reading it. 298 00:16:22,480 --> 00:16:24,840 Speaker 1: Oh, it's great. It's his diary when he was making 299 00:16:24,920 --> 00:16:27,440 Speaker 1: the movie. Casey actually got it for me when we 300 00:16:27,440 --> 00:16:30,000 Speaker 1: were doing Movie Crush. Very sweet gift.
But at 301 00:16:30,040 --> 00:16:34,480 Speaker 1: one point he was talking about having to put yourself 302 00:16:34,520 --> 00:16:38,000 Speaker 1: on tape for an audition, which is something routinely done 303 00:16:38,040 --> 00:16:40,280 Speaker 1: all the time now, especially since the writer strikes and 304 00:16:40,360 --> 00:16:43,200 Speaker 1: COVID and stuff. But he was like, it's just such 305 00:16:43,200 --> 00:16:45,160 Speaker 1: a pain. You got to know somebody who has one 306 00:16:45,160 --> 00:16:47,320 Speaker 1: of those huge video cameras and you have to go 307 00:16:47,360 --> 00:16:49,240 Speaker 1: to their studio and blah blah blah. 308 00:16:49,440 --> 00:16:51,600 Speaker 2: It was just like it was very cute and quaint. Yep, 309 00:16:51,640 --> 00:16:53,280 Speaker 2: my niece Mila has to do that a lot. 310 00:16:53,960 --> 00:16:56,000 Speaker 1: Yeah, yeah, it's super. It's kind of the way 311 00:16:56,000 --> 00:16:56,600 Speaker 1: it's done now. 312 00:16:56,640 --> 00:17:00,960 Speaker 2: For sure, sure, but yeah, I imagine it's, I 313 00:17:00,960 --> 00:17:03,240 Speaker 2: don't know which would be worse, doing it live in 314 00:17:03,240 --> 00:17:05,199 Speaker 2: front of people, or doing it in front of a 315 00:17:05,240 --> 00:17:08,080 Speaker 2: recording that you're getting zero feedback from. 316 00:17:07,960 --> 00:17:10,960 Speaker 1: Every actor I know hates putting themselves on tape. 317 00:17:10,960 --> 00:17:11,960 Speaker 1: They would much rather be in the room. 318 00:17:12,000 --> 00:17:18,000 Speaker 2: Gotcha, because they're all energy vampires, right exactly.
So 319 00:17:18,640 --> 00:17:20,760 Speaker 2: one of the things I kind of alluded to earlier 320 00:17:20,800 --> 00:17:24,160 Speaker 2: is that conversation analysis is not a standard social science, 321 00:17:24,200 --> 00:17:27,200 Speaker 2: in that it doesn't develop theories of why people are 322 00:17:27,400 --> 00:17:30,119 Speaker 2: doing these things or why you said this when somebody 323 00:17:30,160 --> 00:17:33,120 Speaker 2: else said that. Again, they're just looking for patterns. They're 324 00:17:33,119 --> 00:17:35,880 Speaker 2: looking at its structure. And the cool thing about it 325 00:17:35,920 --> 00:17:38,480 Speaker 2: is that that doesn't mean that they're not deriving any meaning. 326 00:17:39,160 --> 00:17:42,640 Speaker 2: They're not postulating what it means. Like for example, they're 327 00:17:42,640 --> 00:17:46,680 Speaker 2: not going up to two listeners or two speakers and 328 00:17:46,840 --> 00:17:49,159 Speaker 2: going to speaker number two and saying, what do 329 00:17:49,200 --> 00:17:51,680 Speaker 2: you think speaker number one meant when they said how 330 00:17:51,720 --> 00:17:56,640 Speaker 2: are you doing? They just analyze the conversation, and based 331 00:17:56,640 --> 00:18:02,240 Speaker 2: on speaker two's response, that tells the conversation analysts what 332 00:18:02,600 --> 00:18:05,399 Speaker 2: speaker number two thought speaker number one was saying. So 333 00:18:05,840 --> 00:18:08,399 Speaker 2: just by examining it they can come up with meaning 334 00:18:08,520 --> 00:18:11,919 Speaker 2: or derive meaning from it. And again, it's just not 335 00:18:12,119 --> 00:18:15,240 Speaker 2: like other social sciences, and it seems to really stick 336 00:18:15,280 --> 00:18:17,040 Speaker 2: in the craw of everybody else. I love it.
337 00:18:17,720 --> 00:18:20,720 Speaker 1: Yeah, they also didn't want to They wanted to be 338 00:18:20,760 --> 00:18:23,360 Speaker 1: as organic as possible and just have people have naturally 339 00:18:24,200 --> 00:18:27,520 Speaker 1: natural conversations with each other rather than orchestrating some big 340 00:18:27,600 --> 00:18:31,960 Speaker 1: like scenarios. You do have to just because of scientific 341 00:18:32,040 --> 00:18:34,440 Speaker 1: ethics and stuff, you have to tell people they're being recorded, 342 00:18:34,760 --> 00:18:37,040 Speaker 1: so you can't truly be just a fly on the wall. 343 00:18:37,400 --> 00:18:41,000 Speaker 1: But they did find that just the introduction of a 344 00:18:41,080 --> 00:18:43,960 Speaker 1: recording device they didn't feel like and I think they've 345 00:18:44,000 --> 00:18:47,840 Speaker 1: shown through evidence that it didn't really significantly change things 346 00:18:47,960 --> 00:18:51,560 Speaker 1: enough to where the result was thrown out or whatever. 347 00:18:51,720 --> 00:18:56,440 Speaker 2: Right, right, And like we said that, when you are 348 00:18:56,560 --> 00:19:00,920 Speaker 2: beginning a conversation analysis, you start with a recording of 349 00:19:00,960 --> 00:19:06,399 Speaker 2: a conversation. Nowadays it's with videotapes, and then you transcribe it. 350 00:19:06,440 --> 00:19:09,560 Speaker 2: And one of the big things in conversation analysis is 351 00:19:09,560 --> 00:19:11,840 Speaker 2: when you transcribe it, you need to do it as 352 00:19:11,840 --> 00:19:15,119 Speaker 2: objectively as possible. You need to keep out your own 353 00:19:15,440 --> 00:19:19,800 Speaker 2: subjective thoughts about who did what and just faithfully say 354 00:19:19,920 --> 00:19:23,280 Speaker 2: this was an interruption, this was a TCU. This person 355 00:19:23,920 --> 00:19:26,080 Speaker 2: took a break breath in the middle of their word. 
356 00:19:27,359 --> 00:19:31,000 Speaker 2: Josh just said he corrected himself in the middle of 357 00:19:31,000 --> 00:19:33,640 Speaker 2: the sentence. So he just used a repair. And you notate 358 00:19:33,680 --> 00:19:38,560 Speaker 2: all this stuff without any subjective input from you, and 359 00:19:38,600 --> 00:19:40,960 Speaker 2: then you go back and you analyze it after it's 360 00:19:40,960 --> 00:19:42,200 Speaker 2: been fully transcribed. 361 00:19:42,840 --> 00:19:46,639 Speaker 1: Yeah, and so you've been very cleverly, I might say, 362 00:19:47,960 --> 00:19:50,879 Speaker 1: subtly dropping in little little words here and there that 363 00:19:50,920 --> 00:19:53,200 Speaker 1: people are like, what's he talking about with this stuff? 364 00:19:53,880 --> 00:19:55,760 Speaker 1: That's the stuff that they're looking for, and that's the 365 00:19:55,760 --> 00:19:59,880 Speaker 1: stuff that they named, like things that like in common parlance, 366 00:20:00,200 --> 00:20:03,200 Speaker 1: we know some of these things like rejoinders and interruptions 367 00:20:03,200 --> 00:20:05,399 Speaker 1: and things like that. But you know they're analysts, so 368 00:20:05,440 --> 00:20:08,479 Speaker 1: they took it a step further. And here are some 369 00:20:08,520 --> 00:20:10,600 Speaker 1: of those right now. One of them is called a 370 00:20:10,720 --> 00:20:15,879 Speaker 1: turn constructional unit, TCU, obviously not Texas Christian University. 371 00:20:17,000 --> 00:20:20,399 Speaker 2: There are frogs, horn toads, horn frogs. I think something 372 00:20:20,520 --> 00:20:22,879 Speaker 2: like that, Yeah, chief and tree frogs. 373 00:20:23,400 --> 00:20:27,240 Speaker 1: I think that's it. But turn construction units are the 374 00:20:27,240 --> 00:20:31,320 Speaker 1: building blocks of any conversation, of every conversation.
And it 375 00:20:31,359 --> 00:20:35,160 Speaker 1: can be just a gesture like a nod at somebody, 376 00:20:35,200 --> 00:20:38,399 Speaker 1: it can be multiple sentences, but they end up with 377 00:20:38,480 --> 00:20:42,679 Speaker 1: what's called a transition relevance place, a TRP, and that 378 00:20:42,840 --> 00:20:46,359 Speaker 1: is a moment where like what you've said has ended 379 00:20:46,920 --> 00:20:49,760 Speaker 1: and someone else may have a turn to speak now, 380 00:20:49,960 --> 00:20:52,720 Speaker 1: or you may say something else after that, and that's 381 00:20:52,800 --> 00:20:58,440 Speaker 1: just you taking another turn, basically and having two turn 382 00:20:58,560 --> 00:21:00,000 Speaker 1: constructional units in a row. 383 00:21:01,040 --> 00:21:03,360 Speaker 2: And so, just right after you said in a row, 384 00:21:03,480 --> 00:21:07,040 Speaker 2: that's where the transition relevance place was. That was it 385 00:21:07,320 --> 00:21:10,280 Speaker 2: because it gave me a chance to start talking, and 386 00:21:10,440 --> 00:21:12,800 Speaker 2: everything you said leading up to it was the turn 387 00:21:12,920 --> 00:21:16,760 Speaker 2: constructional unit. That was your turn. You took your turn 388 00:21:16,920 --> 00:21:20,040 Speaker 2: in the conversation. There was a pause that allowed me 389 00:21:20,080 --> 00:21:23,639 Speaker 2: to start taking my turn, And those are the basic building 390 00:21:23,680 --> 00:21:24,879 Speaker 2: blocks of the conversation. 391 00:21:25,560 --> 00:21:28,239 Speaker 1: That's right, Okay, see, you're good at this. 392 00:21:28,400 --> 00:21:34,080 Speaker 2: There's also when I analyze it, sure, there's also like 393 00:21:34,160 --> 00:21:37,560 Speaker 2: a lot of different I guess rules or exceptions or whatever.
394 00:21:37,680 --> 00:21:40,200 Speaker 2: Like you could say, you could use two turns in 395 00:21:40,240 --> 00:21:44,240 Speaker 2: a row without really allowing for a transition relevance place, 396 00:21:44,320 --> 00:21:47,800 Speaker 2: that pause that allows the other speaker to start. For example, 397 00:21:49,440 --> 00:21:52,320 Speaker 2: you could say, are you hungry? I could go for 398 00:21:52,359 --> 00:21:55,240 Speaker 2: a burger. You actually just took two turns like a 399 00:21:55,240 --> 00:22:00,280 Speaker 2: big fat hog without any pause in the middle. And 400 00:22:00,400 --> 00:22:03,920 Speaker 2: yet that's not considered like any sort of violation of conversation. 401 00:22:04,080 --> 00:22:07,240 Speaker 2: It's just again, it's like an exception. It's a way 402 00:22:07,280 --> 00:22:09,480 Speaker 2: that we've kind of we're so good at conversation, we 403 00:22:09,560 --> 00:22:11,800 Speaker 2: can show off by taking two turns in a row 404 00:22:12,040 --> 00:22:14,000 Speaker 2: and not mess up the flow of conversation. 405 00:22:14,760 --> 00:22:17,840 Speaker 1: Yeah, exactly if it's and we have to point out 406 00:22:17,840 --> 00:22:19,679 Speaker 1: too that a lot of times they were looking at 407 00:22:19,720 --> 00:22:23,080 Speaker 1: conversations between just two people, but you can also analyze 408 00:22:23,080 --> 00:22:25,879 Speaker 1: conversations in groups. It's just sort of a different beast. 409 00:22:26,359 --> 00:22:29,600 Speaker 1: But in conversations with more than two people, at that 410 00:22:29,680 --> 00:22:33,560 Speaker 1: transition relevance place, it's probably someone else's turn to talk. 411 00:22:34,560 --> 00:22:36,680 Speaker 1: Like if you're a group of people at a dinner 412 00:22:36,760 --> 00:22:39,639 Speaker 1: party and you're telling a story, it's very common to 413 00:22:39,920 --> 00:22:43,200 Speaker 1: finish up the story and not just stare blankly into space.
414 00:22:43,240 --> 00:22:45,520 Speaker 1: But you finish up the story and maybe look at 415 00:22:45,520 --> 00:22:48,639 Speaker 1: one particular person, and there may be a reason for that. 416 00:22:48,760 --> 00:22:51,679 Speaker 1: Maybe it's your person, or maybe it's the person you 417 00:22:51,760 --> 00:22:54,840 Speaker 1: originally sort of started this story in reference to what 418 00:22:54,960 --> 00:22:56,680 Speaker 1: this other person was saying, so you'll kind of turn 419 00:22:56,720 --> 00:22:58,880 Speaker 1: it back to them. But you know, that's one way 420 00:22:58,960 --> 00:23:01,680 Speaker 1: you can sort of indicate like, hey, now I'm looking 421 00:23:01,760 --> 00:23:04,240 Speaker 1: at you, and that they may not speak at that point. 422 00:23:04,280 --> 00:23:05,840 Speaker 1: It may, you know, someone else may jump in. It 423 00:23:05,880 --> 00:23:07,040 Speaker 1: just sort of depends, right. 424 00:23:07,520 --> 00:23:09,960 Speaker 2: You could also make a finger gun and go at 425 00:23:09,960 --> 00:23:14,199 Speaker 2: that person and they and they'll take over. One of 426 00:23:14,200 --> 00:23:16,280 Speaker 2: the things. I laughed a minute ago when you talked 427 00:23:16,320 --> 00:23:20,880 Speaker 2: about staring blankly at the ceiling. It's so silly when 428 00:23:20,920 --> 00:23:23,639 Speaker 2: you when you just you just take out like a 429 00:23:23,680 --> 00:23:26,200 Speaker 2: proper response and put in something else. It just makes 430 00:23:26,240 --> 00:23:29,760 Speaker 2: me laugh every time because it's so prescribed like this 431 00:23:29,920 --> 00:23:33,120 Speaker 2: these scripts are so prescribed that doing anything other than 432 00:23:33,160 --> 00:23:36,040 Speaker 2: that is is just absurd and hilarious. 433 00:23:36,560 --> 00:23:39,040 Speaker 1: Yeah, I mean, it's it's like hidden camera material. 434 00:23:39,200 --> 00:23:44,840 Speaker 2: You know. 
So I mentioned, I think, I corrected myself. 435 00:23:45,480 --> 00:23:49,520 Speaker 2: Actually it's considered an interruption when I misspoke and said 436 00:23:49,600 --> 00:23:53,520 Speaker 2: breath weirdly and then said it again correctly right after. 437 00:23:54,000 --> 00:23:56,639 Speaker 2: I actually interrupted myself, and I referred to that as 438 00:23:56,680 --> 00:23:59,760 Speaker 2: a repair mechanism. Yeah, that's exactly what it is, because 439 00:23:59,800 --> 00:24:04,719 Speaker 2: one of the unsung parts of human interaction and conversation 440 00:24:05,320 --> 00:24:08,040 Speaker 2: is that we have to have ways of correcting and 441 00:24:08,080 --> 00:24:13,520 Speaker 2: adjusting misunderstandings. If we didn't, we would still be able to converse, 442 00:24:13,880 --> 00:24:17,359 Speaker 2: but two people would walk away from the conversation potentially 443 00:24:17,359 --> 00:24:22,160 Speaker 2: with totally different understandings of the information that was just exchanged. Right, 444 00:24:22,480 --> 00:24:25,760 Speaker 2: So we have to be able to correct ourselves when 445 00:24:25,800 --> 00:24:28,639 Speaker 2: we know we made a mistake, and then also conversely, 446 00:24:28,720 --> 00:24:32,360 Speaker 2: we need to be able to ask for clarification if 447 00:24:32,400 --> 00:24:34,439 Speaker 2: we didn't understand what the person was saying. 448 00:24:34,920 --> 00:24:38,119 Speaker 1: Yeah, or someone else may ask for that clarification or 449 00:24:38,119 --> 00:24:40,600 Speaker 1: something like that. So the repair doesn't have to come from you. 450 00:24:40,720 --> 00:24:43,280 Speaker 1: It's not necessarily a self-repair always.
451 00:24:42,960 --> 00:24:45,040 Speaker 2: Right, But it doesn't mean like going to the person 452 00:24:45,080 --> 00:24:47,439 Speaker 2: and being like, I'm really sorry, and I'm not going 453 00:24:47,520 --> 00:24:49,240 Speaker 2: to do this anymore, and here's how we're going to 454 00:24:49,320 --> 00:24:51,800 Speaker 2: do it better from this point forward. Not that kind 455 00:24:51,840 --> 00:24:52,280 Speaker 2: of repair. 456 00:24:52,920 --> 00:24:57,200 Speaker 1: There are also gaps, something they identify when they're analyzing conversations, 457 00:24:57,200 --> 00:24:59,679 Speaker 1: and we all know what that is. I believe 458 00:25:00,000 --> 00:25:01,920 Speaker 1: I've always called them, and friends have called them, awkward 459 00:25:01,920 --> 00:25:05,160 Speaker 1: pauses, when it's not clear who the next speaker 460 00:25:05,280 --> 00:25:08,520 Speaker 1: is going to be, and that can happen, like you 461 00:25:08,560 --> 00:25:11,800 Speaker 1: know you can, and even with groups of close friends 462 00:25:11,800 --> 00:25:13,919 Speaker 1: in a very social situation. In fact, I feel like 463 00:25:13,920 --> 00:25:17,760 Speaker 1: that's when it's most sort of noticeable, is when like 464 00:25:17,800 --> 00:25:19,920 Speaker 1: you're at a dinner party or something, and everyone's laughing 465 00:25:19,920 --> 00:25:22,359 Speaker 1: and saying things, and then everyone just draws a blank 466 00:25:22,400 --> 00:25:24,920 Speaker 1: for a couple of beats. Sure, and then someone will 467 00:25:25,000 --> 00:25:27,480 Speaker 1: usually say like awkward pause or something like that and 468 00:25:27,960 --> 00:25:30,800 Speaker 1: not say that, like, technically it's a conversation gap.
469 00:25:31,480 --> 00:25:34,440 Speaker 2: That's, I think, a good replacement now, because 470 00:25:34,440 --> 00:25:36,359 Speaker 2: awkward pause is so used up, to just be like, 471 00:25:36,400 --> 00:25:42,360 Speaker 2: conversation gap. So it can be uncomfortable when it happens naturally, 472 00:25:42,560 --> 00:25:45,240 Speaker 2: naturally like that, like every everybody's just kind of run 473 00:25:45,280 --> 00:25:48,000 Speaker 2: out of things to say about whatever that conversation was. 474 00:25:48,680 --> 00:25:54,120 Speaker 2: It's even more uncomfortable when somebody misses their turn to speak, right, 475 00:25:54,160 --> 00:25:57,600 Speaker 2: they don't clearly, yes, well, they don't give any sort 476 00:25:57,600 --> 00:26:01,960 Speaker 2: of response, right. That is, that can cause a gap, 477 00:26:02,960 --> 00:26:05,840 Speaker 2: and a lot of times you can signal that by 478 00:26:06,320 --> 00:26:11,440 Speaker 2: like repeating the question you just asked, saying the punchline 479 00:26:11,440 --> 00:26:14,560 Speaker 2: one more time, saying something like what do you think 480 00:26:14,600 --> 00:26:19,040 Speaker 2: about that? A prompting thing, And then there's also nonverbal 481 00:26:19,800 --> 00:26:22,520 Speaker 2: ways of signaling gaps, like putting both hands in the 482 00:26:22,520 --> 00:26:24,560 Speaker 2: air and going into a forward lunge. 483 00:26:25,280 --> 00:26:27,520 Speaker 1: Right. I've always found that if the joke doesn't go over, 484 00:26:27,600 --> 00:26:30,200 Speaker 1: just merely repeating the punchline again always. 485 00:26:29,880 --> 00:26:32,960 Speaker 2: Works exactly, over and over and over again. 486 00:26:33,040 --> 00:26:37,960 Speaker 1: You may not have heard me mention adjacency pairs.
I 487 00:26:38,000 --> 00:26:42,440 Speaker 1: think you might have mentioned that, But that's when 488 00:26:42,440 --> 00:26:45,760 Speaker 1: a specific kind of response is expected, so like how's 489 00:26:45,760 --> 00:26:49,479 Speaker 1: it going, Hey, I'm doing pretty good. They're also a 490 00:26:49,480 --> 00:26:52,399 Speaker 1: lot of times referred to as pre-sequences, so just 491 00:26:52,440 --> 00:26:54,800 Speaker 1: sort of like it's sort of like a pat answer almost. 492 00:26:54,560 --> 00:26:58,600 Speaker 2: Yeah, like come in, won't you? Thank you. That'd be invitation-acceptance, 493 00:26:58,680 --> 00:27:03,720 Speaker 2: greeting-greeting, question-answer. There's actually a lot of those. 494 00:27:03,800 --> 00:27:07,359 Speaker 2: Those are the ones that are maybe even the silliest 495 00:27:07,400 --> 00:27:09,840 Speaker 2: when you replace it. Like if somebody says would you 496 00:27:09,960 --> 00:27:12,720 Speaker 2: like a slice of cake? And you go hello, Right, 497 00:27:12,760 --> 00:27:13,480 Speaker 2: it doesn't work. 498 00:27:15,119 --> 00:27:18,919 Speaker 1: Well, what if you went hello, exactly. 499 00:27:18,600 --> 00:27:20,639 Speaker 2: I was gonna say the same thing. Like, we've figured 500 00:27:20,640 --> 00:27:23,199 Speaker 2: out ways around that. You can massage the rules and 501 00:27:23,240 --> 00:27:25,719 Speaker 2: get even more creative with the whole thing, that you 502 00:27:25,760 --> 00:27:28,880 Speaker 2: could use something that's totally inappropriate and make it appropriate. 503 00:27:30,000 --> 00:27:33,000 Speaker 2: The best thing about something that funny is explaining it 504 00:27:33,040 --> 00:27:34,359 Speaker 2: to death. 505 00:27:34,640 --> 00:27:36,760 Speaker 1: All right, here, you ask me for cake one more time? 506 00:27:36,760 --> 00:27:37,320 Speaker 1: I got one more. 507 00:27:37,359 --> 00:27:39,680 Speaker 2: Okay, would you like a slice of cake?
508 00:27:40,440 --> 00:27:41,400 Speaker 1: Cake yourself. 509 00:27:43,480 --> 00:27:46,639 Speaker 2: That works. It kind of works to an extent for sure. 510 00:27:47,840 --> 00:27:50,680 Speaker 1: Stories is another one. When you mentioned the staring into 511 00:27:50,680 --> 00:27:53,240 Speaker 1: space at the end of like a question to you, 512 00:27:53,680 --> 00:27:55,800 Speaker 1: that can also happen at the end of a story 513 00:27:55,880 --> 00:27:57,840 Speaker 1: if you don't know what you're doing as a as 514 00:27:57,840 --> 00:28:00,920 Speaker 1: a communicator, when you start a story, you a lot 515 00:28:00,920 --> 00:28:03,640 Speaker 1: of times give an indicator that you know that it's 516 00:28:03,680 --> 00:28:06,840 Speaker 1: going to be you for a minute or two by 517 00:28:06,880 --> 00:28:09,320 Speaker 1: saying something like didn't I ever tell you about or 518 00:28:09,359 --> 00:28:11,840 Speaker 1: something like that, or to get a sort of interest: Yeah, 519 00:28:11,880 --> 00:28:13,520 Speaker 1: get a load of this, or I'll tell you what 520 00:28:13,520 --> 00:28:16,119 Speaker 1: happened to me, And then you start your sort of story, 521 00:28:16,160 --> 00:28:18,560 Speaker 1: and usually you will end it by kind of looking 522 00:28:18,640 --> 00:28:21,879 Speaker 1: in someone's direction. And that's when you should sort of 523 00:28:21,960 --> 00:28:25,560 Speaker 1: acknowledge by either saying like, oh man, that's so funny 524 00:28:25,680 --> 00:28:28,360 Speaker 1: or just something like that, and not just stare blankly 525 00:28:28,440 --> 00:28:29,200 Speaker 1: back at somebody. 526 00:28:29,280 --> 00:28:32,560 Speaker 2: So I have I'm guilty of doing that, especially before 527 00:28:32,600 --> 00:28:36,200 Speaker 2: I started treating my ADHD.
My mind would wander very 528 00:28:36,320 --> 00:28:39,120 Speaker 2: easily when somebody was telling a story, and I would 529 00:28:39,120 --> 00:28:42,000 Speaker 2: know that I really muffed it when the person would 530 00:28:42,040 --> 00:28:44,360 Speaker 2: look at me and then have to feel like they 531 00:28:44,400 --> 00:28:47,360 Speaker 2: had to explain why I should be reacting more than 532 00:28:47,400 --> 00:28:49,640 Speaker 2: I am. And then I'd be like, oh, yeah, that 533 00:28:49,720 --> 00:28:52,320 Speaker 2: really sucks that that happened to you. It was not 534 00:28:52,640 --> 00:28:56,680 Speaker 2: It doesn't really make for good interactions, really makes people 535 00:28:56,760 --> 00:28:58,000 Speaker 2: want to stay away from you. 536 00:28:58,320 --> 00:28:59,880 Speaker 1: Oh, Josh really dug that one? 537 00:29:00,080 --> 00:29:04,960 Speaker 2: Huh right, like huh what? Because yeah, you don't say 538 00:29:05,040 --> 00:29:07,240 Speaker 2: huh what. I wasn't paying attention. You try to play 539 00:29:07,240 --> 00:29:09,040 Speaker 2: it off, and that actually makes it worse. 540 00:29:09,680 --> 00:29:13,520 Speaker 1: Yeah, there are discourse markers, and they're just sort of words 541 00:29:13,600 --> 00:29:20,600 Speaker 1: or phrases that, like, organizationally help out, like oh, or because, 542 00:29:21,120 --> 00:29:24,320 Speaker 1: and you're usually like connecting something to something that came 543 00:29:24,360 --> 00:29:25,120 Speaker 1: before it, right. 544 00:29:25,680 --> 00:29:28,400 Speaker 2: And then the last one is laminated action, which is 545 00:29:29,080 --> 00:29:32,360 Speaker 2: when you combine it with a gesture so that it doesn't 546 00:29:32,440 --> 00:29:35,520 Speaker 2: just change the meaning, it actually completes the meaning.
Yeah, 547 00:29:35,640 --> 00:29:39,000 Speaker 2: Olivia gave an example of when you say, oh, yeah, 548 00:29:39,040 --> 00:29:42,520 Speaker 2: I've met him and you roll your eyes. Right, if 549 00:29:42,560 --> 00:29:45,760 Speaker 2: you just say oh yeah, I've met him, even that 550 00:29:45,800 --> 00:29:49,040 Speaker 2: same intonation, it doesn't tell the person what you actually 551 00:29:49,080 --> 00:29:52,000 Speaker 2: think about them. You roll your eyes, then they get 552 00:29:52,040 --> 00:29:55,520 Speaker 2: the whole picture. You've met them, you've judged them, you 553 00:29:55,560 --> 00:29:58,600 Speaker 2: can't stand them, you wish they were dead, dead dead. 554 00:29:59,120 --> 00:30:01,840 Speaker 1: Yeah, an eye roll can say all those things exactly. 555 00:30:03,400 --> 00:30:05,440 Speaker 2: Should we take another break now or keep going? 556 00:30:07,200 --> 00:30:09,640 Speaker 1: Let's talk about overlap maybe and then we can take 557 00:30:09,640 --> 00:30:12,920 Speaker 1: a break. How does that feel? The idea: So overlap 558 00:30:13,120 --> 00:30:17,720 Speaker 1: is a really really like I feel like conversation analysts 559 00:30:17,840 --> 00:30:20,239 Speaker 1: just sort of light up whenever there's an overlap that 560 00:30:20,240 --> 00:30:22,920 Speaker 1: they can witness. They get pretty turned on by that 561 00:30:23,000 --> 00:30:27,560 Speaker 1: kind of thing. One common form is just like just 562 00:30:27,560 --> 00:30:30,280 Speaker 1: a simple misunderstanding, like I didn't know that your turn 563 00:30:30,440 --> 00:30:33,760 Speaker 1: was over, I'm sorry. It's not the same thing as interruption. 564 00:30:35,000 --> 00:30:38,560 Speaker 1: Those are two different things. But interruption is like when 565 00:30:38,560 --> 00:30:40,680 Speaker 1: you stop in the middle of like stop somebody in 566 00:30:40,680 --> 00:30:42,360 Speaker 1: the middle of their sentence and talk over them.
And 567 00:30:42,400 --> 00:30:46,840 Speaker 1: overlap is just when someone stops talking and had something 568 00:30:46,840 --> 00:30:49,760 Speaker 1: else to say maybe, and you start on your own train. 569 00:30:49,920 --> 00:30:51,760 Speaker 2: Yeah, that's the thing I think I do the most 570 00:30:51,760 --> 00:30:53,520 Speaker 2: to you. I think you've done, and then I keep 571 00:30:53,560 --> 00:30:59,760 Speaker 2: talking or I start talking. That is, yeah, that's just 572 00:31:00,280 --> 00:31:04,680 Speaker 2: an interruption. Unintentional. There is such a thing as intentional interruption, 573 00:31:04,720 --> 00:31:08,880 Speaker 2: where somebody's trying to like gain control or dominate a conversation. Yeah, 574 00:31:08,920 --> 00:31:10,680 Speaker 2: aka total jerks. 575 00:31:11,200 --> 00:31:11,400 Speaker 1: Right. 576 00:31:11,520 --> 00:31:15,320 Speaker 2: There's also a different kind of interruption, which is a 577 00:31:15,360 --> 00:31:18,840 Speaker 2: cooperative interruption. Like when I say right to you while 578 00:31:18,880 --> 00:31:21,880 Speaker 2: you're telling a story, I'm actually interjecting it while you're 579 00:31:21,920 --> 00:31:25,520 Speaker 2: still using your turn. You're you're making a turn construction unit, 580 00:31:25,920 --> 00:31:30,880 Speaker 2: uh huh unit. But I'm helping you along at the 581 00:31:30,960 --> 00:31:34,360 Speaker 2: very least demonstrating I'm listening and participating in the conversation, 582 00:31:34,680 --> 00:31:36,040 Speaker 2: which makes it cooperative. 583 00:31:36,760 --> 00:31:40,440 Speaker 1: Yeah, and those are just fine.
You can interrupt people 584 00:31:40,560 --> 00:31:42,240 Speaker 1: all the time in the middle of their story and 585 00:31:42,280 --> 00:31:45,880 Speaker 1: even add to it if you're if you can, like 586 00:31:45,960 --> 00:31:49,520 Speaker 1: maybe you'll interrupt and say, like if somebody who's telling 587 00:31:49,560 --> 00:31:51,600 Speaker 1: a story about driving their car, you know, it's like, 588 00:31:51,920 --> 00:31:53,480 Speaker 1: you know, hey, what kind of car 589 00:31:53,520 --> 00:31:56,320 Speaker 1: were you driving? And they'll say, oh, a BMW And 590 00:31:56,360 --> 00:31:59,360 Speaker 1: then everyone's like yeah, and they may have left out 591 00:31:59,360 --> 00:32:02,400 Speaker 1: that detail. So that's all just sort of active participation 592 00:32:02,480 --> 00:32:04,040 Speaker 1: in the conversation precisely. 593 00:32:05,080 --> 00:32:05,760 Speaker 2: We're also 594 00:32:06,120 --> 00:32:08,040 Speaker 1: No shade toward BMW drivers. By the way, I'm not sure 595 00:32:08,040 --> 00:32:08,840 Speaker 1: why I said that car. 596 00:32:09,920 --> 00:32:13,320 Speaker 2: We're also so good at this whole thing that we 597 00:32:13,480 --> 00:32:17,760 Speaker 2: can interrupt while someone's telling a story without taking away 598 00:32:17,760 --> 00:32:21,280 Speaker 2: from the story. For example, if you're sitting there having 599 00:32:21,400 --> 00:32:23,880 Speaker 2: dinner with somebody and they're telling a story and you say, hey, 600 00:32:23,920 --> 00:32:28,200 Speaker 2: pass the potatoes, right, it doesn't actually like derail the conversation, 601 00:32:28,280 --> 00:32:31,360 Speaker 2: and the person's not offended. Yeah, you're just you're just 602 00:32:31,440 --> 00:32:34,320 Speaker 2: fitting that in there so you can eat the potatoes 603 00:32:34,360 --> 00:32:36,520 Speaker 2: and enjoy them while you're hearing the story too.
604 00:32:37,360 --> 00:32:40,040 Speaker 1: Yeah, and that's that can happen even I mean, dinner 605 00:32:40,080 --> 00:32:42,280 Speaker 1: party is such a good sort of experiment because it's 606 00:32:42,320 --> 00:32:44,440 Speaker 1: everyone seated around and looking at each other, and all 607 00:32:44,480 --> 00:32:48,200 Speaker 1: these conversations are happening that you can even do that 608 00:32:48,240 --> 00:32:51,040 Speaker 1: to someone else at the table during someone's story if 609 00:32:51,080 --> 00:32:53,480 Speaker 1: the potatoes are closer, but you might do it in 610 00:32:53,520 --> 00:32:55,600 Speaker 1: a hushed tone, like during the middle of their story, Hey, 611 00:32:55,600 --> 00:32:57,880 Speaker 1: can you pass the potatoes? Yeah, And that person may even 612 00:32:57,920 --> 00:33:00,680 Speaker 1: go they're so good, so like something like that. 613 00:33:00,760 --> 00:33:03,000 Speaker 2: What kind of potatoes did you imagine when I said 614 00:33:03,040 --> 00:33:04,959 Speaker 2: pass the potatoes up? 615 00:33:05,040 --> 00:33:05,360 Speaker 1: Mashed? 616 00:33:05,760 --> 00:33:09,680 Speaker 2: Did you? I, I, for some reason thought of steamed 617 00:33:09,840 --> 00:33:12,480 Speaker 2: or baked red potatoes, and then I was like, those 618 00:33:12,480 --> 00:33:15,240 Speaker 2: are no good, So I changed it to scalloped potatoes, 619 00:33:15,280 --> 00:33:16,320 Speaker 2: which are great. 620 00:33:16,600 --> 00:33:18,360 Speaker 1: Oh man, you overthink everything, don't you? 621 00:33:19,320 --> 00:33:22,120 Speaker 2: I totally do. And then I thought about Umi makes 622 00:33:22,200 --> 00:33:25,480 Speaker 2: really good scalloped potatoes. It just kind of kept going 623 00:33:25,520 --> 00:33:25,960 Speaker 2: from there. 624 00:33:26,280 --> 00:33:29,080 Speaker 1: I wonder what Chuck's talking about right now exactly? 625 00:33:29,600 --> 00:33:31,280 Speaker 2: No, I was listening too.
That is a bit of 626 00:33:31,320 --> 00:33:33,000 Speaker 2: a talent. I can listen and do that at the 627 00:33:33,040 --> 00:33:33,560 Speaker 2: same time. 628 00:33:34,200 --> 00:33:39,640 Speaker 1: Okay, what exactly? I think this is kind of really interesting. 629 00:33:39,640 --> 00:33:43,280 Speaker 1: There's a Georgetown University linguist named Deborah Tannen who did 630 00:33:43,280 --> 00:33:48,080 Speaker 1: a fun little experiment where she transcribed conversations between two Californians, 631 00:33:48,400 --> 00:33:52,040 Speaker 1: three New Yorkers, and a Londoner. And this should come 632 00:33:52,040 --> 00:33:55,680 Speaker 1: as no surprise. New Yorkers talked over everybody. And when 633 00:33:55,680 --> 00:33:58,080 Speaker 1: they did it with a fellow New Yorker, the other 634 00:33:58,160 --> 00:34:00,360 Speaker 1: New Yorker just kept talking, and they were sort of 635 00:34:00,400 --> 00:34:02,720 Speaker 1: talking over each other, and they were still enthusiastic and 636 00:34:02,800 --> 00:34:04,920 Speaker 1: having a good time. But when a New Yorker talked 637 00:34:04,920 --> 00:34:07,840 Speaker 1: over a Californian or a Londoner, they would stop talking. 638 00:34:08,600 --> 00:34:11,399 Speaker 1: Other people, you know, when they went back and looked 639 00:34:11,440 --> 00:34:14,640 Speaker 1: at it, viewed it as like these New Yorkers 640 00:34:14,680 --> 00:34:17,160 Speaker 1: are dominating the conversation. They just want to take over. 641 00:34:17,280 --> 00:34:20,279 Speaker 1: Anytime I said anything, the New Yorkers were just like, hey, 642 00:34:20,320 --> 00:34:23,799 Speaker 1: it's all good. This is what we do.
And they 643 00:34:23,880 --> 00:34:26,719 Speaker 1: found that as far as New Yorkers, the New 644 00:34:26,800 --> 00:34:31,560 Speaker 1: Yorkers also thought that, like no one joined in, like 645 00:34:31,600 --> 00:34:33,719 Speaker 1: when they stopped talking, they were like, well, I guess 646 00:34:33,760 --> 00:34:35,319 Speaker 1: they didn't want to talk or whatever, because they're not 647 00:34:35,320 --> 00:34:38,960 Speaker 1: interrupting me right exactly. But they did find that other 648 00:34:39,000 --> 00:34:41,200 Speaker 1: scholars have found that there are these New York-like 649 00:34:41,239 --> 00:34:46,680 Speaker 1: patterns in other cultures: Samoan, Japanese, and Italian American. And 650 00:34:46,760 --> 00:34:50,120 Speaker 1: so that's why every Italian American New York family, all 651 00:34:50,120 --> 00:34:52,160 Speaker 1: they do is just sit around and scream over each 652 00:34:52,200 --> 00:34:53,040 Speaker 1: other all the time. 653 00:34:53,239 --> 00:34:55,680 Speaker 2: Right, Hey, Japanese stood out to me and I'm like, 654 00:34:55,800 --> 00:34:58,000 Speaker 2: that doesn't sound right, And then I thought of have 655 00:34:58,040 --> 00:34:59,839 Speaker 2: you ever seen a Japanese, like, morning talk show? 656 00:35:00,080 --> 00:35:02,200 Speaker 1: Oh? Sure? Yeah. 657 00:35:02,239 --> 00:35:05,279 Speaker 2: So the, the, somebody like a guest or some other 658 00:35:05,360 --> 00:35:07,839 Speaker 2: anchor or something will be talking and then one of the 659 00:35:07,880 --> 00:35:11,279 Speaker 2: hosts will interject, usually a question before the person is 660 00:35:11,280 --> 00:35:14,319 Speaker 2: finished talking, and the person stops saying what they were 661 00:35:14,360 --> 00:35:19,800 Speaker 2: saying and answers that question and adjusts without being offended 662 00:35:19,840 --> 00:35:23,680 Speaker 2: at all.
So it actually happens quite a bit, and 663 00:35:23,800 --> 00:35:26,440 Speaker 2: usually toward the end of a sentence or a story 664 00:35:26,560 --> 00:35:28,480 Speaker 2: or something like that. But it does happen a lot. 665 00:35:28,840 --> 00:35:33,560 Speaker 2: Whereas with American English speakers, you are done speaking, then the 666 00:35:33,600 --> 00:35:37,520 Speaker 2: other person starts speaking, or else you have transgressed on that 667 00:35:37,560 --> 00:35:39,759 Speaker 2: person's turn, for sure. 668 00:35:39,960 --> 00:35:42,359 Speaker 1: Should we take a break? Yeah, all right, we'll take 669 00:35:42,360 --> 00:35:43,799 Speaker 1: a break and finish up right after this. 670 00:35:44,239 --> 00:35:46,360 Speaker 2: Stucks Who Stucks? 671 00:35:47,160 --> 00:35:47,920 Speaker 1: You know it? Stucks. 672 00:35:50,000 --> 00:35:50,719 Speaker 2: It's a great name. 673 00:35:50,800 --> 00:35:53,879 Speaker 1: Yeah, that's the name of it. 674 00:35:53,880 --> 00:35:54,719 Speaker 2: It's a great name. 675 00:35:55,000 --> 00:35:55,279 Speaker 1: All right. 676 00:35:55,480 --> 00:36:12,839 Speaker 2: Stucks, but with an X. So we talked about how 677 00:36:13,480 --> 00:36:16,000 Speaker 2: like all of these are scripts or templates like there's 678 00:36:16,160 --> 00:36:19,800 Speaker 2: you say something and there's a predictable response, and there's 679 00:36:19,880 --> 00:36:24,200 Speaker 2: actually you can boil it down to what are called families. 680 00:36:24,719 --> 00:36:27,840 Speaker 2: So like different types of conversations fall into different kinds 681 00:36:27,840 --> 00:36:32,319 Speaker 2: of families. The big ones that we've seen are reconstruction, moralizing, 682 00:36:32,360 --> 00:36:34,760 Speaker 2: and projection. Right. Yeah.
683 00:36:34,920 --> 00:36:40,239 Speaker 1: Reconstruction, you know, obviously reconstructing or remembering and sharing something 684 00:36:40,239 --> 00:36:44,400 Speaker 1: about an event with somebody. The projective is looking to 685 00:36:44,480 --> 00:36:47,600 Speaker 1: the future. Yeah, it could be very specific, like where 686 00:36:47,600 --> 00:36:50,799 Speaker 1: are we going to dinner, or just more like sitting around 687 00:36:50,840 --> 00:36:54,239 Speaker 1: and chatting out loud about, like, the real future. And 688 00:36:54,280 --> 00:36:57,400 Speaker 1: then what was moral? Communication about good and bad? 689 00:36:57,560 --> 00:37:02,200 Speaker 2: Yeah, it's more about tearing people down or complimenting people, 690 00:37:02,719 --> 00:37:06,720 Speaker 2: and we tend way more toward negativity, tearing 691 00:37:06,760 --> 00:37:12,240 Speaker 2: people down rather than building people up, and our tearing- 692 00:37:12,280 --> 00:37:16,080 Speaker 2: people-down types of genres are way more intricate and 693 00:37:16,120 --> 00:37:20,440 Speaker 2: sophisticated than our building-people-up or complimenting ones, because we 694 00:37:20,440 --> 00:37:22,759 Speaker 2: have a negativity bias as a species. 695 00:37:24,000 --> 00:37:24,759 Speaker 1: That's depressing. 696 00:37:25,640 --> 00:37:28,120 Speaker 2: So we'll outgrow it one day. Just give us 697 00:37:28,280 --> 00:37:30,319 Speaker 2: several tens of thousands of years.
698 00:37:31,360 --> 00:37:33,360 Speaker 1: Yeah. So at the beginning we kind of talked, 699 00:37:33,400 --> 00:37:35,160 Speaker 1: I think we gave an example, of how this might 700 00:37:35,200 --> 00:37:38,879 Speaker 1: be used, like on the job or something, and sort 701 00:37:38,880 --> 00:37:41,759 Speaker 1: of practical applications do involve that, for sure. Like you 702 00:37:41,840 --> 00:37:44,319 Speaker 1: might be hired by a company to come in and 703 00:37:44,360 --> 00:37:48,239 Speaker 1: consult when that company has kind of the 704 00:37:48,280 --> 00:37:50,759 Speaker 1: same kind of conversation over and over with people, like 705 00:37:50,800 --> 00:37:54,520 Speaker 1: if it's a surgical team or a call center. For sure, 706 00:37:54,640 --> 00:37:57,280 Speaker 1: whenever you hear "this call may be recorded for training 707 00:37:57,320 --> 00:38:00,640 Speaker 1: and evaluation purposes," that's probably what they're doing right there, 708 00:38:00,760 --> 00:38:03,520 Speaker 1: or maybe just judging their own employees and how they're 709 00:38:03,520 --> 00:38:04,520 Speaker 1: doing on the job. 710 00:38:04,440 --> 00:38:08,560 Speaker 2: Right, for sure. They've also found you can help people 711 00:38:08,760 --> 00:38:12,000 Speaker 2: get certain types of responses that you're looking for. Like 712 00:38:12,040 --> 00:38:16,520 Speaker 2: we talked about the emergency psychiatric hospital where they wanted 713 00:38:16,560 --> 00:38:18,799 Speaker 2: to get the person's name and sort of trick them into giving it. 714 00:38:19,760 --> 00:38:24,920 Speaker 2: There was a guy named John Heritage, unsurprisingly from UCLA, 715 00:38:25,200 --> 00:38:27,640 Speaker 2: who worked with doctors to figure out how they could 716 00:38:27,680 --> 00:38:32,239 Speaker 2: get patients to volunteer more problems that they needed help with.
717 00:38:32,520 --> 00:38:34,920 Speaker 2: And they found that doctors who say, is there anything 718 00:38:34,960 --> 00:38:39,640 Speaker 2: else that you need help with today? Apparently anything triggers 719 00:38:39,680 --> 00:38:43,040 Speaker 2: a predictable response, which is no. But if 720 00:38:43,040 --> 00:38:47,520 Speaker 2: you change anything to something, for some reason that 721 00:38:47,560 --> 00:38:52,200 Speaker 2: particular script or template opens up the possibility of sharing 722 00:38:52,239 --> 00:38:55,839 Speaker 2: more information, and you would just never figure that out. 723 00:38:55,840 --> 00:38:58,600 Speaker 2: And this is one of the sterling examples of how 724 00:38:58,680 --> 00:39:03,359 Speaker 2: conversation analysts can actually help things change for the better. 725 00:39:04,080 --> 00:39:05,960 Speaker 1: Yeah, that was well said. 726 00:39:05,760 --> 00:39:09,640 Speaker 2: Thank you. Way to go, John Heritage et al. 727 00:39:10,680 --> 00:39:13,680 Speaker 1: A woman named Elizabeth Stokoe of the London School of 728 00:39:13,719 --> 00:39:18,720 Speaker 1: Economics and Political Science found this when she studied conversations 729 00:39:18,760 --> 00:39:23,600 Speaker 1: in a mediation service. It seems like in 730 00:39:23,640 --> 00:39:25,120 Speaker 1: this case, people just kind of wanted to get down 731 00:39:25,160 --> 00:39:27,279 Speaker 1: to brass tacks on what they actually did. They didn't 732 00:39:27,280 --> 00:39:29,480 Speaker 1: want to hear things like, well, we don't take sides 733 00:39:29,520 --> 00:39:32,239 Speaker 1: here and we don't judge. They really wanted to hear 734 00:39:32,360 --> 00:39:34,560 Speaker 1: just sort of the step-by-step process of mediation 735 00:39:34,680 --> 00:39:35,440 Speaker 1: and how it worked.
736 00:39:35,840 --> 00:39:39,040 Speaker 2: Yes, she also works with companies that are trying to 737 00:39:39,040 --> 00:39:41,680 Speaker 2: install customer service bots and I was reading about that 738 00:39:41,719 --> 00:39:46,200 Speaker 2: it's not going very well yet. People hate customer service 739 00:39:46,280 --> 00:39:49,600 Speaker 2: bots and there's a there's agree. Yeah, I'm one of 740 00:39:49,600 --> 00:39:53,120 Speaker 2: them too. There's a question of okay, is the solution 741 00:39:53,320 --> 00:39:55,800 Speaker 2: making these bots way more human? Like should we insert 742 00:39:55,840 --> 00:40:00,520 Speaker 2: things like have a box or you know, like waffle 743 00:40:00,680 --> 00:40:03,120 Speaker 2: like a human does, and from what I saw, the 744 00:40:03,160 --> 00:40:07,200 Speaker 2: consensus is no, don't do that. Bots should be recognizable 745 00:40:07,400 --> 00:40:10,680 Speaker 2: and volunteer themselves as bots humans are humans. Keep the 746 00:40:10,719 --> 00:40:15,719 Speaker 2: two separate. And I don't know which direction it's going. 747 00:40:15,760 --> 00:40:17,360 Speaker 2: It does kind of seem like the whole thing's in 748 00:40:17,440 --> 00:40:21,880 Speaker 2: the quagmire currently. But I also did see that bots 749 00:40:21,960 --> 00:40:25,760 Speaker 2: are poised to start taking over the reins from human 750 00:40:25,840 --> 00:40:30,560 Speaker 2: conversation analysts and doing it themselves, and then training bots 751 00:40:30,600 --> 00:40:33,600 Speaker 2: how to be better at their job. So one bot 752 00:40:33,880 --> 00:40:37,439 Speaker 2: training another bot, right, that's from what I can tell 753 00:40:37,520 --> 00:40:40,160 Speaker 2: the future of conversation analysis. 754 00:40:40,480 --> 00:40:44,000 Speaker 1: Yeah. I mean with the movie Her that now is 755 00:40:44,000 --> 00:40:47,800 Speaker 1: like kind of freakily ahead of its time with Scarlett Johansson. 
756 00:40:49,040 --> 00:40:53,040 Speaker 1: I think in that situation, they definitely wanted to make 757 00:40:53,120 --> 00:40:56,600 Speaker 1: her way more human and do things like stumble over words 758 00:40:56,600 --> 00:40:59,680 Speaker 1: and make mistakes. But if it's something like a customer 759 00:40:59,719 --> 00:41:03,359 Speaker 1: service bot, right, you don't want that. I don't want 760 00:41:03,400 --> 00:41:05,319 Speaker 1: it at all. But I definitely don't want one that's 761 00:41:05,400 --> 00:41:07,480 Speaker 1: like I just give Starn't. 762 00:41:07,160 --> 00:41:13,480 Speaker 2: I cute exactly? Lol. So let's get down to it, though, Chuck. 763 00:41:13,520 --> 00:41:16,560 Speaker 2: Here's the real reason we started talking about this. Do 764 00:41:16,760 --> 00:41:19,800 Speaker 2: men interrupt women as much as people think? 765 00:41:20,719 --> 00:41:22,760 Speaker 1: Well, I mean, this has been something that they've studied 766 00:41:22,760 --> 00:41:25,520 Speaker 1: a lot since the seventies, about, you know, the role 767 00:41:25,560 --> 00:41:29,880 Speaker 1: that gender plays, and it's been mixed results. There have been 768 00:41:29,880 --> 00:41:33,520 Speaker 1: studies that found that men interrupt women much more. There's 769 00:41:33,560 --> 00:41:37,359 Speaker 1: some that found there's not much of a difference. There 770 00:41:37,360 --> 00:41:39,960 Speaker 1: was a meta-analysis from ninety-eight that found that 771 00:41:40,040 --> 00:41:44,600 Speaker 1: the gender divide becomes more clear-cut when looking specifically 772 00:41:44,600 --> 00:41:49,360 Speaker 1: at intrusive interruptions as opposed to cooperative interruptions.
And that's kind of 773 00:41:49,400 --> 00:41:52,439 Speaker 1: what I took away, is that it seems like when 774 00:41:52,480 --> 00:41:58,160 Speaker 1: men are interrupting, it is definitely more intrusive, maybe mansplaining, sure, 775 00:41:58,200 --> 00:42:01,840 Speaker 1: I don't know, yeah, and women interrupt maybe just as often, 776 00:42:01,920 --> 00:42:05,040 Speaker 1: but it's much more of the cooperative type, right. 777 00:42:05,200 --> 00:42:08,880 Speaker 2: And they chalk this up to different kinds of upbringings, 778 00:42:08,920 --> 00:42:15,920 Speaker 2: where girls who become women are raised to essentially socialize 779 00:42:15,960 --> 00:42:19,880 Speaker 2: through communication, through conversation, so they become masters at it, 780 00:42:19,920 --> 00:42:23,920 Speaker 2: but they also develop expectations that men don't necessarily fulfill, 781 00:42:24,440 --> 00:42:28,840 Speaker 2: like cooperative interruptions, like, oh, that's right, you don't say. 782 00:42:29,520 --> 00:42:31,560 Speaker 2: If a man doesn't do that, the woman might feel 783 00:42:31,640 --> 00:42:35,759 Speaker 2: like she's not being listened to. And conversely, boys are 784 00:42:35,840 --> 00:42:41,080 Speaker 2: raised in a hierarchical manner where they might eventually come 785 00:42:41,160 --> 00:42:45,280 Speaker 2: to see listening as a form of submission, where instead 786 00:42:45,280 --> 00:42:47,680 Speaker 2: they're trying to dominate. They want to be the alpha male, 787 00:42:48,000 --> 00:42:51,799 Speaker 2: they want their puffy vest to be the coolest at 788 00:42:51,800 --> 00:42:54,799 Speaker 2: their kid's football game, and so not only are they 789 00:42:54,840 --> 00:42:59,280 Speaker 2: not going to cooperatively interrupt, they're not even gonna listen, 790 00:42:59,360 --> 00:43:04,879 Speaker 2: and they may interrupt competitively too.
So there's a lot 791 00:43:04,920 --> 00:43:09,040 Speaker 2: of at least anecdotal data to back that up for sure. 792 00:43:09,880 --> 00:43:12,359 Speaker 1: Yeah, and I think they also found that men tend to 793 00:43:12,640 --> 00:43:15,880 Speaker 1: interrupt more in groups than one on one, and 794 00:43:15,920 --> 00:43:19,040 Speaker 1: that definitely seems to fall in line with, like, you know, 795 00:43:19,120 --> 00:43:22,960 Speaker 1: trying to establish the power position, like if you're 796 00:43:23,000 --> 00:43:26,240 Speaker 1: working together in a group. They did also find another 797 00:43:26,280 --> 00:43:29,799 Speaker 1: interesting correlation, where in studies where the first author of 798 00:43:29,840 --> 00:43:32,920 Speaker 1: the study was a woman, they found bigger differences, and 799 00:43:33,640 --> 00:43:37,000 Speaker 1: that just could be that the male and female researchers 800 00:43:37,000 --> 00:43:41,200 Speaker 1: are coding the interruptions in a different way. Pretty interesting. 801 00:43:41,440 --> 00:43:45,160 Speaker 1: I thought so too. What about generationally? Well. 802 00:43:45,400 --> 00:43:47,880 Speaker 2: So apparently gen Z is just throwing a huge wrench 803 00:43:47,920 --> 00:43:51,080 Speaker 2: in the works. Remember we talked about how at the hospital, 804 00:43:51,080 --> 00:43:54,759 Speaker 2: the emergency hospital, when somebody called and they said, my 805 00:43:54,920 --> 00:43:57,920 Speaker 2: name is so-and-so, may I help you? Right, the other person 806 00:43:57,920 --> 00:44:00,920 Speaker 2: felt obligated to give their name. That's not true now like 807 00:44:00,960 --> 00:44:03,080 Speaker 2: when we were growing up; you would not feel like 808 00:44:03,120 --> 00:44:05,200 Speaker 2: you had to say, oh, well, I'm Josh Clark, and 809 00:44:05,239 --> 00:44:07,879 Speaker 2: here's what I need from you. You'd just say, hey, 810 00:44:08,320 --> 00:44:12,040 Speaker 2: I need this or whatever.
That's an example of a 811 00:44:12,120 --> 00:44:15,920 Speaker 2: generational change that took place. Now it's even more pronounced. 812 00:44:15,960 --> 00:44:18,600 Speaker 2: Apparently with gen Z there's something called the gen Z 813 00:44:18,800 --> 00:44:22,879 Speaker 2: stare, where they're essentially pulling a Josh, where you can 814 00:44:22,880 --> 00:44:25,319 Speaker 2: tell them a story and they just stare back at 815 00:44:25,320 --> 00:44:27,480 Speaker 2: you blankly at the end when it's their turn. And 816 00:44:27,560 --> 00:44:29,440 Speaker 2: apparently it's fairly disconcerting. 817 00:44:30,120 --> 00:44:32,239 Speaker 1: Yeah, I've heard about it, and then I was like, 818 00:44:32,280 --> 00:44:34,080 Speaker 1: what is that? Then I read up about it and 819 00:44:34,640 --> 00:44:37,759 Speaker 1: it is very disconcerting, as is the phone call thing, 820 00:44:37,760 --> 00:44:40,000 Speaker 1: which I haven't experienced because people don't call each other 821 00:44:40,080 --> 00:44:44,440 Speaker 1: much anymore. Yeah, but apparently gen Z, when they answer 822 00:44:44,480 --> 00:44:47,719 Speaker 1: a phone, they don't go hello. They expect the other 823 00:44:47,760 --> 00:44:50,000 Speaker 1: person to talk first. So apparently there's a gen Z 824 00:44:50,080 --> 00:44:52,000 Speaker 1: thing where they just answer the phone like this. 825 00:44:53,920 --> 00:44:59,879 Speaker 2: Yeah, and then the other person goes, hello? Or, do you need help? 826 00:45:00,400 --> 00:45:01,319 Speaker 2: That's what I would say.
827 00:45:01,960 --> 00:45:05,440 Speaker 1: Yeah. And I mean, I guess I've definitely witnessed the 828 00:45:05,520 --> 00:45:08,839 Speaker 1: gen Z stare with our friends' kids here and there, 829 00:45:09,600 --> 00:45:12,120 Speaker 1: to where you're just like, boy, I 830 00:45:12,200 --> 00:45:14,640 Speaker 1: must be the least interesting human on earth, because they're 831 00:45:14,680 --> 00:45:15,880 Speaker 1: just blankly looking at me. 832 00:45:16,080 --> 00:45:18,680 Speaker 2: Yeah. Yeah. Or you can look at it the other 833 00:45:18,719 --> 00:45:21,279 Speaker 2: way and be like, yeah, good talking to you. I'll 834 00:45:21,280 --> 00:45:21,920 Speaker 2: see you later. 835 00:45:22,680 --> 00:45:27,000 Speaker 1: Yeah. But I've also found, especially when you're around teenagers, 836 00:45:27,080 --> 00:45:30,680 Speaker 1: like your friends that have teenage kids, like, just don't 837 00:45:30,680 --> 00:45:33,000 Speaker 1: even, you know, maybe say something nice and hello, but 838 00:45:33,480 --> 00:45:35,920 Speaker 1: don't try to strike up a conversation. They don't want 839 00:45:35,960 --> 00:45:38,319 Speaker 1: to talk to you. No, just don't, I think, just 840 00:45:38,360 --> 00:45:38,799 Speaker 1: move along. 841 00:45:38,920 --> 00:45:41,400 Speaker 2: That's been true since Tuck Tuck was a teenager, you 842 00:45:41,440 --> 00:45:42,279 Speaker 2: know what I mean. 843 00:45:42,880 --> 00:45:45,640 Speaker 1: You know what I usually do? I'll go like, oh, hey, 844 00:45:45,640 --> 00:45:48,359 Speaker 1: how's it going? How's school going this year? Oh, good, good, 845 00:45:49,080 --> 00:45:51,080 Speaker 1: glad to hear it. And I'll just walk away, like, 846 00:45:51,560 --> 00:45:53,400 Speaker 1: a nice thing to say, and then just end it.
847 00:45:53,440 --> 00:45:55,760 Speaker 2: You don't follow up with, like, are you really anxious 848 00:45:55,760 --> 00:45:57,840 Speaker 2: when you wake up in the morning before school? 849 00:45:59,480 --> 00:46:02,640 Speaker 1: No, no, no. No one wants to hear from an adult 850 00:46:02,800 --> 00:46:04,480 Speaker 1: if you're at a certain age. 851 00:46:04,600 --> 00:46:07,000 Speaker 2: Okay, I'm going to have to rethink my approach, then. 852 00:46:07,640 --> 00:46:10,239 Speaker 1: Do you remember when you were a kid? Like, I 853 00:46:10,280 --> 00:46:12,520 Speaker 1: don't remember even having conversations with adults. 854 00:46:12,760 --> 00:46:15,920 Speaker 2: Oh yeah, no, absolutely not. You know I'm totally kidding 855 00:46:16,000 --> 00:46:19,239 Speaker 2: in everything I'm saying, right? Oh yeah, yeah, no, 856 00:46:19,280 --> 00:46:21,640 Speaker 2: I remember that it was very intimidating to talk to 857 00:46:21,680 --> 00:46:24,600 Speaker 2: an adult, let alone having very little in common. 858 00:46:25,000 --> 00:46:26,399 Speaker 1: Yeah, and they didn't want to talk to us. 859 00:46:26,760 --> 00:46:28,280 Speaker 2: No, no, for sure. 860 00:46:28,800 --> 00:46:31,480 Speaker 1: Gen X was famously ignored by most adults. 861 00:46:31,600 --> 00:46:34,560 Speaker 2: Yeah, very famous. Just ask Douglas Coupland. 862 00:46:35,640 --> 00:46:36,080 Speaker 1: Who's that? 863 00:46:36,239 --> 00:46:37,600 Speaker 2: He's the guy who wrote Generation X. 864 00:46:38,320 --> 00:46:40,160 Speaker 1: Oh, okay, was that a book? Famous book? 865 00:46:40,239 --> 00:46:42,600 Speaker 2: Yes. I believe he's the one who coined the term. 866 00:46:43,200 --> 00:46:45,120 Speaker 1: Oh, you should read it. 867 00:46:45,120 --> 00:46:48,080 Speaker 2: It's good. It's a quick read. It's not like an 868 00:46:48,200 --> 00:46:50,319 Speaker 2: essay or anything like that.
It's a story about three 869 00:46:50,960 --> 00:46:53,920 Speaker 2: gen Xers just going through life over, I think, 870 00:46:54,040 --> 00:46:55,320 Speaker 2: just the course of a few days. 871 00:46:56,239 --> 00:46:58,239 Speaker 1: I've been a reading fool lately. I'll put it on 872 00:46:58,239 --> 00:46:58,560 Speaker 1: the list. 873 00:46:58,719 --> 00:47:01,960 Speaker 2: Nice. I just started one, and I'll bet I regret 874 00:47:02,120 --> 00:47:04,080 Speaker 2: ever announcing it publicly. 875 00:47:03,680 --> 00:47:06,200 Speaker 1: Because that's David Foster Wallace. 876 00:47:06,280 --> 00:47:08,640 Speaker 2: Yes, and I love that guy. But this is a 877 00:47:08,680 --> 00:47:09,719 Speaker 2: slog already. 878 00:47:10,640 --> 00:47:13,279 Speaker 1: Yeah, I'm finishing. I just finished the bonobo book that I 879 00:47:13,280 --> 00:47:16,640 Speaker 1: had put down like a year ago, and I am 880 00:47:16,719 --> 00:47:20,800 Speaker 1: almost done with the Don Felder book about the Eagles. Good lord. 881 00:47:21,600 --> 00:47:23,239 Speaker 1: You might be asking, why would you read that? It's 882 00:47:23,480 --> 00:47:26,800 Speaker 1: specifically because I used to love the Eagles, and 883 00:47:28,360 --> 00:47:32,240 Speaker 1: apparently the book was just really bitchy. 884 00:47:32,880 --> 00:47:34,239 Speaker 2: Oh, okay, yeah, I could see. 885 00:47:34,239 --> 00:47:35,560 Speaker 1: I was like, ooh, I want to read this, because 886 00:47:35,560 --> 00:47:37,520 Speaker 1: he's like, he hates those guys, so let me read this. 887 00:47:37,760 --> 00:47:40,840 Speaker 2: Did Matthew Modine recommend you read it in his diary 888 00:47:40,920 --> 00:47:42,440 Speaker 2: about Full Metal Jacket? 889 00:47:43,080 --> 00:47:45,759 Speaker 1: No, that's my bathroom book. So I've just been slow 890 00:47:45,840 --> 00:47:46,319 Speaker 1: rolling that one. 891 00:47:46,400 --> 00:47:49,520 Speaker 2: Gotcha.
I don't think you should use words like slow 892 00:47:49,600 --> 00:47:51,840 Speaker 2: rolling when you talk about being in the bathroom. 893 00:47:52,440 --> 00:47:52,960 Speaker 1: Good point. 894 00:47:53,680 --> 00:47:57,040 Speaker 2: Well, I think we just brought about listener mail, whether 895 00:47:57,040 --> 00:48:00,000 Speaker 2: we like it or not, don't you think? 896 00:48:00,440 --> 00:48:03,040 Speaker 1: That's right. This is not really a correction, just sort 897 00:48:03,040 --> 00:48:05,960 Speaker 1: of maybe a gentle reminder about our history of orthodonture. 898 00:48:06,840 --> 00:48:08,360 Speaker 1: I feel like we might have focused a little too 899 00:48:08,440 --> 00:48:11,840 Speaker 1: much on appearance. This is from Erin. Hey, guys, appreciate the 900 00:48:11,840 --> 00:48:13,920 Speaker 1: depth of curiosity you bring to each topic, and I 901 00:48:13,960 --> 00:48:16,759 Speaker 1: wanted to offer an update regarding orthodontia: it's not just 902 00:48:16,760 --> 00:48:20,000 Speaker 1: about appearance anymore. The field has evolved significantly, and current 903 00:48:20,000 --> 00:48:23,360 Speaker 1: research shows a strong connection between jaw and bite alignment 904 00:48:23,400 --> 00:48:28,279 Speaker 1: and conditions like sleep apnea, ADHD, and TMJ dysfunction, which 905 00:48:28,320 --> 00:48:30,680 Speaker 1: is of course totally true. While some of these links 906 00:48:30,719 --> 00:48:33,800 Speaker 1: were suspected years ago, orthodontic treatment today is increasingly focused 907 00:48:33,840 --> 00:48:37,480 Speaker 1: on preventing or mitigating these issues in youth before 908 00:48:37,480 --> 00:48:40,480 Speaker 1: they become chronic. On a personal note, my journey with 909 00:48:40,560 --> 00:48:44,080 Speaker 1: TMJ dysfunction led me down the path of exploring treatment options, 910 00:48:44,080 --> 00:48:47,399 Speaker 1: and after years of discomfort, I found relief through Invisalign.
911 00:48:47,880 --> 00:48:50,359 Speaker 1: It not only helped with my smile, it realigned my bite, 912 00:48:50,440 --> 00:48:54,120 Speaker 1: significantly reduced my TMJ symptoms, and opened my eyes 913 00:48:54,120 --> 00:48:57,759 Speaker 1: to the broader health benefits of orthodontic care. And that 914 00:48:57,880 --> 00:49:00,840 Speaker 1: is warm regards from Aaron. Very nice email, Aaron, and 915 00:49:00,960 --> 00:49:03,080 Speaker 1: visit right. 916 00:49:03,200 --> 00:49:05,839 Speaker 2: Yeah, exactly. Thanks a lot, Erin. I'm really glad, Erin 917 00:49:05,880 --> 00:49:08,239 Speaker 2: with an E, or a double A, or one that's 918 00:49:08,280 --> 00:49:11,760 Speaker 2: E R I N. Thanks a lot, Aaron. I'm glad 919 00:49:11,760 --> 00:49:13,720 Speaker 2: that you were able to take care of your TMJ. 920 00:49:13,880 --> 00:49:16,480 Speaker 2: I can't imagine that that's a fun chronic condition. You know, 921 00:49:18,239 --> 00:49:20,640 Speaker 2: if you got rid of a condition that you're happy about, 922 00:49:20,680 --> 00:49:23,240 Speaker 2: we want to hear about that, or for whatever reason 923 00:49:23,440 --> 00:49:25,120 Speaker 2: you want to write in, you can send us an 924 00:49:25,160 --> 00:49:31,880 Speaker 2: email. Send it off to Stuff Podcast at iHeartRadio dot com. 925 00:49:32,000 --> 00:49:34,880 Speaker 1: Stuff You Should Know is a production of iHeartRadio. For 926 00:49:34,960 --> 00:49:39,160 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 927 00:49:39,239 --> 00:49:41,080 Speaker 1: or wherever you listen to your favorite shows.