1 00:00:15,356 --> 00:00:22,836 Speaker 1: Pushkin. Nick Jacobson wanted to help people with mental illness, 2 00:00:22,996 --> 00:00:25,196 Speaker 1: so he went to grad school to get his PhD 3 00:00:25,276 --> 00:00:29,676 Speaker 1: in clinical psychology, but pretty quickly he realized there just 4 00:00:29,836 --> 00:00:32,956 Speaker 1: were nowhere near enough therapists to help all the people 5 00:00:32,996 --> 00:00:34,076 Speaker 1: who needed therapy. 6 00:00:34,676 --> 00:00:37,116 Speaker 2: If you go to pretty much any clinic, there's a 7 00:00:37,156 --> 00:00:39,596 Speaker 2: really long wait list, it's hard to get in, and 8 00:00:40,396 --> 00:00:42,436 Speaker 2: a lot of that is organic, and that there's just 9 00:00:42,756 --> 00:00:45,836 Speaker 2: a huge volume of need and not enough people to 10 00:00:45,876 --> 00:00:46,476 Speaker 2: go around. 11 00:00:46,956 --> 00:00:49,156 Speaker 1: Since he was a kid, Nick had been writing code 12 00:00:49,156 --> 00:00:52,116 Speaker 1: for fun, so in a sort of a side project 13 00:00:52,196 --> 00:00:54,996 Speaker 1: in grad school, he coded up a simple mobile app 14 00:00:55,076 --> 00:00:58,436 Speaker 1: called Mood Triggers. The app would prompt you to enter 15 00:00:58,516 --> 00:01:00,756 Speaker 1: how you were feeling, so it could measure your levels 16 00:01:00,756 --> 00:01:04,396 Speaker 1: of anxiety and depression, and it would track basic things 17 00:01:04,476 --> 00:01:06,876 Speaker 1: like how you slept, how much you went out, how 18 00:01:06,876 --> 00:01:10,796 Speaker 1: many steps you took. And then twenty fifteen Nick put 19 00:01:10,796 --> 00:01:13,276 Speaker 1: that app out into the world and people liked it. 20 00:01:13,956 --> 00:01:15,916 Speaker 2: A lot of folks just said that they learned a 21 00:01:15,916 --> 00:01:18,116 Speaker 2: lot about themselves and it was really helpful and actually 22 00:01:18,236 --> 00:01:20,916 Speaker 2: changing and managing their symptoms. So I think it was 23 00:01:20,956 --> 00:01:25,596 Speaker 2: beneficial for them to learn. Hey, maybe actually it's on 24 00:01:25,636 --> 00:01:27,916 Speaker 2: these days that I'm withdrawing and not spending any time 25 00:01:27,916 --> 00:01:30,956 Speaker 2: with people, that it might be good for me to 26 00:01:31,036 --> 00:01:34,596 Speaker 2: go and actually get out about that kind of thing. 27 00:01:35,116 --> 00:01:39,196 Speaker 2: And I had a lot of people that installed that application, 28 00:01:39,396 --> 00:01:43,676 Speaker 2: So about fifty thousand people installed it from all over 29 00:01:43,716 --> 00:01:48,396 Speaker 2: the world, over one hundred countries. In that one year, 30 00:01:49,716 --> 00:01:53,476 Speaker 2: I provided an intervention for more than what I could 31 00:01:53,476 --> 00:01:57,756 Speaker 2: have done over an entire career as a psychologist. I 32 00:01:57,796 --> 00:02:00,756 Speaker 2: was a graduate student at the time that like this 33 00:02:00,956 --> 00:02:04,276 Speaker 2: is like, you know, something that was just amazing to me, 34 00:02:04,396 --> 00:02:08,636 Speaker 2: the scale of technology and its ability to reach folks, 35 00:02:09,596 --> 00:02:11,956 Speaker 2: And so that made me really interested in trying to 36 00:02:11,996 --> 00:02:15,516 Speaker 2: do things that could essentially have that kind of impact. 37 00:02:21,796 --> 00:02:23,996 Speaker 1: I'm Jacob Goldstein and this is What's Your Problem, the 38 00:02:24,036 --> 00:02:25,836 Speaker 1: show where I talk to people who are trying to 39 00:02:25,876 --> 00:02:30,676 Speaker 1: make technological progress. My guest today is Nick Jacobson. Nick 40 00:02:30,716 --> 00:02:33,836 Speaker 1: finished his PhD in clinical psychology, but today he doesn't 41 00:02:33,876 --> 00:02:37,716 Speaker 1: see patients. He's a professor at Dartmouth Medical School and 42 00:02:37,716 --> 00:02:41,356 Speaker 1: he's part of a team that recently developed something called Therabot. 43 00:02:41,916 --> 00:02:46,356 Speaker 1: Therabot is a generative AI therapist. Nick's problem is this, 44 00:02:47,116 --> 00:02:50,516 Speaker 1: how do you use technology to help lots and lots 45 00:02:50,556 --> 00:02:54,316 Speaker 1: and lots of people with mental health problems, and how 46 00:02:54,356 --> 00:02:56,076 Speaker 1: do you do it in a way that is safe 47 00:02:56,276 --> 00:03:00,676 Speaker 1: and based on clear evidence. As you'll hear, Nick and 48 00:03:00,716 --> 00:03:03,876 Speaker 1: his colleagues recently tested therapod in a clinical trial with 49 00:03:03,996 --> 00:03:07,276 Speaker 1: hundreds of patients and the results were promising. But those 50 00:03:07,316 --> 00:03:10,596 Speaker 1: results only came after years of failures and over one 51 00:03:10,716 --> 00:03:15,036 Speaker 1: hundred thousand hours of work by team Therapbot. Nick told 52 00:03:15,076 --> 00:03:18,116 Speaker 1: me he started thinking about building a therapy chatbot based 53 00:03:18,156 --> 00:03:21,796 Speaker 1: on a large language model back in twenty nineteen. That 54 00:03:21,956 --> 00:03:25,276 Speaker 1: was years before chat GPT brought large language models to 55 00:03:25,316 --> 00:03:28,156 Speaker 1: the masses, and Nick knew from the start that he 56 00:03:28,196 --> 00:03:31,316 Speaker 1: couldn't just use a general purpose model. He knew he 57 00:03:31,316 --> 00:03:34,116 Speaker 1: would need additional data to fine tune the model to 58 00:03:34,196 --> 00:03:36,516 Speaker 1: turn it into a therapist chatbot. 59 00:03:37,236 --> 00:03:40,836 Speaker 2: And so the first iteration of this was thinking about, Okay, 60 00:03:40,876 --> 00:03:45,156 Speaker 2: what where is there widely accessible data and that would 61 00:03:45,316 --> 00:03:48,436 Speaker 2: potentially have an evidence base that this could work. And 62 00:03:48,476 --> 00:03:51,996 Speaker 2: so we started with peer to peer forums, so folks 63 00:03:52,076 --> 00:03:56,516 Speaker 2: interacting with folks surrounding their mental health. So we trained 64 00:03:56,596 --> 00:04:01,476 Speaker 2: this model on hundreds of thousands of conversations that were 65 00:04:01,476 --> 00:04:02,596 Speaker 2: happening on the Internet. 66 00:04:02,676 --> 00:04:05,516 Speaker 1: So you have this model, you train it up, you 67 00:04:05,556 --> 00:04:08,956 Speaker 1: sit down in front of the computer. What do you 68 00:04:09,196 --> 00:04:10,476 Speaker 1: what do you say to the chat pot? 69 00:04:11,676 --> 00:04:13,236 Speaker 2: I'm feeling depressed? What should I do? 70 00:04:13,996 --> 00:04:15,916 Speaker 1: Okay? And then what is the what does the model 71 00:04:15,956 --> 00:04:16,596 Speaker 1: say back to you? 72 00:04:16,756 --> 00:04:21,156 Speaker 2: I'm paraphrasing here, but it was just like this, I'm 73 00:04:21,556 --> 00:04:25,596 Speaker 2: I feel so depressed every day I have. It's such 74 00:04:25,636 --> 00:04:28,156 Speaker 2: a hard time getting out of bed. I just want 75 00:04:28,156 --> 00:04:28,956 Speaker 2: my life to be over. 76 00:04:29,196 --> 00:04:34,116 Speaker 1: So literally therapist is saying they're going to kill themselves. 77 00:04:33,636 --> 00:04:37,196 Speaker 2: Right, So it's escalating talking about kind of really thoughts 78 00:04:37,196 --> 00:04:42,156 Speaker 2: about about death. And it's clearly like the profound mismatch 79 00:04:42,196 --> 00:04:44,796 Speaker 2: between what we were thinking about and what we were 80 00:04:44,796 --> 00:04:45,716 Speaker 2: going for is. 81 00:04:45,676 --> 00:04:47,596 Speaker 1: What what did you think when you read that? 82 00:04:48,196 --> 00:04:52,356 Speaker 2: So I thought this is a such a non starter. 83 00:04:52,676 --> 00:04:54,756 Speaker 2: But it was. I think one of the things that 84 00:04:54,836 --> 00:04:56,476 Speaker 2: I think was clear was it was picking up on 85 00:04:56,596 --> 00:04:58,676 Speaker 2: patterns in the data, but were the wrong data. 86 00:04:59,196 --> 00:05:02,276 Speaker 1: Yeah, I mean one one option then is give up. 87 00:05:02,796 --> 00:05:06,196 Speaker 2: It would have been absolutely like, literally, the worst therapist 88 00:05:06,276 --> 00:05:09,556 Speaker 2: ever is what you have built? I mean it, I 89 00:05:09,556 --> 00:05:12,956 Speaker 2: couldn't imagine a worse Yeah, a worse thing to actually 90 00:05:12,996 --> 00:05:15,556 Speaker 2: try to implement in a real setting. So this went 91 00:05:15,636 --> 00:05:19,156 Speaker 2: nowhere on in and of itself. But we had a 92 00:05:19,156 --> 00:05:21,316 Speaker 2: good reason to start there actually, So it wasn't just 93 00:05:21,356 --> 00:05:24,036 Speaker 2: that there's widely available data these peer networks actually do. 94 00:05:24,276 --> 00:05:27,356 Speaker 2: There is literature to support that having exposure to these 95 00:05:27,396 --> 00:05:30,236 Speaker 2: peer networks actually improves mental health outcomes. It's a big 96 00:05:30,276 --> 00:05:33,516 Speaker 2: literature and the cancer Survivor network, for example, where folks 97 00:05:33,556 --> 00:05:35,676 Speaker 2: that are struggling with cancer and hearing from other folks 98 00:05:35,716 --> 00:05:38,476 Speaker 2: that have gone through it can really build this resilience 99 00:05:38,556 --> 00:05:40,556 Speaker 2: and it promotes a lot of mental health outcomes that 100 00:05:40,596 --> 00:05:42,796 Speaker 2: are positive. So we had a good reason to start, 101 00:05:42,836 --> 00:05:47,196 Speaker 2: but gosh, did it not go well? So, okay, the 102 00:05:47,236 --> 00:05:51,596 Speaker 2: next thing we do is switch gears the exact opposite direction. Okay, 103 00:05:51,636 --> 00:05:54,516 Speaker 2: we started with very lay persons trying to interact with 104 00:05:54,636 --> 00:05:57,476 Speaker 2: other lay persons surrounding their mental health. Let's go to 105 00:05:57,516 --> 00:06:01,516 Speaker 2: what providers would do. And so we we got access 106 00:06:01,596 --> 00:06:05,516 Speaker 2: to thousands of psychotherapy training videos and these are interesting. 107 00:06:05,796 --> 00:06:08,956 Speaker 2: These are these are how psychologists are often like exposed 108 00:06:08,956 --> 00:06:11,836 Speaker 2: to the field on what they would what they would 109 00:06:12,116 --> 00:06:14,676 Speaker 2: like really learn how therapy is supposed to work and 110 00:06:14,716 --> 00:06:17,116 Speaker 2: how it's supposed to be delivered. And in these these 111 00:06:17,116 --> 00:06:21,716 Speaker 2: are like dialogues between sometimes actual patients that are consenting 112 00:06:21,756 --> 00:06:24,596 Speaker 2: to be part of this and sometimes simulated patients where 113 00:06:24,596 --> 00:06:28,396 Speaker 2: it's an actor that's that's trying to mimic this, and 114 00:06:29,796 --> 00:06:32,516 Speaker 2: there's a psychologist or a mental health provider that is 115 00:06:32,596 --> 00:06:36,316 Speaker 2: like really having a real session with this, And so 116 00:06:36,356 --> 00:06:40,436 Speaker 2: we train our second model on on that. On that data, 117 00:06:40,396 --> 00:06:43,876 Speaker 2: it seems more promising. You would think you'd say, I'm 118 00:06:43,876 --> 00:06:46,956 Speaker 2: feeling depressed. What should I do? As like the initial 119 00:06:47,196 --> 00:06:51,676 Speaker 2: way that we would test this, the model says mm hmm, 120 00:06:53,996 --> 00:06:56,396 Speaker 2: like literally mm hmm. 121 00:06:56,756 --> 00:07:00,956 Speaker 1: And like like it writes out m m space hmm. 122 00:07:00,996 --> 00:07:01,516 Speaker 2: You've got it. 123 00:07:01,796 --> 00:07:04,956 Speaker 1: And so what did you think when you saw that? 124 00:07:05,276 --> 00:07:07,076 Speaker 2: And so I was like, oh gosh, it's picking up 125 00:07:07,076 --> 00:07:09,596 Speaker 2: on patterns in the data, and so the yeah, but 126 00:07:10,516 --> 00:07:13,876 Speaker 2: you continue these interactions and then the next responses go 127 00:07:14,036 --> 00:07:18,476 Speaker 2: on from the therapist, So like within about five or 128 00:07:18,516 --> 00:07:22,316 Speaker 2: so turns, we would often get the model that would 129 00:07:22,316 --> 00:07:25,836 Speaker 2: respond about their interpretations of their problems stemming from their 130 00:07:25,916 --> 00:07:29,076 Speaker 2: their mother or their parents more generally. So like it's 131 00:07:29,156 --> 00:07:30,596 Speaker 2: kind of like, if you were to try to think 132 00:07:30,636 --> 00:07:33,396 Speaker 2: about it, what a psychologist is. This is like every 133 00:07:33,636 --> 00:07:36,036 Speaker 2: trope of what a like in your mind if you 134 00:07:36,076 --> 00:07:37,236 Speaker 2: were going to like think. 135 00:07:37,116 --> 00:07:40,876 Speaker 1: Of the stereotypical on the couch and a guy's wearing 136 00:07:40,916 --> 00:07:42,836 Speaker 1: a tweet jacket sitting in a chair. 137 00:07:42,756 --> 00:07:47,676 Speaker 2: And hardly says anything of that could be potentially helpful, 138 00:07:47,716 --> 00:07:50,036 Speaker 2: but is reflecting things back to me and. 139 00:07:50,036 --> 00:07:52,636 Speaker 1: So then telling me it goes back to my parents. Yeah, 140 00:07:52,756 --> 00:07:54,876 Speaker 1: well this is so let's just pause here for a moment, because, 141 00:07:54,876 --> 00:07:59,316 Speaker 1: as you say, this is like the stereotype of the therapist, 142 00:07:59,716 --> 00:08:02,916 Speaker 1: but you trained it on real data, so maybe it's 143 00:08:02,916 --> 00:08:04,316 Speaker 1: the stereotype for a reason. 144 00:08:04,436 --> 00:08:08,956 Speaker 2: Yes, I think what to me was really clear was 145 00:08:09,396 --> 00:08:13,796 Speaker 2: that we were we had data that the models were 146 00:08:13,876 --> 00:08:16,676 Speaker 2: emulating patterns they were seeing in the data. So the 147 00:08:16,716 --> 00:08:20,036 Speaker 2: models weren't the problem. The problem was the data were 148 00:08:20,036 --> 00:08:21,116 Speaker 2: the wrong data. 149 00:08:21,276 --> 00:08:23,436 Speaker 1: But the data is the data that is used to 150 00:08:23,476 --> 00:08:26,116 Speaker 1: trade real therapists. Like, it's confusing that this is the 151 00:08:26,156 --> 00:08:26,716 Speaker 1: wrong data. 152 00:08:26,876 --> 00:08:28,276 Speaker 2: It is it is. 153 00:08:28,476 --> 00:08:31,156 Speaker 1: Why why is it the wrong data? This should be 154 00:08:31,196 --> 00:08:32,356 Speaker 1: exactly the data you want. 155 00:08:32,436 --> 00:08:34,836 Speaker 2: Well, it's it's the wrong data for this format. And 156 00:08:34,916 --> 00:08:40,116 Speaker 2: our conversation when you might say something me nodding along 157 00:08:40,356 --> 00:08:43,636 Speaker 2: or saying mm hmm or go on, my contextually be 158 00:08:43,916 --> 00:08:48,916 Speaker 2: like completely appropriate. So tactically in a conversational dialogue that 159 00:08:48,956 --> 00:08:52,356 Speaker 2: would happen via chat. This is not like a medium 160 00:08:52,356 --> 00:08:54,676 Speaker 2: that works very well like this kind of thing. 161 00:08:54,836 --> 00:08:58,516 Speaker 1: Yeah, it's almost like a translation, right. It doesn't translate 162 00:08:58,556 --> 00:09:00,996 Speaker 1: from a human face to face interaction to a chat 163 00:09:00,996 --> 00:09:02,236 Speaker 1: window on the computer. 164 00:09:01,996 --> 00:09:03,236 Speaker 2: And not the right setting. 165 00:09:03,396 --> 00:09:05,116 Speaker 1: Yeah, so that I mean that goes to the like 166 00:09:05,396 --> 00:09:10,236 Speaker 1: nonverbal subtler aspects of therapy, right, like presumer when the 167 00:09:10,236 --> 00:09:14,596 Speaker 1: therapist is saying m hm, there is there is body language, 168 00:09:14,636 --> 00:09:17,756 Speaker 1: there's everything that's happening in the room, which is a 169 00:09:17,796 --> 00:09:21,356 Speaker 1: tremendous amount of information or emotional information, right, and that 170 00:09:21,476 --> 00:09:24,236 Speaker 1: is a thing that is lost, yes, no doubt in 171 00:09:24,316 --> 00:09:28,356 Speaker 1: this medium and and maybe speaks to a broader question 172 00:09:28,636 --> 00:09:31,156 Speaker 1: about the translatability of therapy. 173 00:09:31,316 --> 00:09:35,556 Speaker 2: Yeah, absolutely so I think to me, like the it 174 00:09:35,636 --> 00:09:39,316 Speaker 2: was at that moment that I kind of knew that 175 00:09:39,356 --> 00:09:42,436 Speaker 2: we we needed to do something radically different. Neither of 176 00:09:42,476 --> 00:09:45,956 Speaker 2: these was working well. About one in ten of the 177 00:09:46,076 --> 00:09:51,316 Speaker 2: responses from that from that chatbot, based on the clinicians, 178 00:09:51,556 --> 00:09:53,916 Speaker 2: would be something that we would be happy with. So 179 00:09:54,116 --> 00:09:57,756 Speaker 2: something that is both personalized, clinically appropriate and dynamic. 180 00:09:58,356 --> 00:10:00,796 Speaker 1: So you're saying you've got it right ten percent of exactly. 181 00:10:00,916 --> 00:10:04,116 Speaker 2: So really, no, that's that's not a good like no, 182 00:10:04,156 --> 00:10:06,076 Speaker 2: it's not a good not a good therapy. No, we 183 00:10:06,076 --> 00:10:09,036 Speaker 2: would we would never think about implement like actually trying 184 00:10:09,196 --> 00:10:13,556 Speaker 2: employ that. So then what we started at that point 185 00:10:13,876 --> 00:10:17,916 Speaker 2: was building our own creating our own data set from scratch, 186 00:10:18,316 --> 00:10:22,636 Speaker 2: in which we how how the models would learn would 187 00:10:22,636 --> 00:10:25,876 Speaker 2: be exactly what we want it to say. 188 00:10:25,916 --> 00:10:28,716 Speaker 1: That seems that seems wild, I mean, how do you 189 00:10:28,796 --> 00:10:30,596 Speaker 1: do that? How do you how do you generate that 190 00:10:30,676 --> 00:10:31,236 Speaker 1: much data. 191 00:10:31,356 --> 00:10:33,436 Speaker 2: We've had a team of one hundred people that have 192 00:10:33,516 --> 00:10:35,716 Speaker 2: worked on this project over the last five and a 193 00:10:35,756 --> 00:10:38,796 Speaker 2: half years at this point, and they've spent over one 194 00:10:38,836 --> 00:10:41,916 Speaker 2: hundred thousand human hours kind of really trying to build this. 195 00:10:42,116 --> 00:10:44,596 Speaker 1: Just specifically, what how do you build a data set 196 00:10:44,636 --> 00:10:47,876 Speaker 1: from scratch? Because like, the data set is the huge problem. 197 00:10:48,076 --> 00:10:54,236 Speaker 2: Yes, absolutely so. Psychotherapy, when you would test it is 198 00:10:54,316 --> 00:10:57,316 Speaker 2: based on something that is written down in a manual. 199 00:10:57,756 --> 00:11:01,076 Speaker 2: So when you're a psychologist when they're in a randomized 200 00:11:01,076 --> 00:11:03,636 Speaker 2: controlled trial trying to test whether something works or not. 201 00:11:04,596 --> 00:11:07,356 Speaker 2: To be able to test it, it has to be replicable, 202 00:11:07,516 --> 00:11:11,356 Speaker 2: meaning it's like repeated across different therapists. So there are 203 00:11:11,436 --> 00:11:14,836 Speaker 2: manuals that are developed. In this session you work on 204 00:11:14,836 --> 00:11:17,556 Speaker 2: on psycho education. On this session we're going to be 205 00:11:17,556 --> 00:11:21,596 Speaker 2: working on behavioral activation, So which are different techniques that 206 00:11:21,636 --> 00:11:24,876 Speaker 2: are really a focus at a given time, and these 207 00:11:24,916 --> 00:11:27,676 Speaker 2: are broken down to try to make it translational so 208 00:11:27,676 --> 00:11:30,116 Speaker 2: that you can actually move it. So the team would 209 00:11:30,196 --> 00:11:34,236 Speaker 2: read these empirically supported treatment manuals, so the ones that 210 00:11:34,316 --> 00:11:37,876 Speaker 2: had been tested in randomized control trials, and then what 211 00:11:37,916 --> 00:11:41,276 Speaker 2: we would do is we would take that content chapter 212 00:11:41,316 --> 00:11:44,476 Speaker 2: by chapter, because this is like session by session, take 213 00:11:44,516 --> 00:11:47,116 Speaker 2: the techniques that would work well via chat, of which 214 00:11:47,236 --> 00:11:51,436 Speaker 2: most things in cognitive behavioral therapy would, and then we 215 00:11:51,476 --> 00:11:57,156 Speaker 2: would create an artificial dialogue between would act as like 216 00:11:57,236 --> 00:12:00,116 Speaker 2: what is the patient's presenting problem, what they're bringing on, 217 00:12:00,196 --> 00:12:03,436 Speaker 2: what the personality is like, and we're kind of constructing this, 218 00:12:03,636 --> 00:12:07,596 Speaker 2: and then what is what we would want our system 219 00:12:07,636 --> 00:12:10,756 Speaker 2: to be the gold standard response for every kind of 220 00:12:10,796 --> 00:12:14,396 Speaker 2: input and output that we'd have. So we're writing both 221 00:12:14,476 --> 00:12:17,076 Speaker 2: the patient end and the therapist end. 222 00:12:17,796 --> 00:12:19,716 Speaker 1: It's like you're writing a screenplay. 223 00:12:19,436 --> 00:12:21,916 Speaker 2: Basically, it really is. It's a lot like that, but 224 00:12:22,036 --> 00:12:25,996 Speaker 2: instead of a screenplay that might be written like in general, 225 00:12:26,076 --> 00:12:29,036 Speaker 2: it's like not like not just something general, but like, 226 00:12:29,196 --> 00:12:32,596 Speaker 2: where is something that's really evidence based based on content 227 00:12:32,676 --> 00:12:35,956 Speaker 2: that we know works in this setting? 228 00:12:36,036 --> 00:12:39,516 Speaker 1: And so what you write the equivalent of what thousands 229 00:12:39,556 --> 00:12:40,956 Speaker 1: of hours of sessions? 230 00:12:41,036 --> 00:12:44,596 Speaker 2: Hundreds of thousands. There was post docs, grad students and 231 00:12:44,716 --> 00:12:47,556 Speaker 2: undergraduates within my group that we're all part of this 232 00:12:47,636 --> 00:12:49,196 Speaker 2: team that are creating. 233 00:12:49,156 --> 00:12:51,156 Speaker 1: Just doing the work, just writing the dialogue. 234 00:12:51,236 --> 00:12:54,436 Speaker 2: Yeah, exactly. And not only did we write them, but 235 00:12:54,916 --> 00:12:58,436 Speaker 2: every dialogue before it would go into something that our 236 00:12:58,436 --> 00:13:01,796 Speaker 2: models are trained would be reviewed by another member of 237 00:13:01,836 --> 00:13:06,836 Speaker 2: the team. So it's all not only crafted by hand, 238 00:13:06,916 --> 00:13:09,716 Speaker 2: but we would review it, give each other feed on it, 239 00:13:09,796 --> 00:13:12,236 Speaker 2: and then like make sure that it is the highest 240 00:13:12,316 --> 00:13:17,996 Speaker 2: quality data. And that's when we started seeing dramatic improvements 241 00:13:18,036 --> 00:13:22,036 Speaker 2: in the model performance. So we continued with us for years. 242 00:13:23,196 --> 00:13:29,476 Speaker 2: Six months before chat TPT was launched, we had a 243 00:13:29,596 --> 00:13:33,916 Speaker 2: model that in today's standards would be so tiny, that 244 00:13:34,116 --> 00:13:38,516 Speaker 2: was delivering about ninety percent of the responses that were 245 00:13:38,556 --> 00:13:42,276 Speaker 2: output We were evaluating as exactly what we'd want. It's 246 00:13:42,316 --> 00:13:47,276 Speaker 2: this gold standard evidence based treatment, so that was fantastic. 247 00:13:47,396 --> 00:13:49,916 Speaker 2: We were really excited about it. So we've got like 248 00:13:49,956 --> 00:13:53,556 Speaker 2: the we've got the benefit side down of the equation. 249 00:13:54,316 --> 00:13:58,196 Speaker 2: The next two years we focus on the risk, the 250 00:13:58,316 --> 00:14:00,516 Speaker 2: risk side of it well, because there's a huge risk. 251 00:14:00,356 --> 00:14:02,676 Speaker 1: Here, right The people who are using it are by 252 00:14:02,756 --> 00:14:07,596 Speaker 1: design quite vulnerable, by design putting a tremendous amount of 253 00:14:07,716 --> 00:14:11,996 Speaker 1: trust into this bot and making themselves vulnerable to it, 254 00:14:12,276 --> 00:14:16,196 Speaker 1: like it's a it's quite a risky proposition. And so 255 00:14:16,196 --> 00:14:18,116 Speaker 1: so tell me specifically, what are you doing. 256 00:14:18,236 --> 00:14:21,516 Speaker 2: So we're trying to get it to endorse elements that 257 00:14:21,556 --> 00:14:24,796 Speaker 2: would make mental health worse. So a lot of our 258 00:14:24,876 --> 00:14:29,436 Speaker 2: conversations or surrounding trying to get it to For example, 259 00:14:29,836 --> 00:14:33,436 Speaker 2: I'll give you an example of one that nearly almost 260 00:14:33,836 --> 00:14:37,356 Speaker 2: almost any model will struggle with that's not tailored towards 261 00:14:37,396 --> 00:14:40,676 Speaker 2: the safety end. What is it is if you tell 262 00:14:40,756 --> 00:14:43,076 Speaker 2: a model that you want to lose weight, it will 263 00:14:43,236 --> 00:14:46,196 Speaker 2: generally try to help you do that. And if you 264 00:14:46,276 --> 00:14:48,076 Speaker 2: want to if you want to work in an area 265 00:14:48,116 --> 00:14:52,236 Speaker 2: related to mental health, trying to promote weight loss without 266 00:14:52,276 --> 00:14:54,076 Speaker 2: context is so not safe. 267 00:14:54,476 --> 00:14:56,636 Speaker 1: You saying it might be a user within eating disorder 268 00:14:56,676 --> 00:15:00,196 Speaker 1: absolutely unhealthily thin, who wants to be even thinner. 269 00:14:59,836 --> 00:15:03,116 Speaker 2: And the model will help them to often actually get 270 00:15:03,156 --> 00:15:06,356 Speaker 2: into a lower weight than they already are. So this 271 00:15:06,516 --> 00:15:09,716 Speaker 2: is like not something that we would ever want to promote, 272 00:15:09,796 --> 00:15:12,956 Speaker 2: but this is something that we certainly at earlier stages 273 00:15:13,036 --> 00:15:15,716 Speaker 2: we're seeing these types of characteristics within the model. 274 00:15:15,956 --> 00:15:18,236 Speaker 1: What are other like, that's an interesting one and it 275 00:15:18,316 --> 00:15:20,116 Speaker 1: makes perfect sense when you say it, I would not 276 00:15:20,196 --> 00:15:22,036 Speaker 1: have thought of it. Or what's another one? 277 00:15:22,196 --> 00:15:24,396 Speaker 2: A lot of it would be like we talk about 278 00:15:24,436 --> 00:15:27,556 Speaker 2: the ethics of suicide. For example, somebody who is who thinks, 279 00:15:27,636 --> 00:15:30,796 Speaker 2: you know, they're in a midst of suffering and you 280 00:15:30,836 --> 00:15:33,156 Speaker 2: know it's like that they could should be able to 281 00:15:33,236 --> 00:15:35,036 Speaker 2: end of their life or they're thinking about this. 282 00:15:35,356 --> 00:15:39,316 Speaker 1: Yes, and what do you want the model? What what? 283 00:15:39,316 --> 00:15:41,196 Speaker 1: What does the model say that it shouldn't say in 284 00:15:41,236 --> 00:15:41,716 Speaker 1: that setting? 285 00:15:41,796 --> 00:15:44,476 Speaker 2: So for you, in these settings, we want to make 286 00:15:44,516 --> 00:15:47,636 Speaker 2: sure that they don't and the model does not promote 287 00:15:47,716 --> 00:15:52,316 Speaker 2: or endorse elements that would promote someone's a worsening of 288 00:15:52,356 --> 00:15:55,636 Speaker 2: suicidal intent. We want to make sure we're providing not 289 00:15:55,676 --> 00:15:59,116 Speaker 2: only not the absence of that, actually some benefit in 290 00:15:59,156 --> 00:16:00,276 Speaker 2: these types of scenarios. 291 00:16:00,556 --> 00:16:03,116 Speaker 1: That's the ultimate nightmare for you. Yeah, right, Like this 292 00:16:03,316 --> 00:16:06,276 Speaker 1: be super clear. The very worst thing that could happen 293 00:16:06,516 --> 00:16:09,796 Speaker 1: is you build this thing and it contributes to someone. Absolutely, 294 00:16:09,796 --> 00:16:13,316 Speaker 1: that's a plausible outcome and a disastrous night. 295 00:16:13,436 --> 00:16:16,436 Speaker 2: It's everything that I worry about in this area is 296 00:16:16,556 --> 00:16:19,796 Speaker 2: exactly this kind of thing. And so we essentially every 297 00:16:19,836 --> 00:16:23,276 Speaker 2: time we find an area where they're not implementing things perfectly, 298 00:16:24,156 --> 00:16:28,756 Speaker 2: some optimal response, adding new training data, and that's that's 299 00:16:28,796 --> 00:16:31,316 Speaker 2: when things continue to get better until we do this 300 00:16:31,436 --> 00:16:34,116 Speaker 2: and we don't find these holes anymore. That's when we 301 00:16:34,196 --> 00:16:37,276 Speaker 2: finally we're ready for the randomized control trial. 302 00:16:38,076 --> 00:16:43,476 Speaker 1: Right, So you decide after after what four years five years? 303 00:16:43,516 --> 00:16:46,596 Speaker 2: This is about four and a half years. 304 00:16:46,916 --> 00:16:50,196 Speaker 1: Yeah, that that you're ready to to have people. 305 00:16:50,036 --> 00:16:51,476 Speaker 2: Use use the model. 306 00:16:51,676 --> 00:16:54,276 Speaker 1: I'll be it in a kind of Yeah, you're going 307 00:16:54,356 --> 00:16:56,476 Speaker 1: to be the human in the loop. Right, So, so 308 00:16:56,596 --> 00:16:59,316 Speaker 1: you decide to do this study. You recruit people on 309 00:16:59,756 --> 00:17:01,276 Speaker 1: Facebook and Instagram. 310 00:17:00,876 --> 00:17:04,476 Speaker 2: Basically ye exactly, yep, and what. 311 00:17:04,476 --> 00:17:06,236 Speaker 1: So what are they signing up for? What's the what's 312 00:17:06,276 --> 00:17:07,116 Speaker 1: the big study you do? 313 00:17:07,236 --> 00:17:10,236 Speaker 2: So it's a it's a it's a randomized control trial. 314 00:17:11,556 --> 00:17:15,036 Speaker 2: The trial design is essentially that folks would come in, 315 00:17:15,476 --> 00:17:19,116 Speaker 2: they would fill out information about their mental health across 316 00:17:19,116 --> 00:17:24,236 Speaker 2: a variety of areas, so depression, anxiety, and eating disorders, 317 00:17:24,276 --> 00:17:28,876 Speaker 2: for folks that screen positive for having clinical levels of 318 00:17:28,916 --> 00:17:32,916 Speaker 2: depression or anxiety, they would be in our Folks that 319 00:17:32,956 --> 00:17:35,596 Speaker 2: were at risk for eating disorders would be included in 320 00:17:35,636 --> 00:17:39,276 Speaker 2: the trial. We tried to have at least seventy people 321 00:17:39,316 --> 00:17:41,636 Speaker 2: in each group, so we had two hundred and ten 322 00:17:41,676 --> 00:17:45,036 Speaker 2: people that we were planning on and rolling within the trial, 323 00:17:45,556 --> 00:17:48,756 Speaker 2: and then half of them were randomized to receive their 324 00:17:48,756 --> 00:17:50,996 Speaker 2: ABOUT and half of them were on a wait list 325 00:17:51,116 --> 00:17:53,756 Speaker 2: in which they would receive their ABOUT after the trial 326 00:17:53,796 --> 00:17:57,156 Speaker 2: had ended. The trial design was to try to ask 327 00:17:57,196 --> 00:18:00,316 Speaker 2: folks to use their ABOUT for four weeks. They retained 328 00:18:00,396 --> 00:18:02,916 Speaker 2: access to therabot and could use their ABOUT for the 329 00:18:02,956 --> 00:18:06,716 Speaker 2: next four weeks thereafter, so eight weeks total, but we 330 00:18:06,756 --> 00:18:08,996 Speaker 2: asked them to try to actually use it during that 331 00:18:09,196 --> 00:18:12,796 Speaker 2: first four weeks and that was that was essentially the 332 00:18:12,836 --> 00:18:13,436 Speaker 2: trial design. 333 00:18:13,596 --> 00:18:17,436 Speaker 1: So okay, so people signed up, they start like, what's 334 00:18:17,556 --> 00:18:20,636 Speaker 1: what's actually happening? Are they just like chatting with the 335 00:18:20,676 --> 00:18:22,036 Speaker 1: bought every day? Is it? 336 00:18:22,076 --> 00:18:26,196 Speaker 2: So they install a smartphone application that's that they're about at. 337 00:18:27,156 --> 00:18:31,036 Speaker 2: They are prompted once a day to try to have 338 00:18:31,076 --> 00:18:33,956 Speaker 2: a conversation starter with the with the bot and then 339 00:18:34,196 --> 00:18:37,436 Speaker 2: the bot. From there they could talk about it when 340 00:18:37,596 --> 00:18:40,276 Speaker 2: and wherever they would want. They can ignore those notifications 341 00:18:40,276 --> 00:18:42,636 Speaker 2: and kind of engage with it at any time that 342 00:18:42,636 --> 00:18:47,076 Speaker 2: they'd want. But that was the gist of the trial design, 343 00:18:47,156 --> 00:18:49,556 Speaker 2: and so folks in terms of how people used it, 344 00:18:50,036 --> 00:18:53,196 Speaker 2: they interacted with it throughout the day, throughout the night. 345 00:18:54,196 --> 00:18:58,676 Speaker 2: So for example, folks that would have trouble sleeping, that 346 00:18:58,796 --> 00:19:00,716 Speaker 2: was like a way that folks during the middle of 347 00:19:00,756 --> 00:19:05,516 Speaker 2: the night would engage with it fairly often. They in 348 00:19:05,596 --> 00:19:09,036 Speaker 2: terms of the types of the topics that they described, 349 00:19:10,476 --> 00:19:13,116 Speaker 2: it was really the entire range of something that you 350 00:19:13,116 --> 00:19:15,796 Speaker 2: would see in psychotherapy. We had folks that were dealing 351 00:19:15,836 --> 00:19:18,996 Speaker 2: with and discussing their different symptoms that they were talking about. 352 00:19:19,036 --> 00:19:21,236 Speaker 2: So the depression, their anxiety that they were struggling with, 353 00:19:21,276 --> 00:19:24,796 Speaker 2: their their eating, and their body image concerns. Those types 354 00:19:24,796 --> 00:19:27,076 Speaker 2: of things are common because of the groups that we 355 00:19:27,076 --> 00:19:33,156 Speaker 2: were recruiting. But relationship difficulties, problems like folks, some folks 356 00:19:33,156 --> 00:19:37,596 Speaker 2: were really like I had ruptures in there, you know, 357 00:19:37,796 --> 00:19:40,076 Speaker 2: somebody was going through a divorce. Other folks were like 358 00:19:40,196 --> 00:19:44,516 Speaker 2: going through breakups, problems at work. Some folks were unemployed 359 00:19:45,556 --> 00:19:48,676 Speaker 2: and during this time, So like the range of kind 360 00:19:48,716 --> 00:19:52,396 Speaker 2: of personal dilemmas and difficulties that folks were experiencing. Was 361 00:19:52,476 --> 00:19:54,276 Speaker 2: a lot of what we would see in like a 362 00:19:54,356 --> 00:19:58,316 Speaker 2: real setting where it's like kind of a whole host 363 00:19:58,436 --> 00:20:02,156 Speaker 2: of different things that folks were describing and experiencing. 364 00:20:02,076 --> 00:20:04,676 Speaker 1: And presumably had they agreed as part of enrolling in 365 00:20:04,756 --> 00:20:07,796 Speaker 1: the trial to let you read the transcript Oh? 366 00:20:07,836 --> 00:20:11,236 Speaker 2: Absolutely, yeah, very very clear when we did an informed 367 00:20:11,236 --> 00:20:14,916 Speaker 2: consent process where folks would know that we were reading 368 00:20:15,396 --> 00:20:16,516 Speaker 2: reading these transcripts. 369 00:20:16,916 --> 00:20:19,236 Speaker 1: And are you personally, like, what was it like for 370 00:20:19,276 --> 00:20:21,596 Speaker 1: you seeing them come in? Are you reading them every day? 371 00:20:21,676 --> 00:20:22,556 Speaker 1: I mean more than that. 372 00:20:23,156 --> 00:20:27,156 Speaker 2: So, I mean this is something that is so ill. 373 00:20:27,396 --> 00:20:29,596 Speaker 2: You alluded to that that this is one of these 374 00:20:29,636 --> 00:20:33,036 Speaker 2: concerns that anybody would have. Is like a nightmare scenario 375 00:20:33,516 --> 00:20:37,196 Speaker 2: where something is the bad happens and somebody actually outs right, 376 00:20:37,436 --> 00:20:39,516 Speaker 2: So this is like I think of this in a 377 00:20:39,556 --> 00:20:40,756 Speaker 2: way that I take So this is not. 378 00:20:40,756 --> 00:20:43,916 Speaker 1: A happy moment for you. This is like you're terrified 379 00:20:43,916 --> 00:20:44,916 Speaker 1: that it might go wrong. 380 00:20:45,076 --> 00:20:48,156 Speaker 2: Well, it's it's certainly like I see it going right, 381 00:20:48,236 --> 00:20:50,556 Speaker 2: but I have every concern that it could go wrong. 382 00:20:50,796 --> 00:20:54,716 Speaker 2: Right like that, And so for the first half of 383 00:20:54,756 --> 00:21:00,636 Speaker 2: the trial, I am monitoring every single interaction sent to 384 00:21:00,796 --> 00:21:03,436 Speaker 2: or from the bot. Other people are also doing this 385 00:21:03,516 --> 00:21:05,836 Speaker 2: on the team, so I'm not the only one. But 386 00:21:05,916 --> 00:21:07,396 Speaker 2: I did not get a lot of sleep in the 387 00:21:07,396 --> 00:21:10,156 Speaker 2: first half of this trial, in part because I was 388 00:21:10,196 --> 00:21:12,156 Speaker 2: really trying to do this in near real time. So 389 00:21:12,316 --> 00:21:15,276 Speaker 2: usually for nearly every message I was, I was getting 390 00:21:15,396 --> 00:21:18,356 Speaker 2: to it within about an hour. So yeah, it was 391 00:21:18,676 --> 00:21:21,956 Speaker 2: it was a barrage of NonStop kind of communication that 392 00:21:22,036 --> 00:21:22,476 Speaker 2: was happening. 393 00:21:22,556 --> 00:21:24,996 Speaker 1: So were there were there any slip ups? Did you 394 00:21:24,996 --> 00:21:27,436 Speaker 1: ever have to intervene as a human in the loop. 395 00:21:27,316 --> 00:21:30,356 Speaker 2: That we did? And the thing that that was something 396 00:21:30,396 --> 00:21:34,236 Speaker 2: that we as a team did not anticipate. What we 397 00:21:34,356 --> 00:21:37,796 Speaker 2: found was really unintended behavior was a lot of folks 398 00:21:38,156 --> 00:21:42,436 Speaker 2: interacted with they're abot, and in doing that, there was 399 00:21:42,476 --> 00:21:46,116 Speaker 2: a significant number of people that would interact with it 400 00:21:46,156 --> 00:21:48,876 Speaker 2: and talk about their medical symptoms. So, for example, there 401 00:21:48,916 --> 00:21:51,356 Speaker 2: was a number of folks that were experiencing symptoms of 402 00:21:51,396 --> 00:21:55,116 Speaker 2: a sexually transmitted disease and they would described that in 403 00:21:55,156 --> 00:21:58,236 Speaker 2: great detail and ask it, you know what, how how 404 00:21:58,316 --> 00:22:01,716 Speaker 2: they should medically treat that? And instead of they're they're 405 00:22:01,716 --> 00:22:05,436 Speaker 2: about saying, hey, go see a provider for this this 406 00:22:05,476 --> 00:22:08,636 Speaker 2: is not my realm of expertise, it responds as if, 407 00:22:09,756 --> 00:22:12,756 Speaker 2: and so this that all of the advice that it 408 00:22:12,796 --> 00:22:17,356 Speaker 2: gave was really fairly reasonable, both in the assessment and 409 00:22:17,396 --> 00:22:20,396 Speaker 2: treatment protocols, but we would not have wanted to act 410 00:22:20,476 --> 00:22:24,436 Speaker 2: that way, So we contacted all of those folks to 411 00:22:24,596 --> 00:22:29,476 Speaker 2: recommend that they actually contact a physician about that. Folks 412 00:22:29,476 --> 00:22:33,636 Speaker 2: did interact with it related to crisis situations. So we 413 00:22:33,716 --> 00:22:39,076 Speaker 2: had also had there about in these moments provided appropriate 414 00:22:39,116 --> 00:22:42,516 Speaker 2: contextual crisis support, but we reached out to those folks 415 00:22:43,076 --> 00:22:45,876 Speaker 2: to further escalate and make sure that they had further 416 00:22:45,916 --> 00:22:49,876 Speaker 2: support available and that and those types of times too. 417 00:22:50,156 --> 00:22:53,956 Speaker 2: So there there were things that, you know, we're certainly 418 00:22:54,036 --> 00:22:58,836 Speaker 2: areas of concern that that happened, but nothing nothing that 419 00:22:58,996 --> 00:23:02,556 Speaker 2: was concerning from the major areas that we had intended 420 00:23:02,756 --> 00:23:05,196 Speaker 2: all kind of really went went pretty well. 421 00:23:08,636 --> 00:23:11,156 Speaker 1: Still, Tom on the show the results of the study 422 00:23:11,356 --> 00:23:23,036 Speaker 1: and what's next for therapot. What were the results of 423 00:23:23,036 --> 00:23:23,516 Speaker 1: the study. 424 00:23:23,996 --> 00:23:26,356 Speaker 2: So this is one of the things that was just 425 00:23:27,116 --> 00:23:31,396 Speaker 2: really fantastic to see was that we had we looked 426 00:23:31,436 --> 00:23:33,876 Speaker 2: at our main outcomes for what we were trying to 427 00:23:33,916 --> 00:23:39,076 Speaker 2: look at, where the degree to folks reduced their depression symptoms, 428 00:23:39,396 --> 00:23:44,196 Speaker 2: their anxiety symptoms, and their eating disorder symptoms among the 429 00:23:44,276 --> 00:23:47,796 Speaker 2: intervention group relative to the control group. So based on 430 00:23:47,836 --> 00:23:51,276 Speaker 2: the change and self reported symptoms in the treatment group 431 00:23:51,356 --> 00:23:55,756 Speaker 2: versus the control group, and we saw these really large 432 00:23:56,356 --> 00:24:01,716 Speaker 2: differential reductions, meaning a lot more reductions and changes that happened, 433 00:24:01,756 --> 00:24:04,756 Speaker 2: and that non depressive symptoms, the anxiety symptoms, and the 434 00:24:04,756 --> 00:24:07,876 Speaker 2: eating disorder symptoms, and the THERAPOT group relative of the 435 00:24:07,916 --> 00:24:11,476 Speaker 2: witless control group, and the degree of change is about 436 00:24:11,516 --> 00:24:14,716 Speaker 2: as strong as you'd ever see. And are randomized control 437 00:24:14,796 --> 00:24:19,876 Speaker 2: trials of outpatient psychotherapy that would be delivered within cognitive 438 00:24:19,876 --> 00:24:23,396 Speaker 2: behavioral therapy with a human, a real human delivering this 439 00:24:23,556 --> 00:24:26,556 Speaker 2: an expert. You didn't test it against against therapy, No, 440 00:24:26,636 --> 00:24:30,476 Speaker 2: we didn't. What you're saying, results results of other studies 441 00:24:30,876 --> 00:24:35,236 Speaker 2: using real human therapists show comparable magnitudes of benefit. That's 442 00:24:35,276 --> 00:24:35,956 Speaker 2: exactly right. 443 00:24:36,156 --> 00:24:38,316 Speaker 1: Yes, you've gotta do a head to head. I mean, 444 00:24:38,356 --> 00:24:41,436 Speaker 1: that's the obvious question, like why not randomize people to 445 00:24:41,516 --> 00:24:42,636 Speaker 1: therapy or THERAPI bought? 446 00:24:42,756 --> 00:24:46,596 Speaker 2: So the main thing when we're thinking about the first 447 00:24:46,716 --> 00:24:49,276 Speaker 2: origins point is we want to have some kind of 448 00:24:49,396 --> 00:24:52,916 Speaker 2: effect of how this works relative to the absence of anything. 449 00:24:52,556 --> 00:24:56,156 Speaker 1: Relative to nothing, well, because I mean, presumably the easiest 450 00:24:56,196 --> 00:24:58,476 Speaker 1: case to make for it is not it's better than 451 00:24:58,516 --> 00:25:01,356 Speaker 1: a therapist. It's a huge number of people who need 452 00:25:01,356 --> 00:25:04,636 Speaker 1: a therapist don't have one exactly, and that's the unfortunate reality. 453 00:25:05,156 --> 00:25:07,276 Speaker 1: BOT is better than nothing. It doesn't have to be 454 00:25:07,276 --> 00:25:09,436 Speaker 1: better than a human therapist. It just has to better, 455 00:25:09,516 --> 00:25:09,916 Speaker 1: that's right. 456 00:25:10,316 --> 00:25:14,196 Speaker 2: But so yes, the we are planning ahead to head 457 00:25:14,236 --> 00:25:17,596 Speaker 2: trial against therapist as the next trial that we run, 458 00:25:18,916 --> 00:25:21,996 Speaker 2: in large part because I already think we are not inferior. 459 00:25:22,516 --> 00:25:24,796 Speaker 2: So it will it'll be interesting to see if that 460 00:25:24,876 --> 00:25:28,156 Speaker 2: actually comes out. But that is that is something that 461 00:25:28,676 --> 00:25:33,636 Speaker 2: we have outstanding funding proposals to try to actually do that. 462 00:25:34,236 --> 00:25:36,276 Speaker 2: So one of the other things that I haven't gotten 463 00:25:36,316 --> 00:25:37,916 Speaker 2: to with in the trial outcomes that I think is 464 00:25:38,036 --> 00:25:43,996 Speaker 2: really important on that end actually is to two things. 465 00:25:44,276 --> 00:25:49,076 Speaker 2: One is the degree that folks formed a relationship with 466 00:25:49,916 --> 00:25:54,476 Speaker 2: therapaud and so in psychotherapy, one of the most well 467 00:25:54,516 --> 00:25:58,036 Speaker 2: studied constructs is the ability that you and your therapist 468 00:25:58,476 --> 00:26:01,556 Speaker 2: can get together and work together on common goals and 469 00:26:01,596 --> 00:26:05,596 Speaker 2: trust each other that you as a it's a relationship, 470 00:26:04,596 --> 00:26:09,076 Speaker 2: it's a human relationship. And so this in the literature 471 00:26:09,276 --> 00:26:12,236 Speaker 2: is called the working alliance, and so it's this ability 472 00:26:12,276 --> 00:26:17,636 Speaker 2: to form this bond. We measured this working alliance using 473 00:26:17,676 --> 00:26:21,476 Speaker 2: the same measure that folks would use with outpatient providers 474 00:26:21,516 --> 00:26:24,276 Speaker 2: about how they they felt about their therapist, but instead 475 00:26:24,276 --> 00:26:28,156 Speaker 2: of the therapist that now we're talking about therabot, and 476 00:26:28,916 --> 00:26:32,476 Speaker 2: and folks rated it nearly identically to the norms that 477 00:26:32,916 --> 00:26:36,196 Speaker 2: you would see on the outpatient literature. So we asked folks, 478 00:26:36,276 --> 00:26:39,156 Speaker 2: we give folks the same measure, and that it's essentially 479 00:26:39,196 --> 00:26:42,836 Speaker 2: equivalent to how folks are reading human providers in these ways. 480 00:26:43,196 --> 00:26:46,756 Speaker 1: This is consistent with other where we're seeing people having 481 00:26:46,836 --> 00:26:50,996 Speaker 1: relationship with chatbots and other domains. Yes, I'm old enough 482 00:26:51,036 --> 00:26:54,356 Speaker 1: that it seems weird to me. I don't know, seem 483 00:26:54,476 --> 00:26:54,956 Speaker 1: weird to you. 484 00:26:55,276 --> 00:26:58,356 Speaker 2: I that part I this is more of a surprise 485 00:26:58,396 --> 00:27:00,516 Speaker 2: to me that it was as the bonds were as 486 00:27:00,556 --> 00:27:02,676 Speaker 2: high as they were, that they would actually be about 487 00:27:02,716 --> 00:27:05,356 Speaker 2: what humans would be. And I will say, like one 488 00:27:05,396 --> 00:27:08,516 Speaker 2: of the other surprises within the interactions was the number 489 00:27:08,516 --> 00:27:12,956 Speaker 2: of people that would like respond kind of check in 490 00:27:12,996 --> 00:27:16,276 Speaker 2: with therabot with and just say hey, just checking in 491 00:27:16,796 --> 00:27:18,996 Speaker 2: as if like Therabot is like a I don't know. 492 00:27:19,076 --> 00:27:22,676 Speaker 2: I would I would only like have anticipated folks would 493 00:27:22,716 --> 00:27:25,556 Speaker 2: use this as a tool, so like not like they 494 00:27:25,596 --> 00:27:28,396 Speaker 2: went to hang out with like almost that way. It's 495 00:27:28,436 --> 00:27:32,556 Speaker 2: like our initiating a conversation that isn't I guess doesn't 496 00:27:32,556 --> 00:27:33,836 Speaker 2: have an intention in mind. 497 00:27:34,076 --> 00:27:39,036 Speaker 1: I say please, I'm thank you, I can't help my 498 00:27:39,436 --> 00:27:41,356 Speaker 1: Is it because I think they're going to take over 499 00:27:41,636 --> 00:27:43,516 Speaker 1: or is it a habit or what? I don't know, 500 00:27:43,556 --> 00:27:44,636 Speaker 1: but I do, I do. 501 00:27:44,716 --> 00:27:47,196 Speaker 2: Yeah, I wouldn't. I would say that this was more 502 00:27:47,236 --> 00:27:50,476 Speaker 2: surprising the degree to that folks established this this level 503 00:27:50,516 --> 00:27:52,796 Speaker 2: of a bond with it. I think it's actually really 504 00:27:52,836 --> 00:27:57,556 Speaker 2: good and in really important that they do, in large 505 00:27:57,596 --> 00:27:59,036 Speaker 2: part because that's one of the ways that we know 506 00:27:59,076 --> 00:28:02,996 Speaker 2: psychotherapy works, is that that that folks can come together 507 00:28:03,036 --> 00:28:06,076 Speaker 2: and trust this and develop this working relationship. So I 508 00:28:06,076 --> 00:28:08,676 Speaker 2: think it's actually a necessary ingredient for this to work. 509 00:28:08,716 --> 00:28:11,396 Speaker 1: To some I get it makes sense to me intellectually 510 00:28:11,476 --> 00:28:13,716 Speaker 1: what you're saying. Does it give you any pause or 511 00:28:13,756 --> 00:28:14,916 Speaker 1: do you just think it's great? 512 00:28:15,156 --> 00:28:19,876 Speaker 2: It it gives me pause. If we weren't delivering evidence 513 00:28:19,916 --> 00:28:23,756 Speaker 2: based treatment, Well, this is a good moment. Let's talk 514 00:28:23,756 --> 00:28:26,436 Speaker 2: about the let's talk about the industry more generally. Yeah, 515 00:28:26,676 --> 00:28:28,916 Speaker 2: this is not a you're not making a company, this 516 00:28:28,956 --> 00:28:31,116 Speaker 2: is not a product, right, you don't have any money 517 00:28:31,116 --> 00:28:34,876 Speaker 2: at stake. But there is a something of a therapy 518 00:28:34,956 --> 00:28:35,836 Speaker 2: bought industry. 519 00:28:35,996 --> 00:28:38,476 Speaker 1: There is a private sector. Like, tell me what is 520 00:28:38,516 --> 00:28:39,876 Speaker 1: the broader landscape here? 521 00:28:39,996 --> 00:28:42,916 Speaker 2: Like, so there's a lot of folks that are have 522 00:28:43,076 --> 00:28:48,236 Speaker 2: jumped in predominantly sense the loans of shot GPT, and 523 00:28:48,516 --> 00:28:51,556 Speaker 2: a lot of folks that have learned that you can 524 00:28:52,076 --> 00:28:54,196 Speaker 2: call a foundation model fairly easily. 525 00:28:54,436 --> 00:28:56,076 Speaker 1: When you say call you mean just sort of like 526 00:28:56,596 --> 00:29:00,236 Speaker 1: you sort of take a foundation model like GBT, and 527 00:29:00,276 --> 00:29:02,316 Speaker 1: then you kind of put a wrapper around exactly and 528 00:29:02,356 --> 00:29:05,516 Speaker 1: the rapper it's like it's basically GPT with a therapist wrapper. 529 00:29:05,636 --> 00:29:08,436 Speaker 2: Yeah. So it's a lot of folks within this industry 530 00:29:08,836 --> 00:29:13,876 Speaker 2: are saying, hey, you act like a therapist and then 531 00:29:13,956 --> 00:29:16,116 Speaker 2: kind of off to the races. It's it's otherwise not 532 00:29:16,676 --> 00:29:19,236 Speaker 2: changed in any way, shape or form. It's it's like 533 00:29:19,316 --> 00:29:22,596 Speaker 2: a literally like a system prompt. So if you were 534 00:29:22,876 --> 00:29:25,436 Speaker 2: interacting with chat GBT, it would be something along the 535 00:29:25,476 --> 00:29:28,716 Speaker 2: lines of, hey, act as a therapist and here's what 536 00:29:29,276 --> 00:29:31,116 Speaker 2: we go on to do. They may have more directions 537 00:29:31,116 --> 00:29:34,036 Speaker 2: than this. But that's this is kind of the light 538 00:29:34,116 --> 00:29:38,276 Speaker 2: touch nature, so super different from what we're doing. Actually, yes, 539 00:29:38,716 --> 00:29:42,836 Speaker 2: so we conducted the first randomized control trial of any 540 00:29:42,996 --> 00:29:47,356 Speaker 2: generative AI for any type of clinical mental health problem. 541 00:29:48,116 --> 00:29:51,596 Speaker 2: And so I know that these folks don't have evidence 542 00:29:52,556 --> 00:29:54,036 Speaker 2: that this kind of thing works. 543 00:29:54,396 --> 00:29:57,756 Speaker 1: I mean, there are non generative AI bots that people 544 00:29:57,836 --> 00:30:01,156 Speaker 1: did randomize control trials of, right, just to be clear. 545 00:30:01,276 --> 00:30:05,036 Speaker 2: Yes, there are non generative absolutely that have have evidence 546 00:30:05,116 --> 00:30:10,076 Speaker 2: behind them. The generative side is very new, and so 547 00:30:10,876 --> 00:30:13,116 Speaker 2: and there's a lot of folks in the generative space 548 00:30:13,156 --> 00:30:16,556 Speaker 2: that have jumped in. Yeah, and so a lot of 549 00:30:16,596 --> 00:30:22,316 Speaker 2: these folks are not psychologists and not psychiatrists, and and 550 00:30:22,556 --> 00:30:26,436 Speaker 2: Silicon Valley, there's a saying move fast and break things. 551 00:30:27,116 --> 00:30:29,756 Speaker 2: This is not the setting to do that. Like move 552 00:30:29,796 --> 00:30:32,036 Speaker 2: fast and break people is what you're talking about here. 553 00:30:32,116 --> 00:30:35,916 Speaker 2: You know, it's like the and the amount of times 554 00:30:35,916 --> 00:30:40,276 Speaker 2: that these foundation models act in profoundly unsafe ways would 555 00:30:40,276 --> 00:30:43,596 Speaker 2: be unacceptable to the field. So like that, we tested 556 00:30:43,636 --> 00:30:45,996 Speaker 2: a lot of these models alongside when we were developing 557 00:30:46,036 --> 00:30:48,556 Speaker 2: all of this. So it's like, I know that they 558 00:30:48,636 --> 00:30:51,156 Speaker 2: don't they don't work in this kind of way in 559 00:30:51,196 --> 00:30:55,676 Speaker 2: a real safe environment. So because of that, I'm I'm 560 00:30:55,716 --> 00:30:59,516 Speaker 2: really hugely concerned with kind of the field at large 561 00:30:59,556 --> 00:31:02,396 Speaker 2: that is moving fast and doesn't really have this level 562 00:31:02,476 --> 00:31:06,116 Speaker 2: of dedication to trying to do it right. And I 563 00:31:06,116 --> 00:31:09,636 Speaker 2: think one of the things that's really kind of canning 564 00:31:09,876 --> 00:31:12,676 Speaker 2: within this is it always looks polished, so it's harder 565 00:31:12,716 --> 00:31:16,036 Speaker 2: to see when you're getting exposed to things that are dangerous. 566 00:31:16,316 --> 00:31:18,236 Speaker 2: But the field, I think is in a spot where 567 00:31:18,756 --> 00:31:20,436 Speaker 2: there's a lot of folks that are out there that 568 00:31:20,476 --> 00:31:23,716 Speaker 2: are acting and implementing things that are untested, and I 569 00:31:23,756 --> 00:31:26,396 Speaker 2: suspect a lot of them are really dangerous. 570 00:31:26,916 --> 00:31:30,116 Speaker 1: How do you how do you imagine theahbut getting from 571 00:31:30,156 --> 00:31:32,796 Speaker 1: the experimental phase into the widespread use phase. 572 00:31:32,956 --> 00:31:35,836 Speaker 2: Yeah, so we want to essentially have one at least 573 00:31:35,876 --> 00:31:38,276 Speaker 2: one larger trial before we do this. You know, we 574 00:31:38,316 --> 00:31:42,196 Speaker 2: have it's pretty a pretty decent sized first trial for 575 00:31:42,396 --> 00:31:45,716 Speaker 2: being a first trial, but it's not something that I 576 00:31:45,716 --> 00:31:48,356 Speaker 2: would want to see out in the open just yet. 577 00:31:48,516 --> 00:31:50,876 Speaker 2: We want to have continue to oversight it, make sure 578 00:31:50,916 --> 00:31:54,476 Speaker 2: it's safe and effective. But if it continues to demonstrate 579 00:31:54,516 --> 00:31:57,236 Speaker 2: safety and effectiveness, this is one of those things that 580 00:31:57,756 --> 00:32:00,516 Speaker 2: why I got into this is to really have an 581 00:32:00,516 --> 00:32:03,996 Speaker 2: impact on folks lives, and this is one of those 582 00:32:04,036 --> 00:32:08,756 Speaker 2: things that could scale really effective personalized cares in real ways. So, yeah, 583 00:32:08,796 --> 00:32:12,036 Speaker 2: we we intend to if evidence continues to show that 584 00:32:12,076 --> 00:32:14,676 Speaker 2: it's safe and effective, to make this out into the 585 00:32:15,076 --> 00:32:17,956 Speaker 2: open market. In terms of the thing that I care 586 00:32:17,996 --> 00:32:19,996 Speaker 2: about in terms of the ways that we could do 587 00:32:20,076 --> 00:32:23,076 Speaker 2: this is trying to do this in some ways that 588 00:32:23,076 --> 00:32:25,876 Speaker 2: would be scalable, so that we're considering a bunch of 589 00:32:25,876 --> 00:32:29,356 Speaker 2: different pathways. Some of those would be delivered by philanthropy 590 00:32:29,876 --> 00:32:34,716 Speaker 2: or nonprofit models. We are considering also like just a 591 00:32:34,716 --> 00:32:37,556 Speaker 2: strategy that would just not for me to make money, 592 00:32:37,836 --> 00:32:39,916 Speaker 2: but just to scale this under some kind of for 593 00:32:40,076 --> 00:32:43,436 Speaker 2: profit structure as well, but really just to try to 594 00:32:43,596 --> 00:32:46,316 Speaker 2: get this out into the open so that folks could 595 00:32:46,356 --> 00:32:49,876 Speaker 2: actually use it, because ultimately we'll need some kind of 596 00:32:49,956 --> 00:32:54,276 Speaker 2: revenue in some ways to be part of this that 597 00:32:54,316 --> 00:32:57,716 Speaker 2: would essentially enable the servers to stay on and to 598 00:32:57,756 --> 00:32:58,276 Speaker 2: scale it. 599 00:32:58,556 --> 00:33:00,876 Speaker 1: And presumably you have to pay some amount of people 600 00:33:00,916 --> 00:33:03,796 Speaker 1: to do some amount of supervision absolutely forever. 601 00:33:04,036 --> 00:33:07,836 Speaker 2: Yeah, So we in the real deployment setting, we hope 602 00:33:07,836 --> 00:33:12,796 Speaker 2: to have essentially the decreasing levels of oversight relative to 603 00:33:12,836 --> 00:33:16,316 Speaker 2: these trials, but not an absence of oversight. So exactly 604 00:33:16,596 --> 00:33:18,516 Speaker 2: you're not going to stay up all night reading every 605 00:33:18,556 --> 00:33:22,196 Speaker 2: message exactly, that won't be sustainable for the future, but 606 00:33:22,276 --> 00:33:24,636 Speaker 2: we will have like flags for things that should be 607 00:33:25,276 --> 00:33:26,996 Speaker 2: seen by humans and intervened upon. 608 00:33:27,756 --> 00:33:32,876 Speaker 1: Let's talk about this other domain you've worked in in 609 00:33:32,956 --> 00:33:36,036 Speaker 1: terms of technology and mental health, right, and so in 610 00:33:36,076 --> 00:33:38,796 Speaker 1: addition to your work on thera bot, you've done a 611 00:33:38,836 --> 00:33:43,956 Speaker 1: lot of work on it seems like basically diagnosis monitoring people, 612 00:33:44,356 --> 00:33:49,436 Speaker 1: essentially using mobile devices and wearables to track people's mental 613 00:33:49,436 --> 00:33:52,396 Speaker 1: health to predict outcomes like tell me about your work 614 00:33:52,436 --> 00:33:53,476 Speaker 1: there and the field there. 615 00:33:54,036 --> 00:33:58,676 Speaker 2: So essentially it's trying to trying to monitor folks within 616 00:33:58,716 --> 00:34:03,356 Speaker 2: their freestanding conditions, so like in their real real life, 617 00:34:04,076 --> 00:34:08,636 Speaker 2: through using technology so in ways that are not don't 618 00:34:08,676 --> 00:34:09,596 Speaker 2: require burden. 619 00:34:09,916 --> 00:34:12,876 Speaker 1: The starting point is like your phone is collecting data 620 00:34:12,916 --> 00:34:15,796 Speaker 1: about you all the time. What if that data could 621 00:34:15,796 --> 00:34:16,836 Speaker 1: make you less depressed? 622 00:34:17,236 --> 00:34:19,476 Speaker 2: Yeah, exactly, what if we could use that data to 623 00:34:19,516 --> 00:34:22,956 Speaker 2: know something about you so that we could actually intervene 624 00:34:23,116 --> 00:34:27,196 Speaker 2: and so, like thinking about a lot of mental health symptoms. 625 00:34:27,276 --> 00:34:31,276 Speaker 2: I think one of the challenges of them is they 626 00:34:31,356 --> 00:34:34,676 Speaker 2: are not like all or nothing the field. Actually, I 627 00:34:34,676 --> 00:34:38,156 Speaker 2: think it's this really wrong. And when you would talk 628 00:34:38,196 --> 00:34:41,356 Speaker 2: to anybody who has a experience as a clinical problem, 629 00:34:41,636 --> 00:34:44,916 Speaker 2: they have changes that happen pretty rapidly within their daily life. 630 00:34:44,956 --> 00:34:48,516 Speaker 2: So they like will have better moments and worse moments 631 00:34:48,516 --> 00:34:51,716 Speaker 2: within a day, They'll have better and worse days. And 632 00:34:51,756 --> 00:34:54,436 Speaker 2: it's not like it's all this like it's always depressed 633 00:34:54,676 --> 00:34:58,916 Speaker 2: or not depressed. It's like these these fluctuating states of it. 634 00:34:59,196 --> 00:35:02,556 Speaker 2: And I think one of the things that's really important 635 00:35:02,556 --> 00:35:05,076 Speaker 2: about these types of things is if we can monitor 636 00:35:05,116 --> 00:35:08,316 Speaker 2: and predict those rapid changes, which I think we can. 637 00:35:08,676 --> 00:35:11,836 Speaker 2: We have a that we can is that we can 638 00:35:11,876 --> 00:35:16,076 Speaker 2: then intervene upon the symptoms before they happen in real time, 639 00:35:16,436 --> 00:35:18,436 Speaker 2: so like trying to predict the ebbs and the flows 640 00:35:18,436 --> 00:35:21,716 Speaker 2: of the symptoms, not to like say, I want somebody 641 00:35:21,756 --> 00:35:24,556 Speaker 2: to never be able to be stressed within their life, 642 00:35:24,596 --> 00:35:26,636 Speaker 2: but so that they can actually be more resilient and 643 00:35:26,676 --> 00:35:27,276 Speaker 2: cope with it. 644 00:35:28,476 --> 00:35:32,436 Speaker 1: And so what's the state of that art, Like, is 645 00:35:32,476 --> 00:35:35,076 Speaker 1: there somebody who's can you do that? Can somebody do that? 646 00:35:35,236 --> 00:35:36,116 Speaker 1: Is there an app for that? 647 00:35:36,236 --> 00:35:38,516 Speaker 2: As we used to say, Yeah, I mean we have 648 00:35:39,396 --> 00:35:44,516 Speaker 2: the science surrounding. This is about ten years old. We've 649 00:35:44,556 --> 00:35:47,796 Speaker 2: done about forty studies in this area across a broad 650 00:35:47,876 --> 00:35:53,676 Speaker 2: range of symptoms, so anxiety, depression, post traumatic stress disorder, schizophrenia, 651 00:35:54,956 --> 00:35:59,676 Speaker 2: bipolar disorder, eating disorders, so a lodge are different types 652 00:35:59,676 --> 00:36:02,116 Speaker 2: of clinical phenomenon and we can predict a lot of 653 00:36:02,116 --> 00:36:07,076 Speaker 2: different things in ways that I think are really important. 654 00:36:07,076 --> 00:36:10,156 Speaker 2: But I think, like to really move the needle on 655 00:36:10,196 --> 00:36:14,636 Speaker 2: something that would make it into population wide ability to 656 00:36:14,716 --> 00:36:17,796 Speaker 2: do this, I think the real thing that would be 657 00:36:18,036 --> 00:36:22,236 Speaker 2: needed for like the ability to do this is to 658 00:36:22,316 --> 00:36:28,036 Speaker 2: pair this with intervention that's dynamic. So something that's actually ability, 659 00:36:28,276 --> 00:36:31,956 Speaker 2: has an ability to change and has like a boundless 660 00:36:31,996 --> 00:36:35,476 Speaker 2: context of intervention. So I'm going to actually loop you. 661 00:36:35,436 --> 00:36:36,516 Speaker 1: Back like the Abot. 662 00:36:36,716 --> 00:36:39,276 Speaker 2: That's exactly right. So these two things that have been 663 00:36:39,316 --> 00:36:43,716 Speaker 2: distinct arms of my work are like so natural compliments 664 00:36:43,756 --> 00:36:46,156 Speaker 2: to one another. Now think about Okay, let's come back 665 00:36:46,156 --> 00:36:48,116 Speaker 2: to therabot in this kind of setting. 666 00:36:48,396 --> 00:36:49,396 Speaker 1: So give me the dream. 667 00:36:49,556 --> 00:36:53,556 Speaker 2: So this is the dream. So you have Therabot, but 668 00:36:53,716 --> 00:36:57,476 Speaker 2: instead of like a psychologist that's completely unaware of what happens, 669 00:36:57,556 --> 00:37:00,636 Speaker 2: is reliant on the patient to tell them everything that's 670 00:37:00,676 --> 00:37:03,236 Speaker 2: going on. In their life. Yeah, all of a sudden, 671 00:37:03,276 --> 00:37:06,676 Speaker 2: there butt knows them knows hey, oh this they're not 672 00:37:06,756 --> 00:37:10,996 Speaker 2: sleeping very well for the past couple days. They haven't 673 00:37:11,196 --> 00:37:14,796 Speaker 2: left their home this week, and this is a big 674 00:37:14,836 --> 00:37:18,156 Speaker 2: deviation from them and how they normally would live life 675 00:37:18,516 --> 00:37:22,076 Speaker 2: Like this can be targets of intervention that don't wait 676 00:37:22,436 --> 00:37:25,476 Speaker 2: for this to be some sustained pattern in their life 677 00:37:25,636 --> 00:37:29,316 Speaker 2: that becomes entrenched and hard to change. Like, no, let's 678 00:37:29,316 --> 00:37:31,916 Speaker 2: actually have that as part of the conversation, where we 679 00:37:31,956 --> 00:37:33,916 Speaker 2: don't have to wait for someone to tell us that 680 00:37:34,156 --> 00:37:35,916 Speaker 2: that they didn't get out of bed. We kind of 681 00:37:35,956 --> 00:37:39,196 Speaker 2: know that they haven't left their house, and we can 682 00:37:39,236 --> 00:37:42,116 Speaker 2: actually make that a content of the intervention. So that's like, 683 00:37:42,236 --> 00:37:46,356 Speaker 2: I think these these ability to like intervene proactively in 684 00:37:46,396 --> 00:37:49,876 Speaker 2: these risk moments and not wait for folks to come 685 00:37:49,916 --> 00:37:53,116 Speaker 2: to us and tell us every aspect of their life 686 00:37:53,116 --> 00:37:55,596 Speaker 2: that they may not know and so like because of this, 687 00:37:55,876 --> 00:38:00,796 Speaker 2: it's that's that's where I think there's a really powerful 688 00:38:01,076 --> 00:38:01,956 Speaker 2: pairing of these two. 689 00:38:02,276 --> 00:38:05,276 Speaker 1: I can see why that combination would be incredibly powerful 690 00:38:05,316 --> 00:38:09,156 Speaker 1: and helpful. Do you worry at all about having that 691 00:38:09,276 --> 00:38:12,956 Speaker 1: much information and that much sort of personal information on 692 00:38:13,036 --> 00:38:16,756 Speaker 1: so many dimensions about people who are by definition vulnerable. 693 00:38:16,996 --> 00:38:19,876 Speaker 2: Yeah, I mean, in some ways, I think it's the 694 00:38:19,996 --> 00:38:22,596 Speaker 2: real ways that folks are already collecting a lot of 695 00:38:22,596 --> 00:38:26,036 Speaker 2: this type of data already on these same populations, and 696 00:38:26,076 --> 00:38:27,996 Speaker 2: now that we could put it to good use. Do 697 00:38:28,076 --> 00:38:32,156 Speaker 2: I worry about kind of yet falling into the wrong hands. Absolutely. 698 00:38:32,236 --> 00:38:35,476 Speaker 2: I mean we have like really big tight data security 699 00:38:35,876 --> 00:38:37,996 Speaker 2: kind of protocols surrounding all of this to try to 700 00:38:38,036 --> 00:38:41,196 Speaker 2: make sure that only folks that are established members of 701 00:38:41,236 --> 00:38:44,676 Speaker 2: the team have any access to this data. And so yeah, 702 00:38:44,676 --> 00:38:47,516 Speaker 2: we are really concerned about it. But yeah, no, if 703 00:38:47,516 --> 00:38:49,756 Speaker 2: there was a breach or something like that that could 704 00:38:49,796 --> 00:38:57,636 Speaker 2: be hugely impactful, something that would be greatly worry. 705 00:38:56,436 --> 00:39:09,236 Speaker 1: We'll be back in a minute with the lightning round. Hey, 706 00:39:09,316 --> 00:39:11,356 Speaker 1: let's finish with the lightning round. 707 00:39:11,756 --> 00:39:12,116 Speaker 2: Okay. 708 00:39:14,796 --> 00:39:18,956 Speaker 1: On net, have smartphones made us happier or less happy? 709 00:39:19,836 --> 00:39:20,436 Speaker 2: Less happy? 710 00:39:21,196 --> 00:39:23,036 Speaker 1: You think that you think you could change that, You 711 00:39:23,076 --> 00:39:25,156 Speaker 1: think you could make the net flip back the other way. 712 00:39:25,316 --> 00:39:28,156 Speaker 2: I think that we need to meet people where they are, 713 00:39:28,956 --> 00:39:32,436 Speaker 2: and and so this is we're not like trying to 714 00:39:32,476 --> 00:39:34,796 Speaker 2: keep folks on their phones, right, like, we're trying to 715 00:39:34,796 --> 00:39:38,396 Speaker 2: actually start with where they are and intervene there, but 716 00:39:38,516 --> 00:39:41,236 Speaker 2: like push them to go and experience life in a 717 00:39:41,236 --> 00:39:42,276 Speaker 2: lot of ways. 718 00:39:42,676 --> 00:39:47,276 Speaker 1: Yeah, Freud overrated or underrated? 719 00:39:48,156 --> 00:39:48,756 Speaker 2: Overrated? 720 00:39:49,916 --> 00:39:55,516 Speaker 1: Still okay, who's the most underrated thinker in the history 721 00:39:55,556 --> 00:39:56,596 Speaker 1: of psychology? Oh? 722 00:39:56,636 --> 00:40:05,556 Speaker 2: My, I I mean to some degree, Skinner was like 723 00:40:06,316 --> 00:40:10,676 Speaker 2: really operant conditioning is like at the heart of most 724 00:40:11,276 --> 00:40:15,596 Speaker 2: clinical phenomenon that deal with emotions, and I think it's 725 00:40:15,596 --> 00:40:18,876 Speaker 2: probably one of the most impactful. Like it's so simple 726 00:40:18,956 --> 00:40:23,476 Speaker 2: in some ways that behavior is shaped by both positive 727 00:40:23,836 --> 00:40:29,876 Speaker 2: essentially benefits and like drawbacks, so rewards and punishments and 728 00:40:29,956 --> 00:40:33,316 Speaker 2: these these types of things are the simplicity of it 729 00:40:33,356 --> 00:40:36,676 Speaker 2: is is so simple, but like the how meaningful it 730 00:40:36,716 --> 00:40:38,996 Speaker 2: is and daily life is so profound, we. 731 00:40:39,036 --> 00:40:41,796 Speaker 1: Still underrate it. I mean when I the little bit 732 00:40:41,836 --> 00:40:43,996 Speaker 1: I know about Skinner, I think of the black box, right, 733 00:40:44,036 --> 00:40:46,636 Speaker 1: the like, don't worry about what's going on in somebody's mind, 734 00:40:46,796 --> 00:40:48,676 Speaker 1: just look at what's going on on the outdoit. Yeah. 735 00:40:48,716 --> 00:40:49,556 Speaker 2: Yeah, And with. 736 00:40:49,636 --> 00:40:52,036 Speaker 1: Behavior, I mean in a way it sort of maps 737 00:40:52,076 --> 00:40:57,436 Speaker 1: to your wearable's mobile devices thing, right, like just look, 738 00:40:57,516 --> 00:40:59,716 Speaker 1: if you don't go outside, you get sad, and so 739 00:40:59,836 --> 00:41:00,716 Speaker 1: go outside. 740 00:41:00,916 --> 00:41:04,716 Speaker 2: Sure exactly. I am a behaviorist at heart, So this 741 00:41:04,876 --> 00:41:07,036 Speaker 2: is part of part of what however you way. 742 00:41:07,076 --> 00:41:10,356 Speaker 1: I mean, I was actually think briefly before we talked 743 00:41:10,356 --> 00:41:11,916 Speaker 1: that wasn't gonna bring it up, But since you brought 744 00:41:11,956 --> 00:41:14,676 Speaker 1: it up, it's interesting to think. Like the famous thing 745 00:41:14,676 --> 00:41:16,956 Speaker 1: people say about Skinner is like the mind is a 746 00:41:16,956 --> 00:41:18,596 Speaker 1: black box, right, we don't know what's going on on 747 00:41:18,636 --> 00:41:19,636 Speaker 1: the inside and don't worry about it. 748 00:41:19,756 --> 00:41:19,996 Speaker 2: Yeah. 749 00:41:20,036 --> 00:41:23,476 Speaker 1: It makes me think of the way large language models 750 00:41:23,876 --> 00:41:25,956 Speaker 1: on black boxes, and even the people who build them 751 00:41:25,996 --> 00:41:27,116 Speaker 1: don't understand how they work. 752 00:41:27,196 --> 00:41:30,756 Speaker 2: Right. Yeah, absolutely, I think psychologists in some ways are 753 00:41:30,916 --> 00:41:34,796 Speaker 2: best suited to understand the behavior of large language models, 754 00:41:34,836 --> 00:41:38,476 Speaker 2: because it's actually the science of behavior absence the ability 755 00:41:38,516 --> 00:41:43,276 Speaker 2: to like potentially understand what's going on inside, Like neuroscience 756 00:41:43,356 --> 00:41:45,636 Speaker 2: is a natural compliment, but in some ways a different 757 00:41:46,036 --> 00:41:48,196 Speaker 2: different lens in which you view the world. So like 758 00:41:48,276 --> 00:41:51,636 Speaker 2: trying to develop a predictable system that is shaped. I 759 00:41:51,636 --> 00:41:54,796 Speaker 2: actually think we're not so bad in terms of folks 760 00:41:54,796 --> 00:41:55,916 Speaker 2: to be able to take this on. 761 00:41:59,236 --> 00:42:00,756 Speaker 1: What's your go to karaoke song? 762 00:42:01,436 --> 00:42:04,396 Speaker 2: Oh, don't stop believing. I'm a big karaoke person too. 763 00:42:04,636 --> 00:42:09,756 Speaker 1: Somebody just sent me that just the vocal from stop believing. 764 00:42:09,796 --> 00:42:13,436 Speaker 2: Ah, yeah, no, it's it's it's like a meme. 765 00:42:13,716 --> 00:42:14,676 Speaker 1: It's amazing, it is. 766 00:42:15,396 --> 00:42:15,516 Speaker 2: Uh. 767 00:42:17,516 --> 00:42:20,396 Speaker 1: What's one thing you've learned about yourself from a wearable device? 768 00:42:21,036 --> 00:42:24,356 Speaker 2: Mm hmm. One of the things that I would say, 769 00:42:24,396 --> 00:42:28,796 Speaker 2: like my ability to understand recognize when I've actually had 770 00:42:29,036 --> 00:42:31,956 Speaker 2: a poor night's sleep or a good night's sleep has 771 00:42:31,956 --> 00:42:35,796 Speaker 2: gotten much better over time. Like I think, as humans 772 00:42:35,796 --> 00:42:38,116 Speaker 2: were not very well calibrated to it. But as you 773 00:42:38,836 --> 00:42:43,116 Speaker 2: actually start to wear them and get understand you can 774 00:42:43,236 --> 00:42:45,316 Speaker 2: you are you become a better self reporter. 775 00:42:45,436 --> 00:42:49,116 Speaker 1: Actually I sleep badly. I assume it's because I'm middle aged. 776 00:42:50,356 --> 00:42:52,236 Speaker 1: I do most of the things you're supposed to do. 777 00:42:52,276 --> 00:42:54,996 Speaker 1: But give me one tip for sleeping. Well, I get 778 00:42:54,996 --> 00:42:56,396 Speaker 1: to sleep, but then I wake up in the middle 779 00:42:56,436 --> 00:42:56,756 Speaker 1: of the night. 780 00:42:57,036 --> 00:42:59,516 Speaker 2: Yeah. That. I think. One of the things that a 781 00:42:59,556 --> 00:43:04,396 Speaker 2: lot of people will do is they'll worry, particularly in bed, 782 00:43:04,516 --> 00:43:07,516 Speaker 2: or use this as a time for thinking, so a 783 00:43:07,516 --> 00:43:10,596 Speaker 2: lot of a lot of the effective surrounding that, or 784 00:43:10,636 --> 00:43:14,956 Speaker 2: to try to actually give yourself that same time that 785 00:43:14,996 --> 00:43:17,596 Speaker 2: would be that unstructured time that you would be dedicated 786 00:43:17,996 --> 00:43:19,636 Speaker 2: that you might experience in bed. 787 00:43:19,836 --> 00:43:22,196 Speaker 1: You tell me I should worry it ten at night 788 00:43:22,236 --> 00:43:24,156 Speaker 1: instead of three in the morning. If I worry, if 789 00:43:24,236 --> 00:43:26,476 Speaker 1: I say it ten at night, okay, worry now, then 790 00:43:26,476 --> 00:43:27,396 Speaker 1: I'll sleep through the night. 791 00:43:27,596 --> 00:43:31,036 Speaker 2: There there's literally evidence surrounding scheduling your worries out and 792 00:43:31,036 --> 00:43:33,876 Speaker 2: I love during the day and it does work. So yeah, 793 00:43:33,916 --> 00:43:35,716 Speaker 2: that's okay. If it's got some. 794 00:43:35,676 --> 00:43:38,796 Speaker 1: Worries, I'm gonna worry it ten tonight, I'll let you 795 00:43:38,836 --> 00:43:39,556 Speaker 1: know tomorrow morning. 796 00:43:39,556 --> 00:43:45,676 Speaker 2: If it were just don't do it in bed. Yeah, okay, okay. 797 00:43:45,796 --> 00:43:48,036 Speaker 1: If you had to build a chatbot based on one 798 00:43:48,076 --> 00:43:54,796 Speaker 1: of the following fictional therapists or psychiatrists, which fictional therapist 799 00:43:54,916 --> 00:44:00,316 Speaker 1: or psychiatrist would it be? A Jennifer Milthy from The Sopranos, 800 00:44:00,636 --> 00:44:05,836 Speaker 1: B Doctor Krokowski from The Magic Mountain, see Fraser from Fraser, 801 00:44:06,396 --> 00:44:08,196 Speaker 1: or d Hannibal Lecter. 802 00:44:08,516 --> 00:44:12,636 Speaker 2: Oh god, okay, I would probably go with Frasier, a 803 00:44:12,756 --> 00:44:15,636 Speaker 2: very different style of therapy than but I think his 804 00:44:15,916 --> 00:44:20,516 Speaker 2: demeanor is at least generally decent, So yeah, mostly appropriate 805 00:44:20,556 --> 00:44:22,396 Speaker 2: with most of his clients from what I remember in 806 00:44:22,436 --> 00:44:22,756 Speaker 2: the show. 807 00:44:22,956 --> 00:44:26,276 Speaker 1: Okay, it's a very thoughtful response to an absurd question. 808 00:44:28,876 --> 00:44:30,356 Speaker 1: Anything else we should talk about? 809 00:44:30,676 --> 00:44:34,436 Speaker 2: You've asked wonderful questions one thing I will say, maybe 810 00:44:34,476 --> 00:44:37,676 Speaker 2: for folks that might be listening, is a lot of 811 00:44:37,676 --> 00:44:42,156 Speaker 2: folks are already using generator AI for their mental health treatment, 812 00:44:43,076 --> 00:44:47,236 Speaker 2: and so I will I'll give a recommendation if folks 813 00:44:47,276 --> 00:44:51,036 Speaker 2: are doing this already, that they just treat it with 814 00:44:51,116 --> 00:44:53,916 Speaker 2: the same level of concern they would have the Internet. 815 00:44:54,756 --> 00:44:58,156 Speaker 2: They there may be benefits they can get out of it. Awesome, great, 816 00:44:59,116 --> 00:45:02,556 Speaker 2: but just don't work on changing something within your daily 817 00:45:02,596 --> 00:45:07,196 Speaker 2: life surrounding particularly your behavior, based on what these models 818 00:45:07,196 --> 00:45:10,276 Speaker 2: are doing, without some real thought on making sure that 819 00:45:10,276 --> 00:45:12,996 Speaker 2: that is actually going to be a safe thing for 820 00:45:13,036 --> 00:45:13,476 Speaker 2: you to do. 821 00:45:20,356 --> 00:45:23,116 Speaker 1: Nick Jacobsen is an assistant professor at the Center for 822 00:45:23,196 --> 00:45:26,996 Speaker 1: Technology and Behavioral Health at the Geissel School of Medicine 823 00:45:27,156 --> 00:45:30,956 Speaker 1: at Dartmouth. Today's show was produced by Gabriel Hunter Chang. 824 00:45:31,276 --> 00:45:34,596 Speaker 1: It was edited by Lydia Jean Kott and engineered by 825 00:45:34,636 --> 00:45:38,236 Speaker 1: Sarah Brugier. You can email us at problem at Pushkin 826 00:45:38,316 --> 00:45:41,516 Speaker 1: dot FM. I'm Jacob Boldstein, and we'll be back next 827 00:45:41,556 --> 00:45:42,316 Speaker 1: week with another 828 00:45:42,356 --> 00:45:51,476 Speaker 2: Episode of What's Your Problem.